
Input file: CSV

e.g.:

~DEALER_ID~,~STOCK_ID~,~VIN~,~IS_NEW~,~IS_CERTIFIED~,~YEAR~,~MAKE~,~MODEL~
~1035659~,~C0524359~,~2LMDJ6J45454359~,~N~,~N~,~2013~,~Lincoln~,~MKX~
~1035659~,~C0532359~,~345666543344443~,~N~,~N~,~2016~,~BMW~,~X5~
...

Location: S3

As soon as the CSV is dropped in S3, I would like the data to be ingested into AWS managed Elasticsearch. I'm very new to the ELK stack and AWS Elasticsearch, so I'd like some suggestions on the best way to get this working on AWS.

I was able to parse this file by running Logstash locally and sending it to my local Elasticsearch and Kibana servers.
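For reference, the local pipeline was roughly along these lines (a minimal sketch; the file path and index name are placeholders, and the tilde quoting from the sample is handled with the csv filter's quote_char option):

```
input {
  file {
    path => "/path/to/inventory.csv"   # placeholder path
    start_position => "beginning"
    sincedb_path => "/dev/null"        # re-read the file on every run, for testing
  }
}
filter {
  csv {
    separator => ","
    quote_char => "~"                  # fields in the sample are wrapped in '~'
    columns => ["DEALER_ID","STOCK_ID","VIN","IS_NEW","IS_CERTIFIED","YEAR","MAKE","MODEL"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "inventory"               # placeholder index name
  }
}
```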


1 Answer


A simple way would be to run a Lambda function with an S3 trigger: configure the bucket to invoke the function whenever a new file arrives, and have the function read the file and ingest the data into AWS ES.
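Here is a minimal sketch of such a handler in Python. It assumes the domain endpoint is supplied through a hypothetical ES_ENDPOINT environment variable, an index name of "inventory", and that the requests and requests_aws4auth packages are bundled with the deployment package:

```python
import csv
import io
import json
import os

import boto3
import requests                          # bundled with the deployment package
from requests_aws4auth import AWS4Auth   # signs requests against the ES domain

# Hypothetical environment variable holding the AWS ES domain URL,
# e.g. https://search-mydomain-xyz.us-east-1.es.amazonaws.com
ES_ENDPOINT = os.environ["ES_ENDPOINT"]
REGION = os.environ.get("AWS_REGION", "us-east-1")

s3 = boto3.client("s3")
creds = boto3.Session().get_credentials()
awsauth = AW4Auth = AWS4Auth(creds.access_key, creds.secret_key, REGION, "es",
                             session_token=creds.token)

def handler(event, context):
    # An s3:ObjectCreated:* notification carries the bucket and key of the new file
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

        # The sample wraps every field in '~', so treat '~' as the quote character
        reader = csv.DictReader(io.StringIO(body), quotechar="~")

        # Build an ES _bulk payload: one action line plus one document line per row
        lines = []
        for row in reader:
            lines.append(json.dumps({"index": {"_index": "inventory"}}))
            lines.append(json.dumps(row))
        payload = "\n".join(lines) + "\n"

        resp = requests.post(ES_ENDPOINT + "/_bulk", auth=awsauth, data=payload,
                             headers={"Content-Type": "application/x-ndjson"})
        resp.raise_for_status()
```

The trigger itself is configured on the bucket (or from the Lambda console) for s3:ObjectCreated:* events, and the Lambda's execution role must be allowed by the ES domain's access policy. Note that on Elasticsearch versions before 7, the bulk action line also needs a _type field.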