
We have a setup with multiple containers running Node.js services (node:11-alpine Docker image) deployed on AWS ECS Fargate.

We already have a running Elasticsearch instance collecting logs from a non-Fargate application. I would like to send the logs from the Fargate containers into this Elasticsearch instance, but I am having a hard time figuring out the best approach.

1) It seems one way is to stream the logs from CloudWatch --> Lambda --> Elasticsearch. That seems a bit like overkill - isn't there another way to do this?

2) I was hoping I could run a Logstash Docker instance that could collect the logs from the containers, but I am not sure whether this is possible when running on Fargate.

3) Should I install something like Filebeat in each container and let that send the logs?

Any help is appreciated.

How about leveraging Functionbeat, which is fit for this very purpose? - Val

1 Answer


1) It seems one way is to stream the logs from CloudWatch --> Lambda --> Elasticsearch. That seems a bit like overkill - isn't there another way to do this?

If you're looking for an AWS-based managed solution, that is the right way. You don't really need to write the Lambda function yourself; AWS provides it for you, so it isn't overkill - it's the more managed, AWS-oriented approach. Also, what this solution does for you is essentially the same as what Logstash would do.
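For context on what that Lambda receives: CloudWatch Logs delivers log events to a subscription target as a base64-encoded, gzipped JSON payload. A minimal sketch of the decoding step, assuming the standard CloudWatch Logs subscription event shape (the actual indexing into Elasticsearch is left out):

```python
import base64
import gzip
import json

def decode_cloudwatch_event(event):
    """Decode the payload CloudWatch Logs hands to a subscription
    target (e.g. a Lambda function) into plain documents."""
    # The payload is base64-encoded and gzip-compressed JSON.
    raw = base64.b64decode(event["awslogs"]["data"])
    payload = json.loads(gzip.decompress(raw))
    # Flatten each log event into a document you could bulk-index.
    return [
        {
            "log_group": payload["logGroup"],
            "log_stream": payload["logStream"],
            "timestamp": e["timestamp"],
            "message": e["message"],
        }
        for e in payload.get("logEvents", [])
    ]
```

The AWS-managed streaming subscription does this (plus the bulk indexing) for you; the sketch just shows there is no magic in the pipeline.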

2) I was hoping I could run a Logstash Docker instance that could collect the logs from the containers, but I am not sure whether this is possible when running on Fargate.

Yes, that is possible.
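One way to do it is to run Logstash as a sidecar container inside the same Fargate task as your service. A rough sketch of the relevant part of a task definition (the family name, image tags, and sizes are illustrative, not from the question):

```json
{
  "family": "app-with-logstash",
  "requiresCompatibilities": ["FARGATE"],
  "networkMode": "awsvpc",
  "cpu": "512",
  "memory": "1024",
  "containerDefinitions": [
    {
      "name": "app",
      "image": "my-node-service:latest",
      "essential": true
    },
    {
      "name": "logstash",
      "image": "docker.elastic.co/logstash/logstash:7.10.2",
      "essential": false
    }
  ]
}
```

Since both containers share the task, the app can write to a location the sidecar reads from, or send to it over localhost.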

3) Should I install something like Filebeat on each container and let that send the logs?

You can use Filebeat, Fluentd, Functionbeat, or Logstash, whichever you prefer.
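If you go the Filebeat route, the shipper-side configuration is small. A sketch of a filebeat.yml, assuming the Node.js services write log files to a path like the one below and that Elasticsearch is reachable directly (the paths and host are assumptions, not from the question):

```yaml
# Illustrative filebeat.yml fragment - adjust paths and hosts.
filebeat.inputs:
  - type: log
    paths:
      - /usr/src/app/logs/*.log
output.elasticsearch:
  hosts: ["https://your-elasticsearch-host:9200"]
```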

Note: If you're going to run your own Logstash container, don't enable CloudWatch logging as well, since you won't be using it. That said, I would recommend going with the AWS-based solution.