
We are using Kubernetes and we have multiple tomcat/jws containers running across multiple pods. What would be the best approach for centralized logging using fluentd, Elasticsearch, and Kibana?

The main goal is to collect the tomcat logs produced inside the pods (for example, access.log and catalina.log), as well as the logs of the application deployed on tomcat. We also need to differentiate the logs coming from different pods (tomcat containers).

I followed this link: https://access.redhat.com/documentation/en/red-hat-enterprise-linux-atomic-host/7/getting-started-with-containers/chapter-11-using-the-atomic-rsyslog-container-image

From this I was only able to get the container logs, but not the tomcat logs.

-Praveen


1 Answer


Have a look at this example:

https://github.com/kubernetes/contrib/tree/master/logging/fluentd-sidecar-es

The basic idea is to deploy an additional fluentd container in your pod and share a volume between the containers. The application container writes its logs into the volume, and the fluentd container mounts the same volume read-only and feeds the logs to Elasticsearch. In the default configuration, the log events get a tag like "file.application.log".
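A minimal sketch of such a pod might look like the following. Note this is illustrative only: the image names, mount paths, and pod name are assumptions, not taken from the linked example.

```yaml
# Sketch: a tomcat container plus a fluentd sidecar sharing an emptyDir volume.
# Image names and paths are illustrative assumptions.
apiVersion: v1
kind: Pod
metadata:
  name: tomcat-with-logging
spec:
  containers:
  - name: tomcat
    image: tomcat:8
    volumeMounts:
    - name: logs
      mountPath: /usr/local/tomcat/logs   # tomcat writes access.log and catalina.log here
  - name: fluentd
    image: fluent/fluentd:latest
    volumeMounts:
    - name: logs
      mountPath: /mnt/log
      readOnly: true                      # the sidecar only reads the shared logs
    env:
    - name: POD_NAME                      # expose the pod name so fluentd can tag events
      valueFrom:
        fieldRef:
          fieldPath: metadata.name
  volumes:
  - name: logs
    emptyDir: {}
```

The downward API (`fieldRef: metadata.name`) is one way to make the pod name available to the sidecar, which helps with the requirement to tell the pods apart.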

We are evaluating this setup at the moment, but we have multiple application containers with the same logfile name, so there is still work to do.
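To differentiate events coming from different pods, one option is to have the fluentd sidecar stamp each record with the pod name before shipping it to Elasticsearch. A sketch of such a fluentd config, assuming a `POD_NAME` environment variable is set on the sidecar and using the record_transformer filter and the elasticsearch output plugin (host name and paths are assumptions):

```
# Tail the tomcat logs from the shared volume
<source>
  @type tail
  path /mnt/log/access.log,/mnt/log/catalina.log
  pos_file /tmp/fluentd-tomcat.pos
  tag file.tomcat.*
  format none
</source>

# Add the pod name to every record so pods can be told apart in Kibana
<filter file.tomcat.**>
  @type record_transformer
  <record>
    pod_name "#{ENV['POD_NAME']}"
  </record>
</filter>

# Ship to Elasticsearch (host/port depend on your cluster setup)
<match file.tomcat.**>
  @type elasticsearch
  host elasticsearch-logging
  port 9200
  logstash_format true
</match>
```

With the `pod_name` field in each record, you can filter and aggregate per pod in Kibana even when several application containers write logfiles with the same name.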