3
votes

I am migrating from running my containers on a Docker Swarm cluster to Kubernetes running on Google Container Engine. On Docker Swarm, I had configured the Docker Engine's logging driver (https://docs.docker.com/engine/admin/logging/overview/) to forward logs in the Fluentd format to a Fluentd container running on the Docker Swarm node; a custom config in that container then forwarded the Docker logs to both an Elasticsearch cluster (with Kibana) and an AWS S3 bucket. How do I port this over to my Kubernetes nodes?
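
For reference, on the Swarm nodes this was set daemon-wide in /etc/docker/daemon.json, roughly like the following (the address and tag pattern here are just placeholders, not my exact values):

    {
      "log-driver": "fluentd",
      "log-opts": {
        "fluentd-address": "localhost:24224",
        "tag": "docker.{{.Name}}"
      }
    }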

I read that I can run my Fluentd container on each node using a DaemonSet (https://kubernetes.io/docs/concepts/workloads/controllers/daemonset/), but I cannot find any documentation on configuring the Docker Engine log driver to forward the Docker logs to that Fluentd container, or on formatting the logs the way I need.
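
From the docs, a minimal Fluentd DaemonSet would look something like the sketch below (the image tag and mount paths are my guesses), but that still leaves me unsure how the Docker logs reach it and in what format:

    # Sketch only: image, namespace and labels are placeholders
    apiVersion: apps/v1
    kind: DaemonSet
    metadata:
      name: fluentd
      namespace: kube-system
    spec:
      selector:
        matchLabels:
          app: fluentd
      template:
        metadata:
          labels:
            app: fluentd
        spec:
          containers:
          - name: fluentd
            image: fluent/fluentd-kubernetes-daemonset:elasticsearch  # placeholder image
            volumeMounts:
            - name: varlog
              mountPath: /var/log
            - name: dockercontainers
              mountPath: /var/lib/docker/containers
              readOnly: true
          volumes:
          - name: varlog
            hostPath:
              path: /var/log
          - name: dockercontainers
            hostPath:
              path: /var/lib/docker/containers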


1 Answer

0
votes

We used a slightly different solution: we run Fluentd as a DaemonSet, but Docker writes its logs to the journal and Fluentd reads them with the systemd plugin (https://github.com/reevoo/fluent-plugin-systemd). We also use the fabric8 Kubernetes metadata plugin (https://github.com/fabric8io/fluent-plugin-kubernetes_metadata_filter).

Another approach is to use the tail input type with /var/log/containers/*.log as the path. Look in the kubernetes_metadata_filter repository; there are some examples.
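
For example, a minimal Fluentd config for the tail approach could look roughly like this (the Elasticsearch host is a placeholder, and you need the fluent-plugin-kubernetes_metadata_filter and fluent-plugin-elasticsearch gems installed in your Fluentd image):

    <source>
      # Kubernetes symlinks container logs into /var/log/containers
      @type tail
      path /var/log/containers/*.log
      pos_file /var/log/fluentd-containers.log.pos
      tag kubernetes.*
      read_from_head true
      <parse>
        @type json
      </parse>
    </source>

    <filter kubernetes.**>
      # Enriches each record with pod, namespace and label metadata
      @type kubernetes_metadata
    </filter>

    <match kubernetes.**>
      # Host and port are placeholders for your Elasticsearch service
      @type elasticsearch
      host elasticsearch-logging
      port 9200
      logstash_format true
    </match>

If you also want the S3 output from your old setup, you can change the match block to @type copy and add a second store that uses fluent-plugin-s3.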