12
votes

I am trying to create a centralized logging system using Fluentd for a Docker environment. Currently, I am able to send Docker logs to Fluentd using the Fluentd Docker logging driver, which is a much cleaner solution than reading the Docker log files with in_tail. However, I am facing an issue with multi-line logs.

[screenshot: multi-line log entries displayed out of order]

As you can see in the picture above, the multi-line logs are out of order, which is very confusing for the user. Is there any way this can be solved?

Thanks.

Cw

3
Just to add some comments on this topic after doing some further research: the out-of-order issue is due to Fluentd's time resolution (no sub-second support at the moment). Thanks to this answer link, I was able to get the records displayed in order, and at least users will not be as confused when reading the logs. (cheng wee)
For another solution to the millisecond issue, check this blog post: work.haufegroup.io/log-aggregation/#timestamp-fix (dutzu)
Do you have a solution yet? I found this link fluentd.org/guides/recipes/docker-logging about merging multi-line logs in Docker before they are sent to Fluentd, but the implementation is very specific to the log format. (Nextlink)
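The sub-second point in the first comment can be sketched in Python (the records and timestamps below are illustrative, not from the original logs): once timestamps are truncated to whole seconds, several records collapse to the same key, so the backend has no basis for ordering them; keeping microseconds restores a total order.

```python
from datetime import datetime

# Hypothetical log records as (timestamp, message) pairs.
records = [
    (datetime(2017, 9, 21, 15, 3, 27, 289000), "line 1"),
    (datetime(2017, 9, 21, 15, 3, 27, 291000), "line 2"),
    (datetime(2017, 9, 21, 15, 3, 27, 290000), "line 3"),
]

# Truncating to whole seconds makes all three records compare equal,
# so any display order is "correct" as far as the backend can tell.
truncated = [(ts.replace(microsecond=0), msg) for ts, msg in records]
assert len({ts for ts, _ in truncated}) == 1  # all collapse to one timestamp

# Keeping sub-second precision restores a total order.
ordered = sorted(records, key=lambda r: r[0])
assert [msg for _, msg in ordered] == ["line 1", "line 3", "line 2"]
```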

3 Answers

3
votes

Using the fluent-plugin-concat plugin helped me fix the above problem.

Add these lines to fluent.conf:

 <filter **>
  @type concat
  key log
  stream_identity_key container_id
  multiline_start_regexp /^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}/
  multiline_end_regexp /^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}/
 </filter>

where the regular expression checks for a DateTimeStamp in the logs: each line starts with a date and timestamp (pay attention to "log":"2017-09-21 15:03:27.289" in the records below).

2017-09-21T15:03:27Z    tag     {"container_id":"11b0d89723b9c812be65233adbc51a71507bee04e494134258b7af13f089087f","container_name":"/bel_osc.1.bc1k2z6lke1d7djeq5s28xjyl","source":"stdout","log":"2017-09-21 15:03:27.289  INFO 1 --- [           main] org.apache.catalina.core.StandardEngine  : Starting Servlet Engine: Apache Tomcat/8.5.6"}
2017-09-21T15:03:28Z    tag     {"container_id":"11b0d89723b9c812be65233adbc51a71507bee04e494134258b7af13f089087f","container_name":"/bel_osc.1.bc1k2z6lke1d7djeq5s28xjyl","source":"stdout","log":"2017-09-21 15:03:28.191  INFO 1 --- [ost-startStop-1] o.a.c.c.C.[Tomcat].[localhost].[/]       : Initializing Spring embedded WebApplicationContext"}

Also, I had to add the lines below to the Dockerfile to install the plugin:

RUN ["gem", "install", "fluent-plugin-concat", "--version", "2.1.0"] 
#Works with Fluentd v0.14-debian

This regular expression doesn't work well when an exception occurs, but it is still much better than before. Fluentd Link, for reference.
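One possible workaround for the exception case (an assumption on my part, not from the answer above): drop multiline_end_regexp and rely only on the start regex, so any line that does not begin with a timestamp (such as a stack-trace line) is appended to the previous event, with flush_interval / timeout_label covering events that never see a following start line. These are documented fluent-plugin-concat options, but the exact values here are a sketch:

    <filter **>
      @type concat
      key log
      stream_identity_key container_id
      multiline_start_regexp /^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}/
      flush_interval 5
      timeout_label @NORMAL
    </filter>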

1
votes

Take a look at multiline parsing in the documentation: http://docs.fluentd.org/articles/parser-plugin-overview#

You basically have to specify a regex that matches the beginning of a new log message; that enables Fluentd to aggregate multi-line log events into a single message.

Example for a typical Java stack trace, from their docs:

    format multiline
    format_firstline /\d{4}-\d{1,2}-\d{1,2}/
    format1 /^(?<time>\d{4}-\d{1,2}-\d{1,2} \d{1,2}:\d{1,2}:\d{1,2}) \[(?<thread>.*)\] (?<level>[^\s]+)(?<message>.*)/
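The grouping logic behind format_firstline can be sketched in Python: a line that matches the firstline regex starts a new event, and everything else is folded into the current one. The sample lines below are illustrative of the Java stack trace case, not taken from the question:

```python
import re

# The firstline regex from the example above: a new event starts with a date.
firstline = re.compile(r"\d{4}-\d{1,2}-\d{1,2}")

lines = [
    "2013-3-03 14:27:33 [main] INFO  Main - Start",
    "2013-3-03 14:27:33 [main] ERROR Main - Exception",
    "javax.management.RuntimeErrorException: null",
    "\tat Main.main(Main.java:16)",
]

# Fold continuation lines under the last line that matched firstline.
events, current = [], []
for line in lines:
    if firstline.match(line):
        if current:
            events.append("\n".join(current))
        current = [line]
    else:
        current.append(line)
if current:
    events.append("\n".join(current))

assert len(events) == 2
assert events[1].count("\n") == 2  # stack trace folded into the ERROR event
```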

0
votes

I know this is not an "answer" to the Fluentd question, but this guide solves the problem with Logstash: http://www.labouisse.com/how-to/2015/09/14/elk-and-docker-1-8

JSON support is enabled by adding

    json {
        source => "log_message"
        target => "json"
    }

to his filter, after a log line has been parsed.

I never found a solution for Fluentd, so I went with this approach instead.
