1 vote

So I am trying to capture the output from Docker containers running on a host, but after a change by the developers to use JSON as the logging output for the containers, I am missing the startup messages that are emitted in entrypoint.sh. I can see that someone has added a new filter section to the config file, which works really nicely to capture JSON output, but only JSON output.

Here is the template in use:

<source>
  @type forward
  port 24224
  bind 0.0.0.0
  tag GELF_TAG
</source>

<filter GELF_TAG.**>
  @type parser
  key_name log
  reserve_data false
  <parse>
    @type json
  </parse>
</filter>

<match GELF_TAG.**>
  @type copy
  <store>
    @type gelf
    host {{ graylog_server_fqdn }}
    port 12201
    protocol tcp
    flush_interval 5s
  </store>
  <store>
    @type stdout
  </store>
</match>
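For context, each record arriving from Docker's fluentd logging driver looks roughly like the following (the my-app container name is just an illustration); the raw line from the container sits under the log key, which is what key_name log points the parser at:

{"container_id":"...","container_name":"/my-app","source":"stdout","log":"Starting entrypoint.sh..."}
{"container_id":"...","container_name":"/my-app","source":"stdout","log":"{\"level\":\"info\",\"msg\":\"app started\"}"}

The log value in the first record is plain text from entrypoint.sh, while the second is the application's JSON; the filter above parses the second just fine and rejects the first.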

How do I set up the config so that it captures both the entrypoint.sh output and the JSON output from the containers after they start?

EDIT.

The filter is rejecting messages sent to the Docker containers' stdout up until the application starts logging in JSON:

[warn]: #0 dump an error event: error_class=Fluent::Plugin::Parser::ParserError error="pattern not matched with data

So I tried to capture everything that was being dropped into the @ERROR label, and I can see the missing messages, but they still fail to parse using this config:

# Ansible
<source>
  @type forward
  port 24224
  bind 0.0.0.0
  tag GELF_TAG
</source>

<filter GELF_TAG.**>
  @type parser
  emit_invalid_record_to_error true
  key_name log
  reserve_data false
  <parse>
    @type json
  </parse>
</filter>

<match {GELF_TAG.**,@ERROR}>
  @type copy
  <store>
    @type gelf
    host {{ graylog_server_fqdn }}
    port 12201
    protocol tcp
    flush_interval 5s
  </store>
  <store>
    @type stdout
  </store>
</match>
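If I am reading the Fluentd docs right, @ERROR is a label rather than a tag, so the match on {GELF_TAG.**,@ERROR} above presumably never sees the rejected events; they would have to be picked up with a <label @ERROR> section instead. Something like this (a sketch, untested; stdout is just there so the rejected lines become visible):

<label @ERROR>
  <match GELF_TAG.**>
    @type stdout
  </match>
</label>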
Comments:

So, the filter is rejecting the non-JSON logs, right? – Azeem
Check emit_invalid_record_to_error. Might be helpful. – Azeem
So I thought I would give that a go, and I am now seeing all the missing data that is being dumped into the @ERROR label, but the parser still fails. – SnazzyBootMan
The data is there; the problem is that the non-JSON data cannot be parsed as JSON. Can you add another source to parse non-JSON logs? – Azeem
The data is all coming from a single source on port 24224, so I am not sure that would help. – SnazzyBootMan

2 Answers

0 votes

Install the multi-format parser:

td-agent-gem install fluent-plugin-multi-format-parser -v 1.0.0

# Ansible
<source>
  @type forward
  port 24224
  bind 0.0.0.0
  tag GELF_TAG
</source>

<filter GELF_TAG.**>
  @type parser
  key_name log
  reserve_data false
  <parse>
    @type multi_format
    <pattern>
      format json
      time_key timestamp
    </pattern>
    <pattern>
      format none
    </pattern>
  </parse>
</filter>

<match GELF_TAG.**>
  @type copy
  <store>
    @type gelf
    host {{ graylog_server_fqdn }}
    port 12201
    protocol tcp
    flush_interval 5s
  </store>
  <store>
    @type stdout
  </store>
</match>
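With this in place, a line that parses as JSON is expanded into its fields as before, while anything else (such as the entrypoint.sh output) falls through to the none pattern, which by default stores the raw line under a message key. Roughly:

Starting entrypoint.sh...              becomes  {"message":"Starting entrypoint.sh..."}
{"level":"info","msg":"app started"}   becomes  {"level":"info","msg":"app started"}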
0 votes

You can also use rewrite_tag_filter, which is an output plugin. With it you can assign a different tag to each log pattern and then apply the appropriate parsers/filters to each stream, as sketched below.
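A sketch of what that could look like here, assuming fluent-plugin-rewrite-tag-filter is installed (td-agent-gem install fluent-plugin-rewrite-tag-filter) and the raw line is still under the log key:

<match GELF_TAG.**>
  @type rewrite_tag_filter
  <rule>
    # lines that look like a JSON object get a json. prefix
    key log
    pattern /^\s*\{/
    tag json.${tag}
  </rule>
  <rule>
    # everything else is tagged as plain text
    key log
    pattern /.*/
    tag plain.${tag}
  </rule>
</match>

<filter json.GELF_TAG.**>
  @type parser
  key_name log
  <parse>
    @type json
  </parse>
</filter>

The re-tagged events re-enter routing from the top, so the json.* stream can be parsed as JSON while the plain.* stream is passed through untouched.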