7
votes

I have configured an ELK stack (Elasticsearch, Logstash, and Kibana) cluster as a centralized logging system with Filebeat. Now I have been asked to reconfigure it as EFK (Elasticsearch, Fluentd, and Kibana), still with Filebeat. I have disabled Logstash and installed Fluentd, but I'm not able to get Fluentd working with Filebeat. I have installed the Fluentd plugin for Filebeat and modified /etc/td-agent/td-agent.conf, but it does not seem to be working.
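For reference, Filebeat is still shipping over the Beats protocol to port 5044, only pointed at the Fluentd node instead of Logstash (as far as I understand, the beats source in Fluentd speaks the same protocol as the Logstash input). A rough sketch of that side of my setup; the log path and the host name fluentd-host are placeholders, and the exact keys depend on the Filebeat version:

filebeat.yml

filebeat.inputs:
  - type: log
    paths:
      - /var/log/*.log              # placeholder for the actual logs being shipped

output.logstash:                    # Filebeat keeps using its Logstash output to reach Fluentd
  hosts: ["fluentd-host:5044"]      # placeholder for the td-agent machine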

td-agent.conf

<source>
  @type beats
  tag record['@metadata']['beat']
  port 5044
  bind 0.0.0.0
</source>

<match *.**>
  @type copy
  <store>
    #@type file
    @type elasticsearch_dynamic
    logstash_format true
    logstash_prefix ${tag_parts[0]}
    type_name ${record['type']}
  </store>
  <store>
    @type file
    logstash_format true
    logstash_prefix ${tag_parts[0]}
    type_name ${record['type']}
    path /var/log/td-agent/data_logs.*.log
  </store>
</match>
1
Did you ever figure this out? I have the same problem. – jtlindsey

1 Answer

0
votes

A <source> is an input, not an output, in Fluentd. What you want is a <match> block whose pattern corresponds to the tags your beats source produces, so the records coming in from Filebeat are shipped out to Elasticsearch.
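A minimal sketch of that, assuming fluent-plugin-beats for the source and fluent-plugin-elasticsearch for the output; the tag, host, port, and prefix are placeholders to adjust:

td-agent.conf

<source>
  @type beats
  tag filebeat              # fixed tag; the <match> below routes on it
  port 5044
  bind 0.0.0.0
</source>

<match filebeat>
  @type elasticsearch
  host localhost            # Elasticsearch host and port, adjust to your cluster
  port 9200
  logstash_format true
  logstash_prefix filebeat  # indices come out as filebeat-YYYY.MM.DD
</match>

With this, Filebeat keeps its output pointed at port 5044, and Fluentd forwards everything tagged filebeat to Elasticsearch, so Kibana can read the logstash_format indices the same way it did under Logstash.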