
I've set up a fluentd/elasticsearch/kibana stack very similar to what is described here. When I look at the logs in Kibana I notice that they are automatically indexed by day using the format "logstash-[YYYY].[MM].[DD]". Based on the documentation for the fluentd elasticsearch plugin, it seems that you can create a custom index by setting the "index_name" property.

I've tried this on both the log forwarder and the log aggregator, but I still seem to get the default index name in elasticsearch. Is there something else required to customize the index name in an HA setup?

Here is the log forwarder config:

<source>
  type tail
  path /var/log/debug.json
  pos_file /var/log/debug.pos
  tag asdf
  format json
  index_name fluentd
  time_key time_field
</source>

<match *>
  type copy
  <store>
    type stdout
  </store>
  <store>
    type forward
    flush_interval 10s
    <server>
      host [fluentd aggregator]
    </server>
  </store>
</match>

And here is the log aggregator config:

<source>
  type forward
  port 24224
  bind 0.0.0.0
</source>

<match *>
  type copy
  <store>
    type stdout
  </store>
  <store>
    type elasticsearch
    host localhost
    port 9200
    index_name fluentd
    type_name fluentd
    logstash_format true
    include_tag_key true
    flush_interval 10s # for testing
  </store>
</match>

2 Answers


I found an issue on the fluent-plugin-elasticsearch repo that explains this behavior: when the "logstash_format" option is set to true, the "index_name" setting is ignored.
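
If you want to keep the daily, date-suffixed indices but change their name, the plugin also has a "logstash_prefix" option that replaces the default "logstash" prefix (assuming your plugin version is recent enough to support it). A minimal sketch of the aggregator's elasticsearch store:

<store>
  type elasticsearch
  host localhost
  port 9200
  logstash_format true
  logstash_prefix fluentd  # indices become fluentd-YYYY.MM.DD instead of logstash-YYYY.MM.DD
  include_tag_key true
  flush_interval 10s
</store>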


Remove logstash_format true from the elasticsearch <store> block and you will get your custom index. But you will not get a timestamp in your data. To get the timestamp you have to update your Ruby version and then pass a time format to the fluentd config file.
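
A minimal sketch of what that might look like, reusing the configs from the question (the time_format value here is an assumption for illustration; adjust it to match the actual timestamps in debug.json):

<source>
  type tail
  path /var/log/debug.json
  pos_file /var/log/debug.pos
  tag asdf
  format json
  time_key time_field
  time_format %Y-%m-%dT%H:%M:%S%z  # assumption: must match the timestamp format in the JSON records
</source>

<store>
  type elasticsearch
  host localhost
  port 9200
  index_name fluentd  # honored now that logstash_format is removed
  type_name fluentd
  include_tag_key true
  flush_interval 10s
</store>

Depending on your plugin version, there may also be an include_timestamp option that adds a @timestamp field to each record without enabling logstash_format.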