
I am trying to forward my local server log from a Windows machine to an Elasticsearch server on a Linux machine and view these logs in Kibana. This is currently a test environment. Fluentd on either end is not showing any issues, but no index is created in Kibana, and I am not sure what the issue is. Please find the config files of the two servers below.

One more question: is there any way to know where a forwarded log is stored on the destination server?
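For reference, whether any index was actually created can also be checked directly against Elasticsearch from the aggregator side (host and port as in the configs below; the index name in the second command is only a hypothetical example matching the index_name pattern):

    # List all indices known to the Elasticsearch node
    curl -s 'http://x.x.x.154:9200/_cat/indices?v'

    # Peek at one stored document of a given index (hypothetical name shown)
    curl -s 'http://x.x.x.154:9200/fluentd.server.20190801/_search?size=1&pretty'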

Note: I have configured Ruby 2.6.3 on my RHEL 7.5 machine. I installed fluentd via the gem install method, and I also installed the fluent-plugin-elasticsearch 3.5.5 and elasticsearch 7.2.1 gems, which are compatible with my current Elasticsearch version.
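For reference, the install steps described above correspond roughly to the following gem commands (version pins are the ones mentioned above; the exact invocation may have differed):

    # Fluentd itself plus the Elasticsearch output plugin and client gem
    gem install fluentd
    gem install fluent-plugin-elasticsearch -v 3.5.5
    gem install elasticsearch -v 7.2.1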

Forwarder config (Windows server)

    <source>
      @type tail
      tag server
      path C:\sw_logs\server.log
      pos_file C:\opt\pos_files\server.log.pos
      <parse>
        @type json
      </parse>
    </source>
    <match server>
      @type stdout
    </match>
    <match server>
      @type forward
      send_timeout 60s
      <server>
        host x.x.x.154
        port 24224
      </server>
      <buffer>
        retry_max_times 3
        retry_randomize false
        retry_max_interval 32s
        retry_timeout 1h
        path /var/log/fluentd/forward.log
      </buffer>
    </match>

Aggregation and Elasticsearch forwarding config (RHEL 7.5)

    <source>
      @type forward
      port 24224
    </source>
    <match server>
      @type copy
      <store>
        @type elasticsearch
        path /var/log/fluentd/forward.log
        host x.x.x.154
        port 9200
        logstash_format false
        index_name fluentd.${tag}.%Y%m%d
        type_name fluentd
        type_name "_doc"
        # New change begin
        utc_index true
        # End new change
        verify_es_version_at_startup false
        default_elasticsearch_version 7.x
        max_retry_get_es_version 1
        max_retry_putting_template 1
        <buffer>
          @type file
          path /var/log/ge_efk_logdata/buffer/win29.buffer/
          # Buffer parameters
          flush_thread_count 3
          chunk_limit_size 16MB
          total_limit_size 4GB
          queue_limit_length
          chunk_full_threshold 0.85
          compress gzip
          retry_timeout
          # Flush parameters
          flush_at_shutdown false
          # Assuming persistent buffers
          flush_mode immediate
          #flush_interval 60s
          flush_thread_count 2
          flush_thread_interval 1.0
          flush_thread_burst_interval 1.0
          delayed_commit_timeout 60s
          overflow_action throw_exception
          # Retry parameters
          retry_timeout 1h
          retry_forever false
          retry_max_times 5
        </buffer>
      </store>
    </match>

1 Answer


Can you share the fluentd and Elasticsearch logs, and try the following configuration:

    <match *.**>
      @type copy
      <store>
        @type elasticsearch
        host x.x.x.154
        port 9200
        include_tag_key true
        logstash_format true
        logstash_prefix fluentd
        flush_interval 10s
      </store>
    </match>
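With logstash_format true and logstash_prefix fluentd, the plugin should write to daily indices named like fluentd-YYYY.MM.DD, so you can verify that data is arriving with something like (host and port as in the config above):

    # List the fluentd-* indices and their document counts
    curl -s 'http://x.x.x.154:9200/_cat/indices/fluentd-*?v'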