
I am following a filebeat -> logstash -> elasticsearch -> kibana pipeline. Filebeat is working successfully and fetching the logs from the target file.

Logstash receives the logs on the input plugin, but they bypass the filter plugin and are sent straight to the output plugin.

filebeat.yml

  

    # ============================== Filebeat inputs ===============================

    filebeat.inputs:

    - type: log
      enabled: true
      paths:
        - D:\serverslogs\ch5shdmtbuil100\TeamCity.BuildServer-logs\launcher.log
      fields:
        type: launcherlogs

    - type: filestream

      # Change to true to enable this input configuration.
      enabled: false

      # Paths that should be crawled and fetched. Glob based paths.
      paths:
        - /var/log/*.log

    # =================================== Kibana ===================================
    setup.kibana:
      host: "localhost:5601"

    # ------------------------------ Logstash Output -------------------------------
    output.logstash:
      # The Logstash hosts
      hosts: ["localhost:5044"]

logstash.conf
    input {
      beats {
        port => "5044"
      }
    }

    filter {
      if [fields][type] == "launcherlogs" {
        grok {
          match => { "message" => %{YEAR:year}-%{MONTH:month}-%{MONTHDAY:day}%{DATA:loglevel}%{SPACE}-%{SPACE}%{DATA:class}%{SPACE}-%{GREEDYDATA:message} }
        }
      }
    }

    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "poclogsindex"
      }
    }

I am able to send the logs to Kibana, but the grok pattern is not producing the desired JSON: the document rendered in Kibana does not show all the fields defined in the pattern. Please advise.

Could you send a sample of the data and the output on the Kibana side? – YLR
@YLR Sure, here is a sample of the log: [2021-01-30 23:43:55,248] DEBUG - buildServer.agent.LauncherUtil - Deleting D:\TeamCity.BuildServer\backup\bin\install.bat... Only this raw log is shown in Kibana, with various fields but without the log level and class, which means the log is not being parsed by grok. – Muhammad Suleman

1 Answer


Your grok pattern does not match the sample you gave in the comment: several parts are missing (the brackets, the HH:mm:ss,SSS part, and an additional space). Grok debuggers are your friends ;-)

Instead of:

%{YEAR:year}-%{MONTH:month}-%{MONTHDAY:day}%{DATA:loglevel}%{SPACE}-%{SPACE}%{DATA:class}%{SPACE}-%{GREEDYDATA:message}

Your pattern should be:

\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:loglevel}%{SPACE}-%{SPACE}%{DATA:class}%{SPACE}-%{GREEDYDATA:message}

TIMESTAMP_ISO8601 matches this date/time format.
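To see why the corrected pattern captures all four fields from your sample line, here is a rough Python sketch (illustrative only: the grok macros are approximated by plain regex equivalents, with `TIMESTAMP_ISO8601` simplified to the exact date/time shape in your logs):

```python
import re

# Approximate regex equivalents of the grok macros:
#   %{TIMESTAMP_ISO8601} -> explicit yyyy-MM-dd HH:mm:ss,SSS
#   %{DATA}              -> non-greedy .*?
#   %{SPACE}             -> \s+
#   %{GREEDYDATA}        -> greedy .*
LOG_PATTERN = re.compile(
    r"\[(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})\] "
    r"(?P<loglevel>.*?)\s+-\s+(?P<class>.*?)\s+-\s+(?P<message>.*)"
)

sample = (r"[2021-01-30 23:43:55,248] DEBUG - buildServer.agent.LauncherUtil"
          r" - Deleting D:\TeamCity.BuildServer\backup\bin\install.bat")

m = LOG_PATTERN.match(sample)
# Captures: timestamp, loglevel (DEBUG), class (buildServer.agent.LauncherUtil),
# and message (the rest of the line).
print(m.groupdict())
```

Running the original pattern (without the brackets and the time part) against the same sample fails at the very first character, `[`, which is why Kibana shows the raw message with no parsed fields.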

Additionally, I always double-quote the pattern, so the grok part would be:

grok {
    match => {"message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:loglevel}%{SPACE}-%{SPACE}%{DATA:class}%{SPACE}-%{GREEDYDATA:message}"}
}
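Putting this together with the conditional already in your logstash.conf, the full filter section would look like this (only the grok pattern changes; the rest is from your config):

    filter {
      if [fields][type] == "launcherlogs" {
        grok {
          match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:loglevel}%{SPACE}-%{SPACE}%{DATA:class}%{SPACE}-%{GREEDYDATA:message}" }
        }
      }
    }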