0 votes

I am working on an ELK-stack configuration. logstash-forwarder is used as the log shipper, and each type of log is tagged with a type field:

{
  "network": {
    "servers": [ "___:___" ],
    "ssl ca": "___",
    "timeout": 15
  },
  "files": [
    {
      "paths": [
        "/var/log/secure"
      ],
      "fields": { 
        "type": "syslog"
      }
    }
  ]
}

That part works fine... Now I want Logstash to split the message string into its parts; luckily, that is already implemented in the default grok patterns, so the logstash.conf remains simple so far:

input {
    lumberjack {
        port => 6782
        ssl_certificate => "___"
        ssl_key => "___"
    }
}
filter {
    if [type] == "syslog" {
        grok {
            match => [ "message", "%{SYSLOGLINE}" ]
        }
    }
}
output {
    elasticsearch {
        cluster => "___"
        template => "___"
        template_overwrite => true
        node_name => "logstash-___"
        bind_host => "___"
    }
}

The issue I have here is that the document received by Elasticsearch still holds the whole line (including the timestamp etc.) in the message field. Also, @timestamp still shows the date when Logstash received the message, which makes it hard to search, since Kibana queries @timestamp in order to filter by date... Any idea what I'm doing wrong?

Thanks, Daniel


2 Answers

3 votes

The reason your "message" field contains the original log line (including the timestamp etc.) is that the grok filter, by default, won't overwrite existing fields. In other words, even though the SYSLOGLINE pattern,

SYSLOGLINE %{SYSLOGBASE2} %{GREEDYDATA:message}

captures the message part into a "message" field, it won't overwrite the field's current value. The solution is to set the grok filter's "overwrite" parameter:

grok {
    match => [ "message", "%{SYSLOGLINE}" ]
    overwrite => [ "message" ]
}

To populate the "@timestamp" field, use the date filter. This will probably work for you:

date {
    match => [ "timestamp", "MMM dd HH:mm:ss", "MMM  d HH:mm:ss" ]
}
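Putting the two together, a filter section along these lines should work (a minimal sketch based on the snippets above; it assumes the syslog timestamp is captured into a field named "timestamp", which is what the SYSLOGLINE/SYSLOGBASE2 patterns normally produce):

filter {
    if [type] == "syslog" {
        grok {
            match => [ "message", "%{SYSLOGLINE}" ]
            overwrite => [ "message" ]
        }
        date {
            match => [ "timestamp", "MMM dd HH:mm:ss", "MMM  d HH:mm:ss" ]
        }
    }
}

If you don't want to keep the raw "timestamp" field once the date filter has used it, you can drop it with the standard remove_field option.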

0 votes

It is hard to know where the problem is without seeing an example event that causes it. I suggest trying the grok debugger to verify that the pattern is correct and to adjust it to your needs once you see the problem.
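For example, pasting a typical /var/log/secure line (the line below is made up) together with %{SYSLOGLINE} into the grok debugger should yield captures roughly like this:

Feb  3 12:34:56 myhost sshd[1234]: Accepted publickey for daniel from 10.0.0.5 port 51234 ssh2

timestamp => "Feb  3 12:34:56"
logsource => "myhost"
program   => "sshd"
pid       => "1234"
message   => "Accepted publickey for daniel from 10.0.0.5 port 51234 ssh2"

If the fields don't come out like that, the pattern (or the log format) is what needs adjusting.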