
My log file has this pattern:

[Sun Oct 30 17:16:09 2016] [TRACE_HIGH] [TEST1] MessageTest1
[Sun Oct 30 17:16:10 2016] [TRACE_HIGH] [TEST2] MessageTest2

Pattern:

\A\[%{HTTPDERROR_DATE}](?<message>(.|\r|\n)*)

Filter:

filter {
  if [type] == "mycustomlog" {
    grok {
      match => { "message" => "\A\[%{HTTPDERROR_DATE}](?<message>(.|\r|\n)*)"}
    }
    date {
      # Format: Wed Jan 13 11:50:44.327650 2016 (GROK: HTTPDERROR_DATE)
      match => [ "timestamp", "EEE MMM dd HH:mm:ss yyyy"]     
    }
    multiline {
      pattern => "^%{SYSLOG5424SD}%{SPACE}"
      what => "previous"
      negate => true
    }
  }
} 

I am trying to parse the datetime from my log into the @timestamp field, but I cannot get this format parsed. Why doesn't the date filter replace the @timestamp value?

My @timestamp is different from the log row:

row[0]

  • @timestamp: [Wed Nov 2 15:56:42 2016]
  • message: [Wed Nov 2 15:56:41 2016]


I am following this tutorial:

https://www.digitalocean.com/community/tutorials/adding-logstash-filters-to-improve-centralized-logging

Using:

Elasticsearch 2.2.x, Logstash 2.2.x, and Kibana 4.4.x

Grok Constructor result: (screenshot)

Comments:
Could you show the full content of the logs (from Kibana or ES)? It looks like the problem is not with the date parsing but with the grok filter: when testing it on grokconstructor.appspot.com/do/match#result, it does not match, so the timestamp field does not exist and the date filter has no field to work on. — baudsp
I am not a regex expert, but I think the problem comes from the second part, trying to match the \r and \n. — baudsp
I will try with just the timestamp and I'll be back with more information. — Vitorlui
@baudsp Sorry, I put another version of the pattern; please try with this one. — Vitorlui
The one you've put (\A\[%{HTTPDERROR_DATE}](?<message>(.|\r|\n)*)) does not create a field from %{HTTPDERROR_DATE}. You need to have %{HTTPDERROR_DATE:timestamp}. — baudsp

3 Answers

1 vote

The grok pattern used, \A\[%{HTTPDERROR_DATE}](?<message>(.|\r|\n)*), does not create a field from %{HTTPDERROR_DATE}.
You need the %{pattern:field} syntax so that the data captured by the pattern is stored in a field (cf. the grok documentation).

So in your case it would be like this:

\A\[%{HTTPDERROR_DATE:timestamp}](?<message>(.|\r|\n)*)
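For reference, the question's filter with only this one change applied might look like the following sketch (untested; the second date pattern is an assumption, added to cover single-digit days such as "Nov 2" in the question's sample):

```
filter {
  if [type] == "mycustomlog" {
    grok {
      # ":timestamp" stores the captured date in a field the date filter can read
      match => { "message" => "\A\[%{HTTPDERROR_DATE:timestamp}](?<message>(.|\r|\n)*)" }
    }
    date {
      # the date filter tries each pattern in order until one matches
      match => [ "timestamp", "EEE MMM dd HH:mm:ss yyyy", "EEE MMM d HH:mm:ss yyyy" ]
      remove_field => [ "timestamp" ]
    }
  }
}
```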
1 vote

I think Elasticsearch/Kibana's @timestamp doesn't accept the "EEE MMM dd HH:mm:ss yyyy" format directly. Hence, you can bring the timestamp into the "dd/MMM/yyyy:HH:mm:ss.SSSSSS" format using a mutate filter.

Snippet as below:

grok {
  match => [ "message", "\[%{DAY:day} %{MONTH:month} %{MONTHDAY:monthday} %{TIME:time} %{YEAR:year}\] %{GREEDYDATA:message}" ]
}
mutate {
  add_field => {
    "timestamp" => "%{monthday}/%{month}/%{year}:%{time}"
  }
}
date {
  locale => "en"
  timezone => "UTC"
  match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss.SSSSSS" ]
  target => "@timestamp"
  remove_field => ["timestamp", "monthday", "year", "month", "day", "time"]
}
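One caveat: the question's sample lines ([Sun Oct 30 17:16:09 2016]) carry no fractional seconds, so the single pattern "dd/MMM/yyyy:HH:mm:ss.SSSSSS" would likely fail to match the rebuilt timestamp "30/Oct/2016:17:16:09". The date filter accepts a list of patterns tried in order, so a variant such as the following (an untested sketch) should cover lines both with and without microseconds:

```
date {
  locale => "en"
  timezone => "UTC"
  # first pattern for timestamps with microseconds, second for those without
  match => [ "timestamp",
             "dd/MMM/yyyy:HH:mm:ss.SSSSSS",
             "dd/MMM/yyyy:HH:mm:ss" ]
  target => "@timestamp"
  remove_field => ["timestamp", "monthday", "year", "month", "day", "time"]
}
```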

It may help someone. Thanks!

1 vote

To apply the parsed date, you must set the target option so the field is overwritten:

target => "@timestamp"

For example:

date {
  match => [ "timestamp", "dd MMM yyyy HH:mm:ss" ]
  target => "@timestamp"
  locale => "en"
  remove_field => [ "timestamp" ]
}
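Note that "dd MMM yyyy HH:mm:ss" above is a generic example; for the log format in the question, the match pattern would need to follow the HTTPDERROR_DATE layout (day name first, year last). A sketch under that assumption, untested:

```
date {
  # "EEE MMM dd HH:mm:ss yyyy" mirrors e.g. "Sun Oct 30 17:16:09 2016";
  # the second pattern is a guess to cover single-digit days like "Nov 2"
  match => [ "timestamp", "EEE MMM dd HH:mm:ss yyyy", "EEE MMM d HH:mm:ss yyyy" ]
  target => "@timestamp"
  locale => "en"
  remove_field => [ "timestamp" ]
}
```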