I have a logstash pipeline that extracts a date from an apache log entry and saves it in a new field:
date {
  match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  target => "@apache_timestamp"
}
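That part works: with a typical Apache timestamp the event ends up with both fields, roughly like this (a made-up sample, not my real data):

  "timestamp"         => "10/Oct/2018:13:55:36 +0000",
  "@apache_timestamp" => "2018-10-10T13:55:36.000Z"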
I'd also like to be able to extract parts of this date into separate fields, for some specific reports.
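Concretely, I'd like each event to gain extra fields along these lines (field names and values are just illustrative):

  "hourOfDay"  => "13",
  "dayOfWeek"  => "Wed",
  "weekOfYear" => "41",
  "monthName"  => "October",
  "year"       => "2018"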
I've tried using the date plugin on the new date field from the log:
date {
  match => [ "@apache_timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  add_field => { "[hourOfDay]" => "%{+HH}" }
  add_field => { "[dayOfWeek]" => "%{+EEE}" }
  add_field => { "[weekOfYear]" => "%{+ww}" }
  add_field => { "[monthName]" => "%{+MMMM}" }
  add_field => { "[year]" => "%{+yyyy}" }
}
But it doesn't seem to add any new fields.
I've also tried using the grok plugin directly on the message:
grok {
  match => { "message" => ["%{HTTPDATE}"] }
  add_field => { "[hourOfDay]" => "%{HOUR}" }
  add_field => { "[monthName]" => "%{MONTH}" }
  add_field => { "[year]" => "%{YEAR}" }
}
This adds the fields, but they have the literal values %{HOUR}, %{MONTH}, etc.
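So the stored event ends up looking something like this (simplified):

  "hourOfDay" => "%{HOUR}",
  "monthName" => "%{MONTH}",
  "year"      => "%{YEAR}"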
How can I extract fields like "Day of week" and "week of year" from the Apache timestamp?
(I was able to extract the values I need using Kibana's scripted fields, but they seemed rather slow, and Kibana can't query scripted fields, so it's not a great solution.)
Using Logstash 6.0