I'm working on parsing the timestamp from the CouchDB log. The entire timestamp is processed correctly when viewed on stdout, but a _grokparsefailure tag appears when viewing the events in Kibana on top of Elasticsearch.
Example log line:
[Thu, 31 Jul 2014 17:14:28 GMT] [info] [<0.23047.647>] 10.30.50.48 - - HEAD /z_775-797_api_docs 200
I've followed these links for parsing the date format: http://logstash.net/docs/1.4.2/filters/date , http://joda-time.sourceforge.net/apidocs/org/joda/time/format/DateTimeFormat.html
My filter code is:
grok {
  match => { "message" => "%{SYSLOG5424SD:Log_Time} \[info] %{SYSLOG5424SD:response_time} %{IPV4:ip_address} - - %{WORD:http_method} %{URIPATH} %{INT:file_value}" }
}
date {
  match => [ "Log_Time", "[EEE, dd MMM YYYY HH:mm:ss zzz]" ]
}
My output code is :
output {
  elasticsearch { host => localhost }
  stdout { codec => json }
}
where Log_Time = [Thu, 31 Jul 2014 17:14:28 GMT] and the output timestamp is
"@timestamp":"2014-07-31T17:14:28.000Z"
The data shown on stdout is "@timestamp":"2014-07-31T17:14:28.000Z" with no grok parse error, but in Kibana the event time comes out as the ingestion timestamp and the _grokparsefailure tag is present. I don't understand why there is this difference between stdout and Kibana. I tried deleting all the Elasticsearch indices and the .sincedb files, but the errors remain.
Please help if you have any ideas.
In your date block, you need to say remove_field => timestamp. - Engineer2021
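Applying that comment, the date block would gain a remove_field option so the leftover field is dropped once the parsed value has been written to @timestamp. This is a minimal sketch, assuming the field name "timestamp" from the comment above is the one conflicting in your events (adjust it if your event carries the value under a different name, e.g. Log_Time):

```
date {
  match        => [ "Log_Time", "[EEE, dd MMM YYYY HH:mm:ss zzz]" ]
  # Drop the conflicting field after a successful parse, per the suggestion above.
  remove_field => [ "timestamp" ]
}
```

Note that remove_field inside the date filter only runs when the parse succeeds, so a genuine parse failure will still leave the field (and the failure tag) on the event for debugging.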