Summary:
logstash -> elasticsearch -> "Failed parsing date" shown in debug output
Events in the log file contain the field @timestamp (format: 2014-06-18T11:52:45.370636+02:00)
Events are actually indexed into Elasticsearch, but "Failed parsing" errors are shown.
Versions:
Logstash 1.4.1
Elasticsearch 1.2.0
Is there something I'm doing wrong?
I have log files that contain events like this:
{"@timestamp":"2014-06-18T11:52:45.370636+02:00","Level":"Info","Machine":"X100","Session":{"MainId":0,"SubId":"5otec"},"Request":{"Url":"http://www.localhost:5000/Default.aspx","Method":"GET","Referrer":"http://www.localhost:5000/Default.aspx"},"EndRequest":{"Duration":{"Main":0,"Page":6720}}}
I use this logstash config:
input {
  file {
    path => "D:/testlogs/*.log"
    codec => "json"
    start_position => "beginning"
  }
}

filter {
  date {
    match => [ "@timestamp", "ISO8601" ]
  }
}

output {
  stdout {
    codec => json
  }
  elasticsearch {
    protocol => "http"
    host => "10.125.26.128"
  }
}
When I run Logstash with this config on the events in the log files, I get the following error:
Failed parsing date from field {:field=>"@timestamp", :value=>"2014-06-18T12:18:34.717+02:00", :exception=>#<TypeError: cannot convert instance of class org.jruby.RubyTime to class java.lang.String>
The thing is, the events actually do get imported into Elasticsearch, but I keep seeing these errors.
Can this be a problem, or can these "failed parsing" errors be ignored?
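For what it's worth, my reading of the TypeError is that the json codec already converts @timestamp into a time object when the event is read, so the date filter receives a RubyTime instead of the string it expects. If that's right (and this is just my assumption), a config without the date filter might avoid the warning entirely, since @timestamp is already set:

input {
  file {
    path => "D:/testlogs/*.log"
    codec => "json"
    start_position => "beginning"
  }
}

# No date filter here: the assumption is that the json codec has
# already parsed @timestamp, which would make the date filter redundant.

output {
  stdout {
    codec => json
  }
  elasticsearch {
    protocol => "http"
    host => "10.125.26.128"
  }
}

Would dropping the filter like this be the right approach, or does the date filter still serve a purpose here?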