I can't parse a date field using the Logstash date plugin. My config is as follows:
if "test" in [tags] {
csv {
separator => ","
columns => [ "value", "received_date" ]
convert => {
"value" => "float"
}
}
mutate {
gsub => [ "received_date" , ".\d*$" , ""]
}
date {
match => [ "received_date", "yyyy-MM-dd HH:mm:ss" ]
}
}
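For context, an input line looks roughly like this (the value is made up; the date format is taken from the error below, and the gsub is there to strip the trailing fractional seconds):

  1.23,2018-06-19 11:51:15.123

so by the time the date filter runs, received_date is "2018-06-19 11:51:15".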
I get the error:
[2018-06-19T11:51:20,583][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"f2d34d84-1ea4-4510-8237-2329a4d1ffba", :_index=>"logstash-2018.06.19", :_type=>"doc", :_routing=>nil}, #], :response=>{"index"=>{"_index"=>"logstash-2018.06.19", "_type"=>"doc", "_id"=>"f2d34d84-1ea4-4510-8237-2329a4d1ffba", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [received_date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2018-06-19 11:51:15\" is malformed at \" 11:51:15\""}}}}}
If I add a target:
date {
  match => [ "received_date", "yyyy-MM-dd HH:mm:ss" ]
  target => "received_date"
}
Then it works, but the @timestamp field takes the time Logstash received the event, which is not what I want.
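What I'd like is for @timestamp to hold the parsed received_date while received_date itself still indexes cleanly; something like the following is what I had in mind (just a sketch, I don't know whether chaining two date filters like this is the right approach):

  date {
    match => [ "received_date", "yyyy-MM-dd HH:mm:ss" ]
    # no target, so this should set @timestamp
  }
  date {
    match => [ "received_date", "yyyy-MM-dd HH:mm:ss" ]
    target => "received_date"
  }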
Why would the target impact the date parsing?