I have a JSON file with records like this one:
{"id":1,"first_name":"Frank","last_name":"Mills","date":"5/31/2014","email":"[email protected]","country":"France","city":"La Rochelle","latitude":"46.1667","longitude":"-1.15"
and I'm trying to filter the fields in Logstash, so far without success. I tried the Grok Debugger and the GrokConstructor but cannot make it work. My last attempt is:
input {
    file {
        path => ["C:/logstash-1.4.2/mock_data.json"]
        type => "json"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

filter {
    mutate {
        replace => [ "message", "%{message}" ]
    }
    json {
        source => "message"
        remove_field => "message"
    }
    mutate {
        convert => [ "latitude", "float" ]
        convert => [ "longitude", "float" ]
    }
    mutate {
        rename => [ "latitude", "[location][lat]", "longitude", "[location][lon]" ]
    }
}

output {
    stdout {
        codec => rubydebug
    }
    elasticsearch {
        host => "127.0.0.1"
        protocol => "http"
        index => "test35"
    }
}
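To make the goal concrete, this is roughly what I hope to get in the rubydebug output for the first record (written by hand from the sample record above, not actual Logstash output): latitude and longitude converted to floats and nested under a location field, the other fields left as they are, plus the usual @version, @timestamp, host and path fields:

{
            "id" => 1,
    "first_name" => "Frank",
     "last_name" => "Mills",
          "date" => "5/31/2014",
         "email" => "[email protected]",
       "country" => "France",
          "city" => "La Rochelle",
      "location" => {
        "lat" => 46.1667,
        "lon" => -1.15
    }
}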
The last two mutate blocks deal only with latitude and longitude, but it doesn't work. Is there a tutorial for Logstash that covers JSON in particular? Any help would be appreciated. The output for this configuration is:
{
       "message" => "{\"id\":91,\"first_name\":\"Adam\",\"last_name\":\"Carr\",\"date\":\"11/14/2014\",\"email\":\"[email protected]\",\"country\":\"Ghana\",\"city\":\"Mampong\",\"latitude\":\"7.06273\",\"longitude\":\"-1.4001\"},",
      "@version" => "1",
    "@timestamp" => "2015-05-04T19:05:08.409Z",
          "host" => "Toshiba",
          "path" => "C:/logstash-1.4.2/mock_data.json",
          "tags" => [
        [0] "_jsonparsefailure"
    ]
}
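My guess is that the _jsonparsefailure comes from the trailing "}," visible at the end of the raw message above: the file is read line by line, and each line is one record out of a JSON array, so it ends with "}," (and the first/last lines carry the "[" and "]"), which is not valid standalone JSON. If that is the problem, I suppose I could strip those wrapper characters before the json filter with a mutate gsub, something along these lines (the gsub patterns are my own guess, not taken from any tutorial):

filter {
    # strip a leading "[", a trailing "]" and a trailing "," so that each
    # line becomes a standalone JSON object before the json filter sees it
    mutate {
        gsub => [
            "message", "^\\[", "",
            "message", "\\]$", "",
            "message", ",$", ""
        ]
    }
    json {
        source => "message"
        remove_field => "message"
    }
}

I haven't verified this, and I'd also like to know whether using codec => "json" on the file input (instead of type => "json" plus the json filter) would be the more idiomatic way to handle a file like this.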
Update for Alcanzar: