I am trying to figure out how to read a JSON file into Logstash when the entire document, containing all of the events, sits on one single line.
Sample input:
{"metadata": {"metadata fields": "metadata data"},"results": [{"events":[{"event fields": "event data"}, {"event fields": "event data"}]}],"field": {"more fields": "data"}}
Expanded JSON:
{
  "metadata": {
    "metadata fields": "metadata data"
  },
  "results": [{
    "events": [{
      "event fields": "event data"
    }, {
      "event fields": "event data"
    }]
  }],
  "field": {
    "more fields": "data"
  }
}
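For reference, with the brackets balanced this parses as a single object with three top-level keys. A quick sanity check (the document string is inlined here purely for illustration):

```python
import json

# The sample document with balanced brackets, inlined for illustration.
doc = ('{"metadata": {"metadata fields": "metadata data"},'
       '"results": [{"events": [{"event fields": "event data"},'
       ' {"event fields": "event data"}]}],'
       '"field": {"more fields": "data"}}')

parsed = json.loads(doc)
print(list(parsed))                          # ['metadata', 'results', 'field']
print(len(parsed["results"][0]["events"]))   # 2
```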
I have tried just using the json codec; however, when I run Logstash it hangs after printing that it successfully started. Another thing I tried that gets it to work is adding a newline at the end of the JSON — I suspect the file input reads line by line and never sees a completed line without it — but this won't work in production because I have no control over the source of the logs.
Does anyone have any suggestions on how to correctly parse these logs with Logstash? My configuration file is below. Thanks in advance!
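For reference, the newline workaround I described boils down to the following (sketched with a hypothetical sample.json standing in for one of the C:/Folder/*.json files):

```python
from pathlib import Path

# Hypothetical stand-in for one of the C:/Folder/*.json files.
p = Path("sample.json")
p.write_text('{"metadata": {"fields": "data"}}')   # no trailing newline
print(p.read_text().endswith("\n"))   # False: a line-oriented reader waits forever

# The workaround: append the missing newline so the last line completes.
with p.open("a") as f:
    f.write("\n")
print(p.read_text().endswith("\n"))   # True: the event line is now complete
```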
Config
input {
  file {
    path => "C:/Folder/*.json"
    sincedb_path => "C:\nul"
    start_position => "beginning"
    codec => "json"
    type => "data"
  }
}
output {
  stdout { codec => rubydebug }
}
A grok filter will do the job. - Fairy
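For what it's worth, a minimal sketch of what that grok-based filter chain could look like — the pattern and the payload field name here are illustrative, not taken from your data, and this assumes the line is actually delivered to the pipeline (the trailing-newline issue with the file input still applies):

```
filter {
  # Capture the entire line into a temporary field...
  grok {
    match => { "message" => "%{GREEDYDATA:payload}" }
  }
  # ...then let the json filter expand it into structured fields.
  json {
    source => "payload"
    remove_field => ["payload"]
  }
}
```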