
I'm using Logstash to process my logs and store them in Elasticsearch, with the http input plugin.

My http post request is:

$http.post(url, {type: 'reference error', message: 'y is not defined'});

I would like to store the type and message keys as separate fields in Elasticsearch.

Currently, all of the POST data is stored as a single field, like this:

"message":"{\"type\":\"ReferenceError\",\"message\":\"y is not defined\"}"

I think this can be done with a grok filter, but I have not been able to find a way to do it.

Any help is highly appreciated. Thanks.

2 Answers


If you use the json codec, the information should be split out into fields for you automatically.
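As a minimal sketch (the port number is an assumption; use whatever your client posts to), the codec can be set directly on the input:

```
input {
  http {
    port  => 8080    # assumed port
    codec => json    # parses the JSON request body into top-level event fields
  }
}
```

With this in place, `type` and `message` arrive as separate fields without any filter stage.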


EDIT: As Alain mentioned, the best approach is to use the json codec, which can be set directly on your http input plugin. If that is not possible for some reason, you can use a grok filter instead.

If I understand you correctly, your incoming event looks like this:

{"type": "reference error", "message": "y is not defined"}

Then a corresponding grok pattern would look like this:

{"type": %{QUOTEDSTRING:http_type}, "message": %{QUOTEDSTRING:http_message}}

In your Logstash configuration:

grok {
    match => [ "message", "{\"type\": %{QUOTEDSTRING:http_type}, \"message\": %{QUOTEDSTRING:http_message}}" ]
}

Then the result will have the two fields http_type and http_message.
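One caveat: grok's QUOTEDSTRING pattern matches the surrounding double quotes as part of the capture, so the extracted fields will still contain quote characters. If that is unwanted, a mutate filter can strip them afterwards (a sketch, using the same field names as above):

```
filter {
  grok {
    match => [ "message", "{\"type\": %{QUOTEDSTRING:http_type}, \"message\": %{QUOTEDSTRING:http_message}}" ]
  }
  mutate {
    # remove the double quotes captured by QUOTEDSTRING
    gsub => [ "http_type", "\"", "",
              "http_message", "\"", "" ]
  }
}
```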