
I'm trying to add a custom pattern to Logstash in order to capture data from this kind of log line:

[2017-11-27 12:08:22] production.INFO: {"upload duration":0.16923}

I followed the instructions in the Logstash grok documentation and created a directory called patterns with a file in it called extra that contains:

POSTFIX_UPLOAD_DURATION upload duration

and added the path to the config file:

grok {
        patterns_dir => ["./patterns"]
        match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{POSTFIX_UPLOAD_DURATION: upload_duration} %{DATA:log_env}\.%{LOGLEVEL:severity}: %{GREEDYDATA:log_message}" }
    }

However, I'm getting this error message:

Pipeline aborted due to error {:exception=>#<Grok::PatternError: pattern %{POSTFIX_UPLOAD_DURATION: upload_duration} not defined>

Also, some log lines don't contain the 'upload duration' field; will this break the pipeline?


2 Answers


You can use relative directories, as long as they are relative to the current working directory of the process when it starts — not relative to the config file or to the Logstash installation itself.
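Because the working directory can change depending on how Logstash is launched (service manager, shell, container), an absolute path is safer. A minimal sketch, assuming a hypothetical /etc/logstash/patterns directory holding the extra file (note also that grok does not allow a space after the colon in a pattern reference):

```
grok {
    # absolute path — independent of where the Logstash process was started
    patterns_dir => ["/etc/logstash/patterns"]
    match => { "message" => "%{POSTFIX_UPLOAD_DURATION:upload_duration}" }
}
```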


I found out that there is a better and more efficient way to capture this data, using the json plugin.

I added a "log_payload:" marker to my logs and put the data I need to capture in a JSON object. Then I used this pipeline to extract it:

    if ("log_payload:" in [log_message]) {
        grok {
            # DATA is non-greedy, so it stops before the first '}' in the payload
            match => { "log_message" => "log_payload:%{DATA:json_object}}%{GREEDYDATA}" }
        }
        mutate {
            # re-append the '}' that grok consumed as a literal
            update => { "json_object" => "%{json_object}}" }
        }
        json {
            source => "json_object"
        }
        mutate {
            remove_field => ["log_message", "json_object"]
        }
    }
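The chain works only because %{DATA} is non-greedy: it captures up to the first '}', which the pattern then consumes literally, so the mutate filter has to re-append it. A sketch of the same logic in Python, assuming a flat JSON payload with no nested braces (the log line below is hypothetical):

```python
import json
import re

# Hypothetical log line: a "log_payload:" marker, a flat JSON object, trailing text.
log_message = 'log_payload:{"upload duration":0.16923} request finished'

# (.*?) mirrors %{DATA:json_object}: non-greedy, stops before the first '}'.
m = re.search(r'log_payload:(.*?)}', log_message)

# Re-append the '}' the pattern consumed, like the mutate update above.
json_object = m.group(1) + '}'
payload = json.loads(json_object)
print(payload['upload duration'])  # 0.16923
```

A payload containing nested objects would break this, since the capture would stop at the first inner '}' instead of the last one.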