6
votes

I have been following this guide:

http://deviantony.wordpress.com/2014/06/04/logstash-debug-configuration/

I'm hoping it will help me test my Logstash filters to see if I get the desired output before putting them into full-time use.

The guide tells you to set up an input, an output, and then a filter file. The input seems to work fine:

input {
    stdin { }
}

The output is this:

output {
    stdout {
        codec => json
    }
    file {
        codec => json
        path => /tmp/debug-filters.json
    }
}

I get the following error when I try to run the Logstash process (here I've run it with --configtest, as the error suggests, but that doesn't give any more information):

# /opt/logstash/bin/logstash -f /etc/logstash/debug.d -l /var/log/logstash/logstash-debug.log --configtest

Sending logstash logs to /var/log/logstash/logstash-debug.log.
Error: Expected one of #, ", ', -, [, { at line 21, column 17 (byte 288) after output {
    stdout {
        codec => json
    }
    file {
        codec => json
        path =>

I have tried removing the file section from my output, and with that the Logstash process runs, but when I paste my log line into the shell I don't see the entry broken down into the components I expect the grok filter to produce. All I get is:

Oct 30 08:57:01 VERBOSE[1447] logger.c: == Manager 'sendcron' logged off from 127.0.0.1
{"message":"Oct 30 08:57:01 VERBOSE[1447] logger.c: == Manager 'sendcron' logged off from 127.0.0.1","@version":"1","@timestamp":"2014-10-31T16:09:35.205Z","host":"lumberjack.domain.com"}

Initially I was having a problem with a new grok filter, so I have now tried an existing filter that I know works (as shown above, it is an Asterisk 1.2 filter) and that has been putting entries into Elasticsearch for some time.
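
For context, my filter file has this general shape (the grok pattern here is only a rough illustration against the sample line above, not the exact Asterisk filter I'm running):

filter {
    grok {
        # Illustrative pattern only - the real Asterisk 1.2 filter is more involved
        match => [ "message", "%{SYSLOGTIMESTAMP:timestamp} %{WORD:level}\[%{INT:thread_id}\] %{DATA:source}: %{GREEDYDATA:asterisk_message}" ]
    }
}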

I have tried touching the JSON file mentioned in the output, but that hasn't helped.

When I tail the logstash-debug.log now I just see the error that is also being written to my shell.

Any suggestions on debugging grok filters would be appreciated. If I have missed something blindingly obvious, apologies; I've only been working with ELK and grok for a couple of weeks, and I may not be going about this in the most sensible way. I was hoping to drop example log entries into the shell and get the JSON-formatted Logstash event back on my console, so I could see whether my filter was working as intended and tagging entries the way they will be displayed in Kibana. If there is a better way to do this, please let me know.

I am using Logstash 1.4.2.


2 Answers

10
votes

For debugging a grok filter, you can use the Grok Debugger (http://grokdebug.herokuapp.com/). It has a very comprehensive pattern detector, which is a good starting point.

As for your file output, you need double quotes ("") around the path. The file output is documented at http://logstash.net/docs/1.4.2/outputs/file#path. Here is the example I use in production:

output {
    stdout {
        codec => rubydebug
    }
    file {
        codec => "plain"
        path => "./logs/logs-%{+YYYY-MM-dd}.txt"
    }
}
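
Applied to your config, that just means quoting the path you already have:

file {
    codec => json
    path => "/tmp/debug-filters.json"
}
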
5
votes

Grokconstructor is a Grok debugger similar to Grokdebug, which @user3195649 mentioned. I like its random examples.
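
For example, you can paste your sample log line into either tool and build the pattern up a field at a time, starting coarse (this is only a rough starting point, not your exact Asterisk filter):

%{SYSLOGTIMESTAMP:timestamp} %{GREEDYDATA:rest}

Once that matches, replace the GREEDYDATA piece with more specific patterns until every field you want shows up in the parsed output.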