
I'm having some problems getting logstash to recognize my pattern which seems to match on the Grok Debugger (https://grokdebug.herokuapp.com/).

It's a similar problem to the one found on this other StackOverflow question (logstash _grokparsefailure issues), but unfortunately the solution there does not seem to work.

These are the logs I'm trying to match:

Mon Jan 25 11:12:12.890 [conn44141] authenticate db: admin { authenticate: 1, user: "person", nonce: "f00000000f", key: "a0000000000e" }

"2015-01-25 14:46:31"   id=Admin      id=Admin,ou=user,dc=gooogle-wa,dc=com       a000000a      100.00.00.01    INFO    dc=gooooogle-wa,dc=com  "cn=user,ou=AME Users,dc=goooogle,dc=com"    BARF-4       aO.access    "Not Available" 100.00.00.01

The patterns I'm using to parse these are, respectively:

 if [type] == "openam" {
   if [file] =~ "access" {
     grok {
       match => [ 'message', '\"%{TIMESTAMP_ISO8601:timestamp}\"(\s*)(%{QUOTEDSTRING:data_}|%{DATA:data_})(\s*)(%{QUOTEDSTRING:LoginID}|%{DATA:LoginID})(\s*)%{DATA:ContextID}(\s*)(\"%{DATA:IP}\"|%{IP:IP})(\s*)?%{LOGLEVEL:loglevel}(\s*)%{DATA:Domain}(\s*)\"%{DATA:LoggedBy}\"(\s*)(?<messageID>[a-zA-Z0-9._-]+)(\s*)(%{DATA:ModuleName})(\s*)\"%{DATA:NameID}\"(\s*)(%{IP:hostname}|%{GREEDYDATA:hostname}) ' ]
       add_tag => "openam_access"
     }
   }
   else if [file] =~ "error" {
     grok {
       match => [ 'message', '\"%{TIMESTAMP_ISO8601:timestamp}\"(\s*)(%{QUOTEDSTRING:data_}|%{DATA:data_}) (\s*)(%{QUOTEDSTRING:LoginID}|%{DATA:LoginID}) (\s*)%{DATA:ContextID}(\s*)(\"%{DATA:IP}\"|%{IP:IP})(\s*)?%{LOGLEVEL:loglevel}(\s*)%{DATA:Domain}(\s*)\"%{DATA:LoggedBy}\"(\s*)(?<messageID>[a-zA-Z0-9._-]+)(\s*)(%{DATA:ModuleName})(\s*)\"%{DATA:NameID}\"(\s*)(%{IP:hostname}|%{GREEDYDATA:hostname})' ]
       add_tag => "openam_error"
     }
   }
 }




  if [type] == "mongo" {
    grok {
      match => [
        "message", "(?m)%{GREEDYDATA} \[conn%{NUMBER:mongoConnection}\] %{WORD:mongoCommand} %{WORD:mongoDatabase}.%{NOTSPACE:mongoCollection} %{WORD}: \{ %{GREEDYDATA:mongoStatement} \} %{GREEDYDATA} %{NUMBER:mongoElapsedTime:int}ms",
        "message", "%{DATA:DayOfWeek} %{SYSLOGTIMESTAMP:timestamp} %{DATA:Thread} %{GREEDYDATA:msg} %{IP:ip}:%{NUMBER:port} ?#?%{NUMBER:ID}? %{GREEDYDATA:connections} ",
        'message', '%{DATA:DayOfWeek} %{SYSLOGTIMESTAMP:timestamp} %{DATA:Thread} %{DATA:msg}: %{WORD:userType} \{ authenticate: %{NUMBER:authenticate}, user: %{QS:user}, nonce: %{QS:nonce}, key: %{QS:key} \}'
      ]
      add_tag => "mongodb"
    }
  }

As you can check, the patterns work fine in the debugger, but for some reason the events show up on my Kibana dashboard with the _grokparsefailure tag. I suspect it has to do either with how I'm escaping characters or with my use of %{QS}/%{QUOTEDSTRING}.
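As a quick sanity check outside Logstash, the last mongo pattern can be approximated with a plain Python regex against the sample authenticate line. This is only a hand-expanded stand-in (the named groups mirror the grok field names, but `SYSLOGTIMESTAMP`, `QS`, etc. are approximated, not the real grok definitions):

```python
import re

# Sample mongo log line from above.
line = ('Mon Jan 25 11:12:12.890 [conn44141] authenticate db: admin '
        '{ authenticate: 1, user: "person", nonce: "f00000000f", key: "a0000000000e" }')

# Rough regex equivalent of:
# %{DATA:DayOfWeek} %{SYSLOGTIMESTAMP:timestamp} %{DATA:Thread} %{DATA:msg}: %{WORD:userType} { ... }
pattern = re.compile(
    r'(?P<DayOfWeek>\w+) '
    r'(?P<timestamp>\w+ +\d+ [\d:.]+) '      # approximation of SYSLOGTIMESTAMP
    r'(?P<Thread>\[\w+\]) '
    r'(?P<msg>.*?): '
    r'(?P<userType>\w+) '
    r'\{ authenticate: (?P<authenticate>\d+), '
    r'user: "(?P<user>[^"]*)", '             # approximation of QS (quoted string)
    r'nonce: "(?P<nonce>[^"]*)", '
    r'key: "(?P<key>[^"]*)" \}'
)

m = pattern.match(line)
```

If a stand-in like this matches but Logstash still tags the event, the problem is more likely in the pipeline configuration than in the pattern itself.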

Thanks


2 Answers

2
votes

Your patterns appear to be fine, but with

filter {
  grok {
    ...
  }
  grok {
    ...
  }
}

you're applying both patterns to all input strings, and an input string that matches the first pattern will never match the second and vice versa. Hence you always get the _grokparsefailure tag.
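The failure mode can be sketched in plain Python, with toy regexes standing in for grok patterns (illustrative only, not Logstash internals): two separate filters each see every event, so one of them always fails, while one filter holding both patterns succeeds as soon as either matches:

```python
import re

# Toy stand-ins for the two grok patterns, one per log format.
PATTERNS = [re.compile(r'^\d{4}-\d{2}-\d{2}'),   # "pattern1": ISO-dated lines
            re.compile(r'^[A-Z][a-z]{2} ')]      # "pattern2": day-name lines

def grok_filter(event, patterns):
    """Mimic one grok filter: try its patterns in order, tag on total failure."""
    if not any(p.search(event['message']) for p in patterns):
        event['tags'].append('_grokparsefailure')
    return event

# Two separate filters, one pattern each: the non-matching one always tags.
event = {'message': '2015-01-25 14:46:31 ...', 'tags': []}
for p in PATTERNS:
    grok_filter(event, [p])

# One filter with both patterns: the first match wins, no failure tag.
event2 = {'message': '2015-01-25 14:46:31 ...', 'tags': []}
grok_filter(event2, PATTERNS)
```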

Do this instead:

filter {
  grok {
    match => ['message', 'pattern1',
              'message', 'pattern2']
  }
}

If you really have to use different grok filters, condition their inclusion on a sneak peek at the message:

filter {
  if [message] =~ /^(Mon|Tue|Wed|Thu|Fri|Sat|Sun) / {
    grok {
      match => ['message', 'pattern1']
    }
  }
  ...
}

This will obviously be slower and means you'll have more regular expressions to maintain.
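The routing that the conditional performs can be mimicked in plain Python (an illustrative stand-in for the peek, not Logstash semantics):

```python
import re

# Peek at the message to decide which (toy) pattern set applies:
# mongo lines start with a day name, openam lines with a quoted ISO timestamp.
DAY_PREFIX = re.compile(r'^(Mon|Tue|Wed|Thu|Fri|Sat|Sun) ')

def route(message):
    """Pick the filter branch based on a cheap prefix check."""
    return 'mongo' if DAY_PREFIX.match(message) else 'openam'

mongo_branch = route('Mon Jan 25 11:12:12.890 [conn44141] ...')
openam_branch = route('"2015-01-25 14:46:31"   id=Admin ...')
```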

0
votes

I've figured it out. It turns out there was another error that was preventing my Logstash config from being reloaded at all. I highly recommend ./logstash --configtest to anyone in a similar spot.