
I'm having issues with grok parsing. In Elasticsearch/Kibana, the lines I try to match come up with the tag _grokparsefailure.

Here is my Logstash config:

input {
  snmptrap {
    yamlmibdir => "/opt/logstash/vendor/bundle/jruby/1.9/gems/snmp-1.2.0/data/ruby/snmp/mibs"
    codec => plain {
      charset => "BINARY"
    }
    type => "snmptrap"
  }
}

filter {
  if [type] == "snmptrap" {
    grok {
      match => { "message" => "%{IP:@source_ip=\\""}" }
      add_field => { "source_ip" => "%{@source_ip=\"}" }
    }
  }
}

output {
  elasticsearch { hosts => localhost }
}

My input looks like this:

"message" => "#@enterprise=[1.3.6.1.3.92.1.1.7], @timestamp=#@value=802993822>, @varbind_list=[#@name=[1.3.6.1.3.92.1.1.5.1.3.202.169.174.90], @value=#@value=1>>], @specific_trap=2, @source_ip=\"10.10.10.12\", @agent_addr=#@value=\"\xC0\xA8\a\f\">, @generic_trap=6>",

How can I extract the @source_ip value and add it as a new field?

Can somebody give me a hint on how to fix this?


1 Answer


I'm surprised your config even compiles. It's worth reading up a bit more on how grok works; for example, you typically don't need add_field with grok, because the pattern itself creates the field.

The grok pattern includes the literal text that comes before and after the data you want to capture. These pieces "anchor" your regular expression. In your case, that's the '@source_ip=\"' at the start and the '\"' at the end; the part in the middle is what you actually capture into a new field.

Your pattern would look more like this:

match => { "message" => "@source_ip=\\\"%{IP:source_ip}\\\"" }

This should give you a new "source_ip" field that contains the value.
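Put together, the corrected filter block might look like this (a sketch; the input and output sections stay as in the question):

```
filter {
  if [type] == "snmptrap" {
    grok {
      # Anchored on the literal @source_ip=\" ... \" around the address
      match => { "message" => "@source_ip=\\\"%{IP:source_ip}\\\"" }
    }
  }
}
```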

The \\\" escapes the slashes and the quotes and will match \".