I'm having issues with grok parsing. In Elasticsearch/Kibana, the lines I match come up with the tag _grokparsefailure.
Here is my Logstash config:
input {
  snmptrap {
    yamlmibdir => "/opt/logstash/vendor/bundle/jruby/1.9/gems/snmp-1.2.0/data/ruby/snmp/mibs"
    codec => plain {
      charset => "BINARY"
    }
    type => "snmptrap"
  }
}
filter {
  if [type] == "snmptrap" {
    grok {
      match => { "message" => "%{IP:@source_ip=\\""}" }
      add_field => { "source_ip" => "%{@source_ip=\"}" }
    }
  }
}
output {
  elasticsearch { hosts => "localhost" }
}
My input looks like this:
"message" => "#@enterprise=[1.3.6.1.3.92.1.1.7], @timestamp=#@value=802993822>, @varbind_list=[#@name=[1.3.6.1.3.92.1.1.5.1.3.202.169.174.90], @value=#@value=1>>], @specific_trap=2, @source_ip=\"10.10.10.12\", @agent_addr=#@value=\"\xC0\xA8\a\f\">, @generic_trap=6>",
How can I extract @source_ip from the message and add a new field containing its value?
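This is roughly what I had in mind, but I am not sure it is correct (just a sketch: it assumes the \" sequences in the printed message are only inspect escaping, so the stored string really contains plain double quotes, and it captures into an ordinary field name source_ip instead of @source_ip):

filter {
  if [type] == "snmptrap" {
    grok {
      # anchor on the literal @source_ip=" text in the trap string and
      # capture the address that follows it into a new field, source_ip
      match => { "message" => '@source_ip="%{IP:source_ip}"' }
    }
  }
}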
Can somebody give me a hint on how I can fix the problem?