
This is kind of a follow-up to another one of my questions: JSON parser in logstash ignoring data? But this time I feel like the problem is clearer than last time and might be easier for someone to answer.

I'm using the JSON parser like this:

json {
    # Parse all the JSON
    source => "MFD_JSON"
    target => "PARSED"
    add_field => { "%{FAMILY_ID}" => "%{[PARSED][platform][family_id][1]}_%{[PARSED][platform][family_id][0]}" }
}

The part of the output for one of the logs in logstash.stdout looks like this:

        "FACILITY_NUM" => "1",
       "LEVEL_NUM" => "7",
         "PROGRAM" => "mfd_status",
       "TIMESTAMP" => "2016-01-12T11:00:44.570Z",
       MORE FIELDS

There are a whole bunch of fields like the ones above that work when I remove the JSON code. When I add the JSON filter, the whole log just disappears from Elasticsearch/Kibana for some reason. The bit added by the JSON filter is below:

"PARSED" => {  
    "platform" => {
               "boot_mode" => [
            [0] 2,
            [1] "NAND"
        ],
                "boot_ver" => [
            [0] 6,
            [1] 1,
            [2] 32576,
            [3] 0
        ],
            WHOLE LOT OF OTHER VARIABLES

               "family_id" => [
            [0] 14,
            [1] "Hatchetfish"
        ],
            A WHOLE LOT MORE VARIABLES
    },
       "flash" => [
        [0] 131072,
        [1] 7634944
    ],
      "can_id" => 1700,
     "version" => {
          "kernel" => "3.0.35 #2 SMP PREEMPT Thu Aug 20 10:40:42 UTC 2015",
        "platform" => "17.0.32576-r1",
         "product" => "next",
             "app" => "53.1.9",
            "boot" => "2013.04 (Aug 20 2015 - 10:33:51)"
    }
},
    "%{FAMILY_ID}" => "Hatchetfish 14"

Let's pretend the JSON won't work; I'm okay with that for now, but it shouldn't mess with everything else to do with the log in Elasticsearch/Kibana. Also, at the end I've got FAMILY_ID as a field that I added separately using add_field. At the very least that should show up, right?

If someone's seen something like this before, it would be a great help. Also, sorry for spamming almost the same question twice.

SAMPLE LOG LINE:

1452470936.88 1448975468.00 1 7 mfd_status 000E91DCB5A2 load {"up":[38,1.66,0.40,0.13],"mem":[967364,584900,3596,116772],"cpu":[1299,812,1791,3157,480,144],"cpu_dvfs":[996,1589,792,871,396,1320],"cpu_op":[996,50]}
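
For reference, here is roughly how a line like that can be split so the JSON payload lands in MFD_JSON. This is just a sketch with grok (I've since swapped to the csv parser, and most of these field names are assumptions; only FACILITY_NUM, LEVEL_NUM, PROGRAM, and MFD_JSON appear in my real config):

grok {
    # Sketch only: split the sample line into fields; everything after
    # "load" is captured as MFD_JSON for the json filter shown above.
    match => {
        "message" => "%{NUMBER:UPTIME} %{NUMBER:EPOCH} %{NUMBER:FACILITY_NUM} %{NUMBER:LEVEL_NUM} %{WORD:PROGRAM} %{WORD:MAC} %{WORD:LOG_TYPE} %{GREEDYDATA:MFD_JSON}"
    }
}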

The sample line will be parsed (everything after load is JSON), and in stdout I can see that it is parsed successfully, but I don't see it in Elasticsearch. This is my output code:

elasticsearch {
    hosts => ["localhost:9200"]
    document_id => "%{fingerprint}"
}
stdout { codec => rubydebug }
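
Worth noting about that output: document_id => "%{fingerprint}" only resolves if a fingerprint field already exists on the event. If it doesn't, every document gets the literal _id %{fingerprint}, so each new event overwrites the previous one, which would also make logs seem to vanish. A minimal sketch of a fingerprint filter that guarantees the field exists (the source field here is an assumption):

fingerprint {
    source => "message"      # hash the whole raw line (an assumption)
    target => "fingerprint"  # the field document_id references above
    method => "MURMUR3"      # keyless hash method, no key => needed
}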

A lot of my logstash filter is in the other question, but I think all the relevant parts are in this question now. If you want to check it out, here's the link: JSON parser in logstash ignoring data?

If you want people to help out on this one, you need to share your logstash config (at least the relevant parts) as well as one real sample log line that you know is failing with your current configuration. – Val
@Val I updated the question to have some logs and code. A lot of my logstash config is in my previous question, except I swapped out grok for the csv parser, but that shouldn't be affecting this because it was doing the same thing before. – Swikrit Khanal
@Val I just tested it with all of the filter removed except for the JSON filter (with just the source setting), and only with the log example in the question (just the JSON part of it). It parsed like before in logstash.stdout, but didn't show in Kibana. So I have to assume either my output is wrong or I'm using the JSON filter wrong. – Swikrit Khanal

1 Answer


Answering my own question here. It's not the ideal answer, but if anyone has a problem similar to mine, you can try this out.

json {
    # Parse all the JSON
    source => "MFD_JSON"
    target => "PARSED"
    add_field => { "%{FAMILY_ID}" => "%{[PARSED][platform][family_id][1]}_%{[PARSED][platform][family_id][0]}" }
}

That's how I parsed all the JSON before; I kept at the trial and error hoping I'd get it sometime. I was about to just use a grok filter to grab the bits I wanted, which is an option if this doesn't work for you. I came back to this later and thought, "What if I removed everything after parsing?" for some crazy reason that I've forgotten. In the end I did this:

json {
    source => "MFD_JSON"
    target => "PARSED_JSON"
    add_field => { "FAMILY_ID" => "%{[PARSED_JSON][platform][family_id][1]}_%{[PARSED_JSON][platform][family_id][0]}" }
    remove_field => [ "PARSED_JSON" ]
}
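
With that in place, the only trace of the parsed JSON left on the event in stdout should be the flattened field (the value here is worked out from the sample data in the question, so treat it as a sketch):

"FAMILY_ID" => "Hatchetfish_14"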

So: extract the field or fields you're interested in, and then remove the field made by the parser at the end. That's what worked for me. I don't know why for certain, but it might work for other people too.
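
My best guess at the why (an assumption, not something I've confirmed): in the original filter the key of add_field was itself a sprintf reference, and since no FAMILY_ID field existed it never resolved, so the event got a field literally named %{FAMILY_ID} (you can see it in the stdout output in the question), which Elasticsearch may have choked on when indexing:

# Broken: the key is a sprintf reference; with no FAMILY_ID field on the
# event it never resolves, so the field name is literally "%{FAMILY_ID}".
add_field => { "%{FAMILY_ID}" => "%{[PARSED][platform][family_id][1]}_%{[PARSED][platform][family_id][0]}" }

# Working: a plain field name.
add_field => { "FAMILY_ID" => "%{[PARSED_JSON][platform][family_id][1]}_%{[PARSED_JSON][platform][family_id][0]}" }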