
I have a JSON file with records in this structure:

{"vhost": "8.8.8.8", "host": "8.8.8.8", "port": 80, "ip": "189.191.116.19", "host.latitude":"40.408599853515625","host.longitude":"-3.692199945449829"}

And I have a logstash config file like this:

input {
  stdin {
    # each line read from standard input becomes one event
    type => "json"
  }
}
filter {
  json {
    # parse the JSON string held in the message field into event fields
    source => "message"
  }
}

output {
  stdout {}
}
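
I feed the file into Logstash via stdin, for example (the file names are just placeholders):

bin/logstash -f logstash.conf < data.json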

I want to transform the JSON records so that I can use the geo_point type in ES and Kibana. According to the official documentation at https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-geo-point-type.html, the latitude and longitude fields have to be provided in a format like this:

"pin" : {
        "location" : {
            "lat" : 41.12,
            "lon" : -71.34
        }
    }

My question is: how can I transform the JSON input in Logstash so that I can use the geo_point type? I guess I cannot modify the JSON structure itself, but maybe I can add a new field with the location structure required by the geo_point type.

Thank you


1 Answer


The simplest way to achieve this is to concatenate the two fields host.latitude and host.longitude into another field that can be mapped as a geo_point using its string representation.
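
Elasticsearch also accepts a geo_point value as a single string in "lat,lon" order, so a field like this (reusing the coordinates from the pin example above) can be mapped directly as a geo_point:

"location" : "41.12,-71.34"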

You just need to add one more mutate filter after your json filter like this:

filter {
    json {
        source => "message"
    }
    mutate {
       # dots are not path separators in Logstash field references, so
       # %{host.latitude} reads the top-level field literally named "host.latitude"
       add_field => { "location" => "%{host.latitude},%{host.longitude}" }
    }
}
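
For the sample record above, this produces an event containing an additional field roughly like this (the value is simply host.latitude and host.longitude joined by a comma):

"location" => "40.408599853515625,-3.692199945449829"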

As a result, a location field will be created, which you can then map as a geo_point type in your index mapping.
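
For instance, a minimal mapping sketch (the index name my-index is just a placeholder, and this assumes an Elasticsearch version without mapping types):

PUT my-index
{
  "mappings": {
    "properties": {
      "location": { "type": "geo_point" }
    }
  }
}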