
I've got a very basic ELK stack set up and am passing logs to it via syslog. I have used the built-in grok patterns to split the logs into fields, but the field mappings are auto-generated by the Logstash elasticsearch output plugin and I am unable to customize them.

For instance, I create a new field named "dst_geoip" using the geoip filter in my Logstash config file (see below):

filter {
  geoip {
    database => "/usr/local/share/GeoIP/GeoLiteCity.dat" # change me to the location of your GeoLiteCity.dat file
    source   => "dst_ip"
    target   => "dst_geoip"
    fields   => [ "ip", "country_code2", "country_name", "latitude", "longitude", "location" ]
    # both latitude and longitude live under the custom target "dst_geoip"
    add_field => {
      "coordinates" => "%{[dst_geoip][latitude]},%{[dst_geoip][longitude]}"
      "dst_country" => "%{[dst_geoip][country_code2]}"
      "flow_dir"    => "outbound"
    }
  }
}

I want to assign the coordinates field the type "geo_point", which I cannot edit from Kibana. Online documentation mentions manually updating the mapping on the respective index using the Elasticsearch API, but Logstash generates many indices (one per day). If I update the mapping on one index, will it stay the same in future indices?
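For reference, this is the kind of per-index call those documents describe (a sketch against the Elasticsearch 1.x put-mapping API; the index name logstash-2015.06.17 and the type syslog are placeholders for whatever Logstash creates in your setup):

# Set the mapping for "coordinates" on a single daily index. This only
# works before the field has received data of a conflicting type (an
# existing field's type cannot be changed in place), and it affects
# this one index only.
curl -XPUT 'http://localhost:9200/logstash-2015.06.17/_mapping/syslog' -d '
{
  "properties": {
    "coordinates": { "type": "geo_point" }
  }
}'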


1 Answer


What you're looking for is an index "template". Elasticsearch applies a template's settings and mappings to every newly created index whose name matches the template's pattern, so each daily logstash-* index picks up the geo_point mapping automatically. Templates do not touch already-existing indices, which is why updating one index by hand never carries over to the next day's index.
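A minimal sketch against the Elasticsearch 1.x API (the template name, the logstash-* pattern, and the field names are assumptions based on the config in your question):

# Register a template that applies to all future logstash-* indices.
# "_default_" makes the mapping apply regardless of document type.
curl -XPUT 'http://localhost:9200/_template/logstash_geoip' -d '
{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "coordinates": { "type": "geo_point" },
        "dst_geoip": {
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  }
}'

Note that Logstash's elasticsearch output ships its own default template, which maps geoip.location as geo_point; since you use a custom target ("dst_geoip"), that default does not cover your fields. You can either register a template manually as above, or point the output plugin at your own template file via its template, template_name, and template_overwrite options. Either way, the mapping takes effect starting with the next index that gets created.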