
I am learning Elasticsearch and have hit a block. I am trying to use Logstash to load a simple CSV into Elasticsearch. This is the data; each row is a postcode, longitude, latitude:

ZE1 0BH,-1.136758103355,60.150855671143
ZE1 0NW,-1.15526666950369,60.1532197533966

I am using the following Logstash conf file to parse the CSV and create a "location" field:

input {
  file {
      path => "postcodes.csv"
      start_position => "beginning"
      sincedb_path => "/dev/null"
  }
}

filter {
    csv {
        columns => ["postcode", "lat", "lon"]
        separator => ","
    }

    mutate { convert => {"lat" => "float"} }
    mutate { convert => {"lon" => "float"} }
    mutate { rename => {"lat" => "[location][lat]"} }
    mutate { rename => {"lon" => "[location][lon]"} }
    mutate { convert => { "[location]" => "float" } }
}

output {

    elasticsearch {
      action => "index"
      hosts => "localhost"
      index => "postcodes"
    }
    stdout { codec => rubydebug }
}

And I have added the mapping to ElasticSearch using the console in Kibana

PUT postcodes
  {
    "settings": {
      "number_of_shards": 1
    },
    "mappings": {
      "feature": {
        "_all":       { "enabled": true  }, 
        "properties": {
          "postcode": {"type": "text"},
          "location": {"type": "geo_point"}
        }
      }
    }
  }

I check the mappings for the index using

GET postcodes/_mapping

{
  "postcodes": {
    "mappings": {
      "feature": {
        "_all": {
          "enabled": true
        },
        "properties": {
          "location": {
            "type": "geo_point"
          },
          "postcode": {
            "type": "text"
          }
        }
      }
    }
  }
}

So this all seems correct, based on the documentation and the other questions posted.

However, when I run

bin/logstash -f postcodes.conf

I get an error:

[location] is defined as an object in mapping [logs] but this name is already used for a field in other types

I have tried a number of alternative approaches:

I deleted the index, created a template.json, and changed my conf file to include the extra settings:

manage_template => true
template => "postcode_template.json"
template_name =>"open_names"
template_overwrite => true

and this gets the same error.

I have managed to get the data loaded by not supplying a template; however, the location field never gets mapped as a geo_point, so you cannot use the Kibana Tile Map to visualise the data.
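
As a quick check, a request like the following in the Kibana console (just a sketch, using the index name above) returns a sample of what was indexed; the location values are there, but the dynamically created mapping treats the field as an ordinary object rather than a geo_point:

GET postcodes/_search
{
  "size": 1
}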

Can anyone explain why I am receiving that error and what method I should use?

Why do you have mutate { convert => { "[location]" => "float" } }? That seems to be your issue. – Alcanzar

@Alcanzar I have removed that and I still get the same "is defined as an object" error. Anything else I could try? Thanks – tjmgis

1 Answer


Your problem is that you don't have document_type => "feature" on your elasticsearch output. Without that, Logstash indexes the documents under the default type, logs, which is why you are getting this conflict.
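
For reference, a minimal sketch of the corrected output section (assuming a Logstash version where the elasticsearch output still accepts the document_type option):

output {
    elasticsearch {
      action        => "index"
      hosts         => "localhost"
      index         => "postcodes"
      document_type => "feature"   # match the type used in the PUT postcodes mapping
    }
    stdout { codec => rubydebug }
}

With document_type set to "feature", the events are indexed into the same type the mapping was created for, so the geo_point mapping on location applies instead of a dynamically mapped object under the default logs type.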