
I have a set of data inside dynamodb that I am importing to AWS Elasticsearch using this tutorial: https://medium.com/@vladyslavhoncharenko/how-to-index-new-and-existing-amazon-dynamodb-content-with-amazon-elasticsearch-service-30c1bbc91365

I need to change the mapping of a part of that data to geo_point.

I have tried creating the mapping before importing the data with:

PUT user
{
  "mappings": {
    "_doc": {
      "properties": {
        "grower_location": {
            "type": "geo_point"
        }
      }
    }
  }
}

When I do this, the data doesn't import, although I don't receive an error.

If I import the data first, I am able to search it, but the lat/lon fields of the grower_location: { lat: #, lon: # } object are mapped as integers, so I am unable to run geo_distance queries.
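For reference, this is the kind of geo_distance query I want to run once the field is mapped correctly (the distance and coordinates here are placeholders):

```
GET user/_search
{
  "query": {
    "bool": {
      "filter": {
        "geo_distance": {
          "distance": "100km",
          "grower_location": {
            "lat": 40.0,
            "lon": -70.0
          }
        }
      }
    }
  }
}
```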

Please help.

What is your DynamoDB table called? The index will be named after your table name. Can you show the mapping that gets generated? – Val

1 Answer


I was able to fix this by first importing the data once with the Python script from the tutorial.

Then running

GET user/_mappings

Copying the auto-generated mapping to the clipboard, then deleting the index:

DELETE user/

Then recreating the index, pasting in the copied mapping with the type of the location field changed to geo_point:

PUT user/
{
   "mappings": {
     "user_type": {
       "properties": {
         ...
         "grower_location": {
           "type": "geo_point"
         }
         ...
       }
     }
   }
}

Then re-importing the data using the Python script from the tutorial. Everything imports and is ready to be searched with geo_distance queries!
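The copy-edit-recreate step can also be scripted instead of done by hand. Below is a minimal sketch of the mapping edit: it assumes the index is named `user`, the type is `user_type`, and the field is `grower_location`, matching the example above, and that the auto-generated mapping came back from `GET user/_mappings`. The returned body still has to be sent with `PUT user/` after `DELETE user/`, as shown earlier.

```python
import copy
import json

def with_geo_point(get_mapping_response, index, type_name, field):
    """Build the body for PUT <index>, changing <field> to geo_point.

    get_mapping_response: the JSON body returned by GET <index>/_mappings.
    Returns a new dict; the input is not modified.
    """
    mappings = copy.deepcopy(get_mapping_response[index]["mappings"])
    # Replace whatever was auto-inferred for the field with a geo_point type.
    mappings[type_name]["properties"][field] = {"type": "geo_point"}
    return {"mappings": mappings}

# Example with a mapping shaped like the auto-generated one, where the
# lat/lon sub-fields were wrongly inferred as numeric types:
auto = {
    "user": {
        "mappings": {
            "user_type": {
                "properties": {
                    "grower_location": {
                        "properties": {
                            "lat": {"type": "long"},
                            "lon": {"type": "long"},
                        }
                    }
                }
            }
        }
    }
}

body = with_geo_point(auto, "user", "user_type", "grower_location")
print(json.dumps(body, indent=2))
```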