I have 1000 records fetched from a database and formatted as JSON. I need to insert each one as a separate document into the Elasticsearch index autocomplete_entities. The input file looks like this:
[
  {
    "id": 1,
    "title": "x"
  },
  {
    "id": 2,
    "title": "y"
  },
  ...
]
I have also applied these settings:
PUT _settings
{
"index.mapping.total_fields.limit" : 10000
}
and
PUT autocomplete_entities
Now, when I run this command:
elasticdump --bulk=true --input="PycharmProjects/untitled/v1.json" --output="http://localhost:9200/_doc" --output-index="autocomplete_entities" --type=data --transform="doc._source=Object.assign({},doc)"
it creates a single document with 1000 fields, where each field holds one record's id and title, as follows:
"took": 9,
"timed_out": false,
"_shards": {
"total": 5,
"successful": 5,
"skipped": 0,
"failed": 0
},
"hits": {
"total": 1,
"max_score": 1,
"hits": [
{
"_index": "autocomplete_entities",
"_type": "autocomplete_entities",
"_id": "voVp7XQBq56KEuWZ8JuQ",
"_score": 1,
"_source": {
"0": {
"id": 37905,
"title": "x"
},
"1": {
"id": 44093,
"title": "y"
} ...
}
}
]
}
How can I import these 1000 records into Elasticsearch as 1000 separate documents?