I am trying to stream logs from Logstash to Elasticsearch (5.5.0), using Filebeat to ship the logs to Logstash.
I have not defined any index explicitly; one (say "test1") is created automatically when data is pushed for the first time.
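The auto-created index can be confirmed with the cat indices API, e.g.:

```
curl -XGET 'localhost:9200/_cat/indices?v'
```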
Now I want to create another index ("test2") so that I can manage the field data types myself. To do that, I fetched the mappings for test1, updated the index name in the dump, and issued a PUT call for test2 with that data. However, it fails with the following result:
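I exported the test1 mapping with the GET mapping API, along these lines:

```
# dump the auto-generated mapping of test1 to a file for editing
curl -XGET 'localhost:9200/test1/_mapping?pretty' > /tmp/mappings_test.json
```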
```
ubuntu@elasticsearch:~$ curl -XPUT 'localhost:9200/test2?pretty' -H 'Content-Type: application/json' -d'@/tmp/mappings_test.json'
{
  "error" : {
    "root_cause" : [
      {
        "type" : "illegal_argument_exception",
        "reason" : "unknown setting [test2.mappings.log.properties.@timestamp.type] please check that any required plugins are installed, or check the breaking changes documentation for removed settings"
      }
    ],
    "type" : "illegal_argument_exception",
    "reason" : "unknown setting [test2.mappings.log.properties.@timestamp.type] please check that any required plugins are installed, or check the breaking changes documentation for removed settings"
  },
  "status" : 400
}
```
Following is an excerpt of the JSON I am using:

```
{
  "test2" : {
    "mappings" : {
      "log" : {
        "properties" : {
          "@timestamp" : {
            "type" : "date"
          },
          "@version" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "accept_date" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          ....
```
I modified only the index name; the rest of the content is identical to the mapping of the test1 index.
Any help on how to create this new index with updated field types would be appreciated.
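From the create index API docs, I suspect the body should start at the "mappings" key instead of being wrapped in the index name. A trimmed sketch of what I think the call should look like (only the @timestamp field shown; the remaining properties would follow the same pattern):

```
curl -XPUT 'localhost:9200/test2?pretty' -H 'Content-Type: application/json' -d'
{
  "mappings" : {
    "log" : {
      "properties" : {
        "@timestamp" : { "type" : "date" }
      }
    }
  }
}'
```

Is stripping the outer "test2" wrapper like this the right approach here?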