
When I try to run Logstash with my configuration file to read a log file, I get a mapper parsing error:

:response=>{"index"=>{"_index"=>"logstash-2016.06.07", "_type"=>"txt", "_id"=>nil, "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Failed to parse mapping [default]: Mapping definition for [data] has unsupported parameters: [ignore_above : 1024]", "caused_by"=>{"type"=>"mapper_parsing_exception", "reason"=>"Mapping definition for [data] has unsupported parameters: [ignore_above : 1024]"}}}}, :level=>:warn}

I found that there is no problem grokking my logs, but I do not know what is causing this error.

Here is my logstash.conf

input{
    stdin{}
    file{
        type => "txt"
        path => "C:\HA\accesslog\trial.log"
        start_position => "beginning"
    }
}
filter{
    grok{
        match => {"message" => ["%{IP:ClientAddr}%{SPACE}%{NOTSPACE:access_date}%{SPACE}%{TIME:access_time}%{SPACE}%{NOTSPACE:x-eap.wlsCustomLogField.VirtualHost}%{SPACE}%{WORD:cs-method}%{SPACE}%{PATH:cs-uri-stem}%{SPACE}%{PROG:x-eap.wlsCustomLogField.Protocol}%{SPACE}%{NUMBER:sc-status}%{SPACE}%{NUMBER:bytes}%{SPACE}%{NOTSPACE:x-eap.wlsCustomLogField.RequestedSessionId}%{SPACE}%{PROG:x-eap.wlsCustomLogField.Ecid}%{SPACE}%{NUMBER:x-eap.wlsCustomLogField.ThreadId}%{SPACE}%{NUMBER:x-eap.wlsCustomLogField.EndTs}%{SPACE}%{NUMBER:time-taken}"]}
    }
    if "_grokparsefailure" in [tags] {
        drop { }
    }
}
output{
    elasticsearch { 
        hosts => ["localhost:9200"] 
        index => "logstash-%{+YYYY.MM.dd}"
        template_overwrite => true
    }
    stdout { codec => rubydebug }   
}

Please help. Thanks.

Please format your error message to be more readable. You may also want to trim the hostnames and potentially sensitive data out of there. - Will Barnwell
The error is a 400, failed to index. Please show your output section of your logstash config. - Will Barnwell
I have just tried to make things clearer. Thanks for the help. - Kennedy Kan
Thanks. Did you recently upgrade Logstash or Elasticsearch? - Will Barnwell
No, I am still using the version I originally downloaded. - Kennedy Kan

1 Answer

  1. It turns out the solution is described here: github.com/elastic/elasticsearch/issues/16283
  2. Another problem is that the field names created for indexing are too long. Shortening the names solves the issue; see the sketch below.
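
For the second point, here is a minimal sketch of how the long dotted field names could be shortened with a mutate filter before the events reach Elasticsearch. The short names on the right are only examples I made up; pick whatever fits your own indexing scheme.

filter{
    mutate{
        # rename the long grok field names to shorter ones (the new names are hypothetical)
        rename => {
            "x-eap.wlsCustomLogField.VirtualHost"        => "wls_vhost"
            "x-eap.wlsCustomLogField.Protocol"           => "wls_protocol"
            "x-eap.wlsCustomLogField.RequestedSessionId" => "wls_session_id"
            "x-eap.wlsCustomLogField.Ecid"               => "wls_ecid"
            "x-eap.wlsCustomLogField.ThreadId"           => "wls_thread_id"
            "x-eap.wlsCustomLogField.EndTs"              => "wls_end_ts"
        }
    }
}

Alternatively, the shorter names can be written directly into the grok pattern (e.g. %{NUMBER:wls_thread_id} instead of %{NUMBER:x-eap.wlsCustomLogField.ThreadId}), which avoids the extra filter.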