I have set up a version 5 ELK stack (with X-Pack) and I use the following Logstash conf file, which seems to grok-parse the access log correctly (as I can see in the stdout output):
input {
  file {
    path => ["/path/to/access_log"]
    start_position => "beginning"
  }
}

filter {
  mutate { replace => { "type" => "apache_access" } }
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    user => "elastic"
    password => "*****"
  }
  stdout { codec => rubydebug }
}
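As a side note, a quick sanity check along these lines (with the real password in place of the asterisks, against the same host the output points at) is what I use to confirm the credentials and the node itself are reachable:

curl -u elastic:****** 'http://127.0.0.1:9200/_cluster/health?pretty'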
However, the elasticsearch output fails with a 404 ("no such index") error no matter what I do, unless I first create the index by hand with this curl command:

curl -XPUT 'elastic:******@localhost:9200/logstash-2016.09.28'

Otherwise Logstash logs the following warning for each event:

[2016-11-02T17:30:09,535][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>404, :action=>["index", {:_id=>nil, :_index=>"logstash-2016.09.28", :_type=>"apache_access", :_routing=>nil}, 2016-09-28T11:50:32.000Z mypchost 10.2.33.155 - - [28/Sep/2016:14:50:32 +0300] "GET /MyApp/page HTTP/1.1" 302 - "http://myhttpserver/MyApp/page?22" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:35.0) Gecko/20100101 Firefox/35.0" ], :response=>{"index"=>{"_index"=>"logstash-2016.09.28", "_type"=>"apache_access", "_id"=>nil, "status"=>404, "error"=>{"type"=>"index_not_found_exception", "reason"=>"no such index", "resource.type"=>"index_expression", "resource.id"=>"logstash-2016.09.28", "index_uuid"=>"na", "index"=>"logstash-2016.09.28"}}}}
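Since the error is an index_not_found_exception rather than a security/authorization error, my guess is that index auto-creation is being restricted somewhere. Assuming it was set through the settings API (it could just as well live in elasticsearch.yml, which this call would not show), a restrictive action.auto_create_index should be visible with:

curl -u elastic:****** 'http://localhost:9200/_cluster/settings?pretty'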
What could be preventing the index from being created automatically?
EDIT: Uninstalling X-Pack from Elasticsearch and re-running Logstash did the job, i.e. the index was created. Now I need to find out what the problem with X-Pack is.
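If the cause turns out to be a restrictive action.auto_create_index (the X-Pack docs mention whitelisting its internal indices when automatic index creation is disabled), my plan would be to also whitelist the Logstash indices in elasticsearch.yml, roughly like this (the exact pattern list is only my guess and a node restart would be needed):

action.auto_create_index: +logstash-*,+.security*,+.monitoring*,+.watches,+.triggered_watches,+.watcher-history*,-*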