I've set up Logstash 1.5.0 with Elasticsearch 1.5.1. Kibana is up and serving the Logstash interface via nginx.
However, it seems that Logstash is not creating an Elasticsearch index.
This is what I get when I curl the indexes from Elasticsearch:
[root@aoadbld00032lb ~]# curl -s http://127.0.0.1:9200/_status?pretty=true
{
  "_shards" : {
    "total" : 0,
    "successful" : 0,
    "failed" : 0
  },
  "indices" : { }
}
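For a quicker view than the full _status output, the _cat APIs (available since Elasticsearch 1.0) list indices and cluster health one line at a time. This is only a diagnostic sketch; the host and port are assumed to be the same defaults as in the curl above:

```shell
# Elasticsearch endpoint -- adjust if it is bound elsewhere.
ES="http://127.0.0.1:9200"

# One line per index; no output at all means no index has been created yet.
curl -s "$ES/_cat/indices?v"

# Cluster health: "status" should be green or yellow, and the cluster name
# reported here must match the one Logstash is configured to join.
curl -s "$ES/_cluster/health?pretty"
```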
And this is what I'm seeing in the logstash logs:
{:timestamp=>"2015-05-17T16:45:08.435000-0400", :message=>"Using version 0.1.x input plugin 'tcp'. This plugin isn't well supported by the community and likely has no maintainer.", :level=>:info}
{:timestamp=>"2015-05-17T16:45:08.449000-0400", :message=>"Using version 0.1.x codec plugin 'line'. This plugin isn't well supported by the community and likely has no maintainer.", :level=>:info}
{:timestamp=>"2015-05-17T16:45:08.458000-0400", :message=>"Using version 0.1.x input plugin 'udp'. This plugin isn't well supported by the community and likely has no maintainer.", :level=>:info}
{:timestamp=>"2015-05-17T16:45:08.462000-0400", :message=>"Using version 0.1.x codec plugin 'plain'. This plugin isn't well supported by the community and likely has no maintainer.", :level=>:info}
{:timestamp=>"2015-05-17T16:45:08.474000-0400", :message=>"Using version 0.1.x filter plugin 'grok'. This plugin isn't well supported by the community and likely has no maintainer.", :level=>:info}
{:timestamp=>"2015-05-17T16:45:08.482000-0400", :message=>"Using version 0.1.x filter plugin 'syslog_pri'. This plugin isn't well supported by the community and likely has no maintainer.", :level=>:info}
{:timestamp=>"2015-05-17T16:45:08.500000-0400", :message=>"Using version 0.1.x filter plugin 'date'. This plugin isn't well supported by the community and likely has no maintainer.", :level=>:info}
{:timestamp=>"2015-05-17T16:45:08.510000-0400", :message=>"Using version 0.1.x filter plugin 'mutate'. This plugin isn't well supported by the community and likely has no maintainer.", :level=>:info}
{:timestamp=>"2015-05-17T16:45:08.808000-0400", :message=>"Using version 0.1.x output plugin 'elasticsearch'. This plugin isn't well supported by the community and likely has no maintainer.", :level=>:info}
{:timestamp=>"2015-05-17T16:45:09.781000-0400", :message=>"Starting tcp input listener", :address=>"0.0.0.0:5000", :level=>:info}
{:timestamp=>"2015-05-17T16:45:09.807000-0400", :message=>"Starting UDP listener", :address=>"0.0.0.0:5000", :level=>:info}
I'm thinking the plugin warnings above might be important!
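Since the logs show both listeners starting on 0.0.0.0:5000, one way to rule out the inputs themselves is to push a hand-written RFC 5424-style message at them and see whether anything reaches Elasticsearch. This is only a sketch: the message fields (myhost, myapp, and so on) are made up, and it assumes bash for the /dev/udp and /dev/tcp redirections:

```shell
# A minimal RFC 5424-style syslog line in the shape the grok pattern in the
# filter expects: <pri>version timestamp host app procid msgid sd msg
msg='<134>1 2015-05-17T16:45:08Z myhost myapp 1234 ID47 - hello logstash'

# Send it to the UDP listener via bash's /dev/udp pseudo-device...
echo "$msg" > /dev/udp/127.0.0.1/5000

# ...and to the TCP listener via /dev/tcp.
echo "$msg" > /dev/tcp/127.0.0.1/5000
```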
This is my logstash.conf file:
[root@aoadbld00032lb ~]# cat /etc/logstash/logstash.conf
input {
  tcp {
    port => 5000
    type => syslog
  }
  udp {
    port => 5000
    type => syslog
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOG5424PRI}%{NONNEGINT:syslog5424_ver} +(?:%{TIMESTAMP_ISO8601:syslog5424_ts}|-) +(?:%{HOSTNAME:syslog5424_host}|-) +(?:%{NOTSPACE:syslog5424_app}|-) +(?:%{NOTSPACE:syslog5424_proc}|-) +(?:%{WORD:syslog5424_msgid}|-) +(?:%{SYSLOG5424SD:syslog5424_sd}|-|) +%{GREEDYDATA:syslog5424_msg}" }
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
    if !("_grokparsefailure" in [tags]) {
      mutate {
        replace => [ "@source_host", "%{syslog_hostname}" ]
        replace => [ "@message", "%{syslog_message}" ]
      }
    }
    mutate {
      remove_field => [ "syslog_hostname", "syslog_message", "syslog_timestamp" ]
    }
  }
}
output {
  elasticsearch {
    host => "127.0.0.1"
    embedded => false
    cluster => "optl_elasticsearch"
  }
}
Can I please get some advice on how to get this Logstash setup indexing into Elasticsearch?
The host parameter to the elasticsearch output plugin must be an array. Can you try host => ["127.0.0.1"]? Also, why don't you use the syslog input plugin? – Val
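Putting Val's two suggestions together, the relevant parts of logstash.conf would look something like this. This is only a sketch, keeping the cluster name from the original config; note that the syslog input parses standard syslog by itself, so much of the grok filter may become unnecessary:

```
input {
  syslog {
    port => 5000
    type => syslog
  }
}
output {
  elasticsearch {
    host => ["127.0.0.1"]
    embedded => false
    cluster => "optl_elasticsearch"
  }
}
```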