I am trying to set up the ELK stack on my Ubuntu sandbox and am stuck on an issue: Logstash is not sending data to Elasticsearch. I have referred to the Elasticsearch documentation.

Kibana and Elasticsearch connectivity looks fine; I think what Kibana is reporting is that it can't find the data. I've spent a couple of hours trying to figure this out, but no luck...

I'd appreciate any help fixing this issue. Thank you very much!

Here are my setup details:

Logstash Setup:

sirishg@sirishg-vm:/u02/app/logstash-2.1.1/bin$ ./logstash -f /u02/app/logstash-2.1.1/first-pipeline.conf 
Settings: Default filter workers: 1
Logstash startup completed

first-pipeline.conf:

    # The # character at the beginning of a line indicates a comment. Use comments to describe your configuration.
    input {
        file {
            path => "/u02/app/logstash-tutorial-dataset.log"
            start_position => "beginning"
        }
    }
    filter {
        grok {
            match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
        geoip {
            source => "clientip"
        }
    }
    output {
        elasticsearch {
            hosts => ["localhost:9200"]
        }
        stdout {
            codec => rubydebug
        }
    }
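
One quick way to rule out the file input (and its sincedb state) is to pipe the sample file in over stdin and watch the rubydebug output. A minimal sketch, assuming a throwaway config at a hypothetical path /tmp/stdin-pipeline.conf:

    # Throwaway pipeline that reads from stdin instead of the file input;
    # only the input/output blocks differ from first-pipeline.conf.
    cat > /tmp/stdin-pipeline.conf <<'EOF'
    input { stdin { } }
    filter {
        grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
        geoip { source => "clientip" }
    }
    output { stdout { codec => rubydebug } }
    EOF
    # Feed a few sample lines through the pipeline
    head -5 /u02/app/logstash-tutorial-dataset.log | ./logstash -f /tmp/stdin-pipeline.conf

If parsed events print here but nothing reaches Elasticsearch with the file input, the problem is on the input side (typically the sincedb position), not in the filters or the output.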

Elasticsearch Setup:

Health Check Report:

{"cluster_name":"my-application","status":"yellow","timed_out":false,"number_of_nodes":1,"number_of_data_nodes":1,"active_primary_shards":1,"active_shards":1,"relocating_shards":0,"initializing_shards":0,"unassigned_shards":1,"delayed_unassigned_shards":0,"number_of_pending_tasks":0,"number_of_in_flight_fetch":0,"task_max_waiting_in_queue_millis":0,"active_shards_percent_as_number":50.0}

Startup Logs:

sirishg@sirishg-vm:/u02/app/elasticsearch-2.1.1/bin$ ./elasticsearch
[2016-01-16 18:17:36,591][INFO ][node                     ] [node-1] version[2.1.1], pid[3596], build[40e2c53/2015-12-15T13:05:55Z]
[2016-01-16 18:17:36,594][INFO ][node                     ] [node-1] initializing ...
[2016-01-16 18:17:36,798][INFO ][plugins                  ] [node-1] loaded [], sites []
[2016-01-16 18:17:36,907][INFO ][env                      ] [node-1] using [1] data paths, mounts [[/ (/dev/sda1)]], net usable_space [12.6gb], net total_space [45.1gb], spins? [possibly], types [ext4]
[2016-01-16 18:17:43,349][INFO ][node                     ] [node-1] initialized
[2016-01-16 18:17:43,350][INFO ][node                     ] [node-1] starting ...
[2016-01-16 18:17:43,693][INFO ][transport                ] [node-1] publish_address {localhost/127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}
[2016-01-16 18:17:43,713][INFO ][discovery                ] [node-1] my-application/8bfTdwZcSzaNC9_P2VYYvw
[2016-01-16 18:17:46,878][INFO ][cluster.service          ] [node-1] new_master {node-1}{8bfTdwZcSzaNC9_P2VYYvw}{127.0.0.1}{localhost/127.0.0.1:9300}, reason: zen-disco-join(elected_as_master, [0] joins received)
[2016-01-16 18:17:46,980][INFO ][http                     ] [node-1] publish_address {localhost/127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}
[2016-01-16 18:17:46,991][INFO ][node                     ] [node-1] started
[2016-01-16 18:17:47,318][INFO ][gateway                  ] [node-1] recovered [1] indices into cluster_state
[2016-01-16 18:20:03,866][INFO ][rest.suppressed          ] /logstash-*/_mapping/field/* Params: {ignore_unavailable=false, allow_no_indices=false, index=logstash-*, include_defaults=true, fields=*, _=1452986403826}
[logstash-*] IndexNotFoundException[no such index]
    at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver$WildcardExpressionResolver.resolve(IndexNameExpressionResolver.java:636)
    at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver.concreteIndices(IndexNameExpressionResolver.java:133)
    at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver.concreteIndices(IndexNameExpressionResolver.java:77)
    at org.elasticsearch.action.admin.indices.mapping.get.TransportGetFieldMappingsAction.doExecute(TransportGetFieldMappingsAction.java:57)
    at org.elasticsearch.action.admin.indices.mapping.get.TransportGetFieldMappingsAction.doExecute(TransportGetFieldMappingsAction.java:40)
    at org.elasticsearch.action.support.TransportAction.execute(TransportAction.java:70)
    at org.elasticsearch.client.node.NodeClient.doExecute(NodeClient.java:58)
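
The IndexNotFoundException above is the key clue: Kibana asked for the field mappings of logstash-*, and no such index exists, which means Logstash has not indexed a single event yet. A quick check of which indices do exist:

    # Lists every index in the cluster; a successful Logstash run should add a logstash-YYYY.MM.DD entry
    curl 'http://localhost:9200/_cat/indices?v'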

Kibana Status:

sirishg@sirishg-vm:/u02/app/kibana-4.3.1-linux-x86/bin$ ./kibana 
  log   [18:18:36.697] [info][status][plugin:kibana] Status changed from uninitialized to green - Ready
  log   [18:18:36.786] [info][status][plugin:elasticsearch] Status changed from uninitialized to yellow - Waiting for Elasticsearch
  log   [18:18:36.852] [info][status][plugin:kbn_vislib_vis_types] Status changed from uninitialized to green - Ready
  log   [18:18:36.875] [info][status][plugin:markdown_vis] Status changed from uninitialized to green - Ready
  log   [18:18:36.883] [info][status][plugin:metric_vis] Status changed from uninitialized to green - Ready
  log   [18:18:36.907] [info][status][plugin:spyModes] Status changed from uninitialized to green - Ready
  log   [18:18:36.936] [info][status][plugin:statusPage] Status changed from uninitialized to green - Ready
  log   [18:18:36.950] [info][status][plugin:table_vis] Status changed from uninitialized to green - Ready
  log   [18:18:37.078] [info][listening] Server running at http://0.0.0.0:5601
  log   [18:18:37.446] [info][status][plugin:elasticsearch] Status changed from yellow to green - Kibana index ready

Kibana UI Errors:

Error: Please specify a default index pattern
KbnError@http://localhost:5601/bundles/commons.bundle.js:58172:21
NoDefaultIndexPattern@http://localhost:5601/bundles/commons.bundle.js:58325:6
loadDefaultIndexPattern/<@http://localhost:5601/bundles/kibana.bundle.js:97911:1
processQueue@http://localhost:5601/bundles/commons.bundle.js:42358:29
scheduleProcessQueue/<@http://localhost:5601/bundles/commons.bundle.js:42374:28
$RootScopeProvider/this.$get</Scope.prototype.$eval@http://localhost:5601/bundles/commons.bundle.js:43602:17
$RootScopeProvider/this.$get</Scope.prototype.$digest@http://localhost:5601/bundles/commons.bundle.js:43413:16
$RootScopeProvider/this.$get</Scope.prototype.$apply@http://localhost:5601/bundles/commons.bundle.js:43710:14
$LocationProvider/this.$get</<@http://localhost:5601/bundles/commons.bundle.js:39839:14
jQuery.event.dispatch@http://localhost:5601/bundles/commons.bundle.js:22720:16
jQuery.event.add/elemData.handle@http://localhost:5601/bundles/commons.bundle.js:22407:7
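
This Kibana error follows directly from the missing index: Kibana cannot set a default index pattern until at least one index matching logstash-* exists. The request it makes can be reproduced by hand, and with allow_no_indices=false it returns the same IndexNotFoundException seen in the Elasticsearch log:

    # Mirrors Kibana's field-mapping request from the log above
    curl 'http://localhost:9200/logstash-*/_mapping/field/*?allow_no_indices=false&pretty'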

Logstash debug logs:

  {:timestamp=>"2016-01-17T11:07:06.287000-0500", :message=>"Reading config file", :config_file=>"/u02/app/logstash-2.1.1/first-pipeline.conf", :level=>:debug, :file=>"logstash/agent.rb", :line=>"325", :method=>"local_config"}
{:timestamp=>"2016-01-17T11:07:06.420000-0500", :message=>"Compiled pipeline code:\n        @inputs = []\n        @filters = []\n        @outputs = []\n        @periodic_flushers = []\n        @shutdown_flushers = []\n\n          @input_file_1 = plugin(\"input\", \"file\", LogStash::Util.hash_merge_many({ \"path\" => (\"/u02/app/logstash-tutorial-dataset.log\") }, { \"start_position\" => (\"beginning\") }))\n\n          @inputs << @input_file_1\n\n          @filter_grok_2 = plugin(\"filter\", \"grok\", LogStash::Util.hash_merge_many({ \"match\" => {(\"message\") => (\"%{COMBINEDAPACHELOG}\")} }))\n\n          @filters << @filter_grok_2\n\n            @filter_grok_2_flush = lambda do |options, &block|\n              @logger.debug? && @logger.debug(\"Flushing\", :plugin => @filter_grok_2)\n\n              events = @filter_grok_2.flush(options)\n\n              return if events.nil? || events.empty?\n\n              @logger.debug? && @logger.debug(\"Flushing\", :plugin => @filter_grok_2, :events => events)\n\n                          events = @filter_geoip_3.multi_filter(events)\n  \n\n\n              events.each{|e| block.call(e)}\n            end\n\n            if @filter_grok_2.respond_to?(:flush)\n              @periodic_flushers << @filter_grok_2_flush if @filter_grok_2.periodic_flush\n              @shutdown_flushers << @filter_grok_2_flush\n            end\n\n          @filter_geoip_3 = plugin(\"filter\", \"geoip\", LogStash::Util.hash_merge_many({ \"source\" => (\"clientip\") }))\n\n          @filters << @filter_geoip_3\n\n            @filter_geoip_3_flush = lambda do |options, &block|\n              @logger.debug? && @logger.debug(\"Flushing\", :plugin => @filter_geoip_3)\n\n              events = @filter_geoip_3.flush(options)\n\n              return if events.nil? || events.empty?\n\n              @logger.debug? && @logger.debug(\"Flushing\", :plugin => @filter_geoip_3, :events => events)\n\n                \n\n              events.each{|e| block.call(e)}\n            end\n\n            if @filter_geoip_3.respond_to?(:flush)\n              @periodic_flushers << @filter_geoip_3_flush if @filter_geoip_3.periodic_flush\n              @shutdown_flushers << @filter_geoip_3_flush\n            end\n\n          @output_elasticsearch_4 = plugin(\"output\", \"elasticsearch\", LogStash::Util.hash_merge_many({ \"hosts\" => [(\"localhost:9200\")] }))\n\n          @outputs << @output_elasticsearch_4\n\n          @output_stdout_5 = plugin(\"output\", \"stdout\", LogStash::Util.hash_merge_many({ \"codec\" => (\"rubydebug\") }))\n\n          @outputs << @output_stdout_5\n\n  def filter_func(event)\n    events = [event]\n    @logger.debug? && @logger.debug(\"filter received\", :event => event.to_hash)\n              events = @filter_grok_2.multi_filter(events)\n              events = @filter_geoip_3.multi_filter(events)\n    \n    events\n  end\n  def output_func(event)\n    @logger.debug? && @logger.debug(\"output received\", :event => event.to_hash)\n    @output_elasticsearch_4.handle(event)\n    @output_stdout_5.handle(event)\n    \n  end", :level=>:debug, :file=>"logstash/pipeline.rb", :line=>"38", :method=>"initialize"}
{:timestamp=>"2016-01-17T11:07:06.426000-0500", :message=>"Plugin not defined in namespace, checking for plugin file", :type=>"input", :name=>"file", :path=>"logstash/inputs/file", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"}
{:timestamp=>"2016-01-17T11:07:06.451000-0500", :message=>"Plugin not defined in namespace, checking for plugin file", :type=>"codec", :name=>"plain", :path=>"logstash/codecs/plain", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"}
{:timestamp=>"2016-01-17T11:07:06.465000-0500", :message=>"config LogStash::Codecs::Plain/@charset = \"UTF-8\"", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.468000-0500", :message=>"config LogStash::Inputs::File/@path = [\"/u02/app/logstash-tutorial-dataset.log\"]", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.469000-0500", :message=>"config LogStash::Inputs::File/@start_position = \"beginning\"", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.472000-0500", :message=>"config LogStash::Inputs::File/@codec = <LogStash::Codecs::Plain charset=>\"UTF-8\">", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.480000-0500", :message=>"config LogStash::Inputs::File/@add_field = {}", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.481000-0500", :message=>"config LogStash::Inputs::File/@stat_interval = 1", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.492000-0500", :message=>"config LogStash::Inputs::File/@discover_interval = 15", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.493000-0500", :message=>"config LogStash::Inputs::File/@sincedb_write_interval = 15", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.496000-0500", :message=>"config LogStash::Inputs::File/@delimiter = \"\\n\"", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.498000-0500", :message=>"Plugin not defined in namespace, checking for plugin file", :type=>"filter", :name=>"grok", :path=>"logstash/filters/grok", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"}
{:timestamp=>"2016-01-17T11:07:06.515000-0500", :message=>"config LogStash::Filters::Grok/@match = {\"message\"=>\"%{COMBINEDAPACHELOG}\"}", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.524000-0500", :message=>"config LogStash::Filters::Grok/@add_tag = []", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.532000-0500", :message=>"config LogStash::Filters::Grok/@remove_tag = []", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.535000-0500", :message=>"config LogStash::Filters::Grok/@add_field = {}", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}
{:timestamp=>"2016-01-17T11:07:06.536000-0500", :message=>"config LogStash::Filters::Grok/@remove_field = []", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"122", :method=>"config_init"}

Elasticsearch Recent logs:

sirishg@sirishg-vm:/u02/app/elasticsearch-2.1.1/bin$ ./elasticsearch
[2016-01-17 11:00:23,467][INFO ][node                     ] [node-1] version[2.1.1], pid[3418], build[40e2c53/2015-12-15T13:05:55Z]
[2016-01-17 11:00:23,470][INFO ][node                     ] [node-1] initializing ...
[2016-01-17 11:00:23,698][INFO ][plugins                  ] [node-1] loaded [], sites []
[2016-01-17 11:00:23,853][INFO ][env                      ] [node-1] using [1] data paths, mounts [[/ (/dev/sda1)]], net usable_space [12.6gb], net total_space [45.1gb], spins? [possibly], types [ext4]
[2016-01-17 11:00:27,412][INFO ][node                     ] [node-1] initialized
[2016-01-17 11:00:27,412][INFO ][node                     ] [node-1] starting ...
[2016-01-17 11:00:27,605][INFO ][transport                ] [node-1] publish_address {localhost/127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}
[2016-01-17 11:00:27,616][INFO ][discovery                ] [node-1] my-application/rd4S1ZOdQXOj3_g-N22NnQ
[2016-01-17 11:00:31,121][INFO ][cluster.service          ] [node-1] new_master {node-1}{rd4S1ZOdQXOj3_g-N22NnQ}{127.0.0.1}{localhost/127.0.0.1:9300}, reason: zen-disco-join(elected_as_master, [0] joins received)
[2016-01-17 11:00:31,259][INFO ][http                     ] [node-1] publish_address {localhost/127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}
[2016-01-17 11:00:31,260][INFO ][node                     ] [node-1] started
[2016-01-17 11:00:31,830][INFO ][gateway                  ] [node-1] recovered [2] indices into cluster_state

Comments:

Sorry, I forgot to provide version information. Here it is: Kibana 4.3.1, Logstash 2.1.1, and Elasticsearch 2.1.1. – GSR
Is there any output in the Logstash console? If you have ever processed /u02/app/logstash-tutorial-dataset.log before, you may need to delete Logstash's .sincedb files (ls ~/.sincedb*). – longhua
Can you run Logstash with the --debug flag and show some relevant log? – Val
@longhua Yes, I found one .sincedb file (/home/sirishg/.sincedb_9dbbe1dd488b6b538eecc653ee954022) and I am going to remove it as you suggested. Let me retry; I will post updates. – GSR
@Val I have attached the Logstash debug log in my original post. Kindly suggest. – GSR
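
For reference, the fix suggested in the comments amounts to clearing Logstash's recorded read position so the file is treated as new. A minimal sketch:

    # Stop Logstash first, then remove the sincedb state and re-run the pipeline
    rm ~/.sincedb_*
    ./logstash -f /u02/app/logstash-2.1.1/first-pipeline.conf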

1 Answer


Have you been able to get this to work? Some comments:

1) Kibana listening on "0.0.0.0" can sometimes be a sign that something is wrong; check Kibana's configuration and its connectivity with Elasticsearch (see the first sketch after this list).

2) Which index are you putting the information into? logstash-*? (See the second sketch after this list.)

3) If everything else fails, upgrade to the current releases: Elasticsearch 2.3.* and Kibana 4.4.*.

4) In order for Logstash to actually pick up the file and read it (and therefore send the data to Elasticsearch), you may need to touch the file again (change its modification timestamp). That part does not always come easy, because the file input keeps a pointer (the sincedb) to the last position it read in each file (see the third sketch after this list).
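
For point 1, a minimal connectivity check from the machine running Kibana (assuming the default port):

    # Should return the cluster name and version banner if Elasticsearch is reachable
    curl 'http://localhost:9200/'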
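
For point 2, one way to see whether any logstash-* index has received documents:

    # Returns the hit count and one sample document if anything was indexed
    curl 'http://localhost:9200/logstash-*/_search?size=1&pretty'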
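
For point 4, the pointer in question is the sincedb. While testing, it can be disabled outright so the file is re-read on every run; a sketch of the input block only:

    input {
        file {
            path => "/u02/app/logstash-tutorial-dataset.log"
            start_position => "beginning"
            # do not persist the read position between runs (testing only)
            sincedb_path => "/dev/null"
        }
    }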

You probably got it working by now, so maybe I am blowing in the wind, but on the other hand maybe this can help someone.