10
votes

I'm currently using node-bunyan to manage log information through Elasticsearch and Logstash, and I'm facing a problem.

My log files contain the expected information and are filled correctly when I need them.

The problem is that Elasticsearch doesn't find anything at

http://localhost:9200/logstash-*/

I get an empty object back, so I can't deliver my logs to Kibana.

Here's my Logstash conf file:

input {
    file {
        type => "nextgen-app"
        path => [ "F:\NextGen-dev\RestApi\app\logs\*.log" ]
        codec => "json"
    }   
}

output {
    elasticsearch {
        host => "localhost"
        protocol => "http"
    }
}

And my JS code:

var bunyan = require('bunyan');

var log = bunyan.createLogger({
  name: 'myapp',
  streams: [
    {
      level: 'info',
      path: './app/logs/nextgen-info-log.log'
    },
    {
      level: 'error',
      path: './app/logs/nextgen-error-log.log'
    }
  ]
});

router.all('*', (req, res, next) => {
  log.info(req.url);
  log.info(req.method);
  next();
});

NB: the logs are written to the log files correctly. The problem is between Logstash and Elasticsearch :-/

EDIT: querying http://localhost:9200/logstash-*/ gives me "{}", an empty JSON object. Thanks in advance.

2
What is the output of http://localhost:9200/_cat/indices?v ? – VF_
It returns only: "health status index pri rep docs.count docs.deleted store.size pri.store.size" – mfrachet
Alright... which version of ES are you using? Also, please restart the Logstash agent and check its logs for any errors or warnings. – VF_
Version 1.5.1. I restarted Logstash (v1.4.2), and there are no problems in the log files. – mfrachet
Try adding a message manually using logstash -e 'input { stdin { } } output { elasticsearch { host => localhost } }' and typing something. This should add an index to ES, which you can verify with the command above or by querying http://localhost:9200/_search?pretty – VF_

2 Answers

2
votes

Here is how we managed to fix this and other problems with Logstash not processing files correctly on Windows:

  1. Install the ruby-filewatch patch as explained here: logstash + elasticsearch : reloads the same data

  2. Properly configure the Logstash input plugin:

    input {
      file {
          path => ["C:/Path/To/Logs/Directory/*.log"]
          codec => json { }
          sincedb_path => "C:/Path/To/Config/Dir/sincedb"
          start_position => "beginning"
      }
    }
    
    ...
    

"sincedb" keeps track of how much of each log file has already been read, so it should have one line per log file; if not, then something else is wrong.
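To make the check above concrete: in Logstash 1.4 the sincedb file is plain text, and each line records a file identifier (inode), two device numbers, and the byte offset already shipped (this line layout is an assumption based on the 1.4-era file input; verify against your own sincedb). A sketch of reading one such line:

```javascript
// Parse a Logstash 1.4-style sincedb line: "<inode> <major> <minor> <position>".
// Line format assumed for illustration. A fully processed log file should show
// a position equal to the file's current size in bytes.
function parseSincedbLine(line) {
  var parts = line.trim().split(/\s+/);
  return {
    inode: parts[0],
    majorDevice: parts[1],
    minorDevice: parts[2],
    position: Number(parts[3]) // byte offset already read by Logstash
  };
}

var entry = parseSincedbLine('1234 0 51714 8742');
console.log(entry.position); // bytes of that log file already processed
```

If the position stays at 0 while the log file grows, Logstash is watching the file but never reading it, which matches the Windows symptom this answer fixes.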

Hope this helps.

2
votes

Your output section looks incomplete. Here's the list of output parameters: http://logstash.net/docs/1.4.2/outputs/elasticsearch

Please try:

input {
    file {
        type => "nextgen-app"
        path => [ "F:\NextGen-dev\RestApi\app\logs\*.log" ]
        codec => "json"
    }   
}

output {
    elasticsearch {
        host => "localhost"
        port => 9200
        protocol => "http"
        index => "logstash-%{+YYYY.MM.dd}"
    }
}

Alternatively, you can try the transport protocol:

output {
    elasticsearch {
        host => "localhost"
        port => 9300
        protocol => "transport"
        index => "logstash-%{+YYYY.MM.dd}"
    }
}
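A note on the `index => "logstash-%{+YYYY.MM.dd}"` setting in both configs: the `%{+...}` sprintf pattern expands to the event's date, so Logstash writes one index per day. A small sketch reproducing the name it generates (the helper function is illustrative, not part of Logstash):

```javascript
// Reproduce the daily index name produced by the %{+YYYY.MM.dd} pattern,
// e.g. "logstash-2015.06.01". Helper name is hypothetical.
function logstashIndexFor(date) {
  var pad = function (n) { return (n < 10 ? '0' : '') + n; };
  return 'logstash-' + date.getUTCFullYear() + '.' +
         pad(date.getUTCMonth() + 1) + '.' + pad(date.getUTCDate());
}

console.log(logstashIndexFor(new Date(Date.UTC(2015, 5, 1)))); // logstash-2015.06.01
```

This is also why the question queries http://localhost:9200/logstash-*/ with a wildcard: it matches all of these daily indices at once.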

I also recommend using Kibana as a data viewer. You can download it at https://www.elastic.co/downloads/kibana