
I am trying to use the ELK stack to collect file logs. Everything is OK until the Filebeat integration. I can send logs over TCP to Logstash and see them in Kibana, but I couldn't get the Filebeat setup to send logs. It seems to be sending the data, but nothing shows up in Elasticsearch.

I am using this command to start Elasticsearch:

docker run -d -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" --name elasticsearch docker.elastic.co/elasticsearch/elasticsearch:7.5.2

Kibana:

docker run -d -p 5601:5601 -h kibana --name kibana --link elasticsearch:elasticsearch docker.elastic.co/kibana/kibana:7.5.2

Logstash:

docker run -d -p 5044:5044 -p 5000:5000 -h logstash --name logstash --link elasticsearch:elasticsearch -v c:/elk2/config-dir:/config-dir docker.elastic.co/logstash/logstash:7.5.2 -f /config-dir/logstash.conf

The logstash.conf file:

input {
    beats {
        type => "test"
        port => "5044"
    }
}

filter {
  #If log line contains tab character followed by 'at' then we will tag that entry as stacktrace
  if [message] =~ "\tat" {
    grok {
      match => ["message", "^(\tat)"]
      add_tag => ["stacktrace"]
    }
  }

}

output {

  stdout {
    codec => rubydebug
  }

  # Sending properly parsed log events to elasticsearch
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
}
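As an aside, the stack-trace condition in the filter can be sanity-checked outside Logstash. A minimal Python sketch of the same "tab followed by at" test (the sample log lines here are made up for illustration):

```python
import re

# Hypothetical sample lines: a normal log line and a Java stack-trace frame.
lines = [
    "2020-01-26 22:28:45 ERROR something failed",
    "\tat com.example.Main.run(Main.java:42)",
]

# Same condition as the Logstash filter: a tab character followed by "at".
tags = ["stacktrace" if re.search(r"\tat", line) else None for line in lines]
print(tags)  # [None, 'stacktrace']
```

Only lines that look like stack-trace frames would get the tag, matching what the filter above is meant to do.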

I'm running Filebeat on a Windows 10 machine. I downloaded the zip, and this is my filebeat.yml config:

filebeat.modules:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:/elk2/filebeat/log/*.log
  multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
  multiline.negate: true
  multiline.match: after

output:
  logstash:
    hosts: ["localhost:5044"]
#Also tried 127.0.0.1/logstash/ip... as hosts here
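For what it's worth, the effect of the multiline settings above can be simulated to check the grouping. A rough Python sketch (sample lines invented) of how `negate: true` with `match: after` combines lines into events:

```python
import re

# multiline.pattern marks the start of a new event; with negate: true and
# match: after, lines NOT matching the pattern are appended to the
# preceding event.
pattern = re.compile(r"^[0-9]{4}-[0-9]{2}-[0-9]{2}")

lines = [
    "2020-01-26 22:28:45 ERROR boom",
    "\tat com.example.Main.run(Main.java:42)",
    "2020-01-26 22:28:46 INFO recovered",
]

events = []
for line in lines:
    if pattern.match(line) or not events:
        events.append(line)           # line starts a new event
    else:
        events[-1] += "\n" + line     # continuation (e.g. stack-trace) line
print(len(events))  # 2
```

The stack-trace frame ends up attached to the first event, so two events (not three lines) would be shipped.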

Running PowerShell in admin mode, I first run

./install-service-filebeat.ps1

then

./filebeat.exe -c ./filebeat.yml

2020-01-26T22:28:45.652+0300    INFO    log/harvester.go:251    Harvester started for file: C:\elk2\filebeat\log\logstash-mehmet.log
2020-01-26T22:29:15.651+0300    INFO    [monitoring]    log/log.go:145  Non-zero metrics in the last 30s        {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":250,"time":{"ms":250}},"total":{"ticks":343,"time":{"ms":343},"value":343},"user":{"ticks":93,"time":{"ms":93}}},"handles":{"open":664},"info":{"ephemeral_id":"46f26124-44e5-4733-a259-4bed65d07a05","uptime":{"ms":32977}},"memstats":{"gc_next":9518416,"memory_alloc":6349856,"memory_total":10791408,"rss":39120896},"runtime":{"goroutines":28}},"filebeat":{"events":{"added":2,"done":2},"harvester":{"open_files":1,"running":1,"started":1}},"libbeat":{"config":{"module":{"running":0}},"output":{"type":"logstash"},"pipeline":{"clients":1,"events":{"active":0,"filtered":2,"total":2}}},"registrar":{"states":{"current":1,"update":2},"writes":{"success":2,"total":2}},"system":{"cpu":{"cores":8}}}}}

But no results appear in Elasticsearch.


2 Answers


Try removing the quotation marks around the port number:

input {
    beats {
        type => "test"
        port => 5044
    }
}

Change the input section in logstash.conf and try again:

input {
  beats { port => 5044 }
}