
I have configured the Elastic Stack (Logstash + Elasticsearch + Kibana) with Filebeat. My question: I have multiple servers where I have deployed my application instances (microservices). I want to capture logs from all of these servers, but it seems I have to install Filebeat on each server. Is that the correct understanding? Or can a single Filebeat instance be configured to fetch logs from all the servers (the servers can be on the same network) and send them over TCP or some other protocol?


2 Answers


Yes, you will have to deploy Filebeat on every server from which you wish to scrape logs.
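For reference, here is a minimal sketch of the filebeat.yml each application server would run; the log path and Logstash host below are placeholders, not values from the question:

filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log     # wherever your microservice writes its logs

output.logstash:
  hosts: ["logstash-host:5044"]  # your central Logstash box

On the Logstash side, a matching beats input listens on that port:

input {
  beats {
    port => 5044
  }
}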


Another option is to configure Logstash to listen on a TCP port and have your applications log to a socket instead of a file.

input {
  tcp {
    port => 8192
    codec => json
    tags => [ "micrologs" ]
  }
}

This sets up a listener on the Logstash box on port 8192. With the json codec, events arrive one at a time, with a new connection for each event, formatted as JSON.

input {
  tcp {
    port => 8192
    codec => json_lines
    tags => [ "micrologs" ]
  }
}

This does the same, except the connection is persistent: the json_lines codec breaks the stream into events on the newlines between JSON documents.
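To illustrate the application side, here is a minimal Python sketch of a client shipping events in the json_lines style; the host name and event fields are invented for the example:

import json
import socket

# Illustrative event; these field names are not required by Logstash.
event = {"service": "orders", "level": "INFO", "message": "order created"}

# json_lines expects one JSON document per newline-terminated line,
# so the socket can stay open and ship many events over one connection.
with socket.create_connection(("logstash-host", 8192)) as sock:  # placeholder host
    sock.sendall((json.dumps(event) + "\n").encode("utf-8"))

In practice a logging library's TCP/JSON appender would do this for you; the raw socket just shows the wire format.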

You don't have to use JSON here; it can be plain text if you need it. I used JSON as an example of a structured log.
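For plain text, you can simply drop the codec setting; as far as I recall, the tcp input's default codec is line, which treats each newline-terminated line as one event:

input {
  tcp {
    port => 8192
    tags => [ "micrologs" ]
  }
}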