9 votes

We are installing Elasticsearch and Kibana for log aggregation/analysis. The first system to use it is greenfield, so we output structured logs from the services that make up our system. Given that we don't need to add structure to our logs, I was planning on using Filebeat to ship the logs directly to Elasticsearch rather than going through Logstash. Is this a sensible option, or does Logstash have value over and above parsing that we might need? If we do use Logstash, can I use it to harvest the log files, or should I still use Filebeat to pump the logs to Logstash?


2 Answers

12 votes

Logstash is useful if you need to aggregate logs from many servers and apply some common transformations and filtering to your events.

If your log events are already structured and you are ok with indexing them directly, then you can definitely have Filebeat send them directly to ES. If ES goes down (e.g. for maintenance), Filebeat will retry until it can successfully send the events.
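For reference, a minimal filebeat.yml sketch for that direct-to-ES setup. The log path and Elasticsearch host are placeholders to adjust for your environment, and this uses the classic log input (newer Filebeat versions use the filestream input with slightly different JSON options):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.json    # hypothetical path to your structured logs
    json.keys_under_root: true   # lift the parsed JSON fields to the top level
    json.add_error_key: true     # flag events whose JSON fails to parse

output.elasticsearch:
  hosts: ["http://localhost:9200"]   # assumed local ES; point at your cluster
```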

5 votes

Is this a sensible option, or does Logstash have value over and above parsing that we might need?

Whether to use Logstash in your case depends on whether you need to process the logs before indexing them into ES.

In addition to parsing (which you apparently don't need for your use case), you can use Logstash to add a location with the geoip filter, parse a timestamp with the date filter, replace one word with another, replace a field with a hash of its value, etc.

You can have a look at the available filters here.
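As an illustration, here is a filter block combining a few of those. The field names (clientip, user_email, the password pattern) are hypothetical; your structured logs will have their own:

```
filter {
  geoip {
    source => "clientip"                # assumed field holding the client IP
  }
  date {
    match => ["timestamp", "ISO8601"]   # parse the event time into @timestamp
  }
  mutate {
    gsub => ["message", "password=\S+", "password=REDACTED"]   # replace one string with another
  }
  fingerprint {
    source => "user_email"              # replace a field with a hash of its value
    target => "user_id"
    method => "SHA256"
  }
}
```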

If we do use Logstash, can I use it to harvest the log files, or should I still use Filebeat to pump the logs to Logstash?

If you need Logstash and can afford to run it on the machine where your logs are produced, you can avoid Filebeat by using Logstash's file input.
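A minimal pipeline sketch for that case, assuming structured JSON logs at a hypothetical path:

```
input {
  file {
    path => "/var/log/myapp/*.json"   # hypothetical path to the log files
    codec => "json"                   # logs are already structured JSON
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]       # assumed ES endpoint
  }
}
```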

But keep in mind that Logstash, especially when used for parsing, can consume a lot of resources. It is better to run it on a separate machine and use Filebeat to ship the logs to Logstash.
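In that setup the two sides pair up roughly as follows (the Logstash hostname is a placeholder; 5044 is the conventional Beats port). On the application machine, in filebeat.yml:

```yaml
output.logstash:
  hosts: ["logstash.internal:5044"]   # hypothetical Logstash host
```

And on the Logstash machine, a beats input in place of the file input:

```
input {
  beats {
    port => 5044   # listen for events shipped by Filebeat
  }
}
```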