0 votes

I am using Filebeat to collect logs from a remote server and ship them to Logstash, and that part works fine. But when new lines are appended to the source log file, Filebeat reads the whole file from the beginning again and sends everything to Logstash, and Logstash indexes all of those lines into Elasticsearch on top of the older ones, even though the older lines are already there. So the logs end up duplicated.

So my question is: how do I ship only the newly added log lines to Logstash? Once new lines are appended to the log file, only those new lines should go through Logstash into Elasticsearch.

Here is my filebeat.yml

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/user/Documents/ELK/locals/*.log

And here is my Logstash input, logstash-input.conf:

input {
  beats {
    port => 5044
  }
}

2 Answers

1 vote

I assume you're making the same mistake I made when testing Filebeat a few months back (using the vi editor to manually update the log/text file). When you edit a file manually with vi, it creates a new file on disk with new metadata. Filebeat identifies the state of a file by its metadata, not by its text, and therefore reloads the complete log file.

If this is the case, try appending to the file instead, like this: echo "something" >> /path/to/file.txt
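You can verify this yourself by comparing the file's inode before and after each kind of edit. A quick check, using the path from the question (the file name app.log is just a placeholder):

# note the inode of the log file
stat -c '%i' /home/user/Documents/ELK/locals/app.log

# appending in place keeps the same inode, so Filebeat resumes from its saved offset
echo "something" >> /home/user/Documents/ELK/locals/app.log
stat -c '%i' /home/user/Documents/ELK/locals/app.log

# saving from vi usually writes a new file and renames it over the old one,
# so the inode changes and Filebeat treats it as a brand-new file
vi /home/user/Documents/ELK/locals/app.log
stat -c '%i' /home/user/Documents/ELK/locals/app.log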

For more: Read this

0 votes

There seems to be a problem somewhere. Usually Filebeat is intelligent enough to save the offset, meaning that it only ships log lines that have been added since the last crawl.
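If you want to see what Filebeat has actually recorded, you can inspect its registry. The exact location depends on the version and how Filebeat was installed, so treat the paths below as assumptions (deb/rpm installs usually keep it under /var/lib/filebeat; newer versions use a registry directory, older ones a single registry file):

# list the registry of a recent deb/rpm install (path is an assumption)
sudo ls /var/lib/filebeat/registry/filebeat/

# each entry records the source path, inode/device and the byte offset already shipped
sudo grep -o '"offset":[0-9]*' /var/lib/filebeat/registry/filebeat/log.json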

There are a few settings that could possibly interfere with that; please read up on them:

- ignore_older
- close_inactive (https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-log.html#filebeat-input-log-close-inactive)
- close_timeout (https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-log.html#filebeat-input-log-close-timeout)
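For reference, here is a minimal sketch of where those options sit in the input section, reusing the paths from the question; the values shown are just the documented defaults, not a recommendation:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/user/Documents/ELK/locals/*.log
  # 0 disables the age check, so no files are skipped for being too old
  ignore_older: 0
  # close the harvester once the file has not changed for this long
  close_inactive: 5m
  # 0 means harvesters are never closed because of a fixed timeout
  close_timeout: 0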

Please post your complete filebeat.yml, and also say which system you are running on and what type of logs you are trying to harvest.