I am using Filebeat to collect logs from a remote server and ship them to Logstash, and that part works fine. The problem is that whenever new lines are appended to the source log file, Filebeat re-reads the whole file from the beginning and sends everything to Logstash again. Logstash then indexes the old entries into Elasticsearch alongside the copies that are already there, so the logs get duplicated.
My question is: how do I ship only the newly appended lines to Logstash? When new lines are added to the log file, only those lines should be sent to Logstash and on to Elasticsearch.
Here is my filebeat.yml:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    # watch every .log file in this directory
    - /home/user/Documents/ELK/locals/*.log
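For completeness, the output section of the same file is the standard Logstash output (the host here is a placeholder; the real config points at my Logstash server):

output.logstash:
  # placeholder address; the port matches the beats input below
  hosts: ["localhost:5044"]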
And here is the Logstash input, logstash-input.conf:
input {
  beats {
    # listen for Filebeat connections on the default beats port
    port => 5044
  }
}
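One workaround I am considering (just a sketch, not tested) is to deduplicate on the Logstash side instead: derive the Elasticsearch document ID from a fingerprint of each log line, so re-sent lines overwrite the existing document rather than creating a duplicate. The hosts value is a placeholder for my Elasticsearch address:

filter {
  fingerprint {
    # hash the raw log line into event metadata (metadata is not indexed)
    source => "message"
    target => "[@metadata][fingerprint]"
    method => "SHA1"
  }
}
output {
  elasticsearch {
    # placeholder address for my Elasticsearch node
    hosts => ["localhost:9200"]
    # identical lines get the same ID, so re-sends overwrite instead of duplicating
    document_id => "%{[@metadata][fingerprint]}"
  }
}

But I would rather fix Filebeat itself so it stops re-reading the file from the beginning.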