I'm having an issue with Filebeat on Windows XP. A CSV file is written every 10–15 seconds by an application that collects measurement data. Filebeat re-reads this file from scratch every time, which produces duplicated rows. Any advice?
filebeat.prospectors:
- input_type: log
  paths:
    - C:\nms2k\ems\measure\PERF*.csv
  include_lines: ['RTT','RTJ']
logging.level: debug
#----------------------------- Logstash output --------------------------------
output.logstash:
  hosts: ["xxx.xxx.xxx.xxx:1000"]
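For context, a common cause of this behavior is the writing application truncating and rewriting the file (or replacing it), which resets the offset Filebeat has recorded in its registry, so the whole file is shipped again. A minimal sketch of prospector options that influence when files are scanned, closed, and forgotten, assuming Filebeat 5.x (the option names come from the Filebeat prospector reference; the values are illustrative assumptions, and none of these deduplicate a file that is truncated and fully rewritten):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - C:\nms2k\ems\measure\PERF*.csv
  include_lines: ['RTT','RTJ']
  # Illustrative values, not verified against this setup:
  scan_frequency: 10s    # how often the paths are checked for changes
  close_inactive: 1m     # release the file handle after 1 minute without new lines
  ignore_older: 5m       # skip files whose last modification is older than 5 minutes
```

If the application really does rewrite the file from scratch on every cycle, a more robust approach may be deduplicating downstream (for example, computing a deterministic document ID in Logstash from the row contents) rather than tuning the prospector.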
Here is my registry.log: https://pastebin.com/MbPVgH5S
and here is my filebeat.log: https://pastebin.com/A8tqukQT