I am doing centralized logging with Logstash: logstash-forwarder runs on the shipper node and the ELK stack on the collector node. The issue is that I want Logstash to parse a file that lives on the shipper node from the beginning. The config file logstash-forwarder.conf on the shipper has the following configuration:
{
  "network": {
    "servers": [ "XXX.XX.XX.XXX:5000" ],
    "timeout": 15,
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
  },
  "files": [
    {
      "paths": [
        "/apps/newlogs.txt"
      ],
      "fields": { "type": "syslog" }
    }
  ]
}
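For what it's worth, I start the forwarder against this config roughly like this (sketch from memory; the config path is just where I keep the file):

logstash-forwarder -config /etc/logstash-forwarder.conf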
And the collector configuration is:
input {
  lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:logdate}\s%{LOGLEVEL:level}\s-\s%{WORD:USE_CASE}\s:\s%{WORD:STEP_DETAIL}\s:\s\[%{WORD:XXX}\]\s:\s(?<XXX>([^\s]+))\s:\s%{GREEDYDATA:MESSAGE_DETAILS}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
      add_tag => [ "level:%{level}" ]
      add_tag => [ "USE_CASE:%{USE_CASE}" ]
    }
  }
}
output {
  elasticsearch { host => "localhost" }
  stdout { codec => rubydebug }
}
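For context, the grok pattern above is meant to pick apart application log lines of roughly this shape (a made-up sample line, since I have redacted the real field names as XXX):

2015-06-18 10:15:30,123 INFO - Login : Validate : [Session] : user42 : processing finished successfully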
I want the file to be parsed from the beginning rather than only for newly generated events. With a plain file input in logstash.conf this is easy by specifying start_position => beginning, but I cannot find a straightforward way to do the same in logstash-forwarder, since the file lives on the shipper side.
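For comparison, this is the kind of file input I mean, which reads a local file from the start when Logstash runs on the same node as the file (just a minimal sketch; the path is the one from the forwarder config above):

input {
  file {
    path => "/apps/newlogs.txt"          # local file on the node running Logstash
    start_position => "beginning"        # read the whole file, not just new lines
    type => "syslog"
  }
}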
Thanks.