I have a single host containing multiple log files. Two of these log files are csab and oneapplogs.
The corresponding grok patterns are:
oneapplogs :
match => {"message" => "\[%{WORD:fixed}\|%{DATA:time}\|%{WORD:word1}\|%{WORD:word2}\|%{WORD:num1}\|%{WORD:num2}\|%{WORD:loglevel}\|%{WORD:num3}\]%{GREEDYDATA:message}"}
csab :
match => {"message" => "\[%{NONNEGINT:fixed}\|%{DATA:time}\|%{WORD:word1}\|%{NONNEGINT:num1}\|%{NONNEGINT:num2}\]\[%{USERNAME:word2}\:%{NONNEGINT:num3}\] %{WORD:word1} : %{USERNAME:fixed} = %{NONNEGINT:num5}"}
Since these are different logs, when I send both to Elasticsearch through Logstash I use two separate Logstash conf files, each listening on a different port for input from Filebeat.
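In outline, the two conf files look something like this (hosts, index names, and the elided grok are placeholders for my actual values):

```conf
# oneapplogs.conf -- receives oneapplogs events from Filebeat
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    # the oneapplogs pattern shown above goes here
    match => { "message" => "..." }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "oneapplogs-%{+YYYY.MM.dd}"
  }
}

# csab.conf -- identical shape, but with port => 5045,
# the csab grok pattern, and a csab-* index
```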
I am not able to run multiple Filebeats at the same time. I've read that doing so requires configuring completely independent Filebeat instances on the machine, but I have over 60 logs, so setting up 60 instances seems impractical.
Is there any way to send all the files through one Filebeat instance to Logstash, and then have Logstash route the logs to multiple different Elasticsearch outputs?
Simpler explanation
I have two logs on the same machine that I need to process using the ELK stack.
I have configured one Logstash service with two pipelines, each listening on its own port: Pipeline1 (port 5044) and Pipeline2 (port 5045).
Now I want to send data to Logstash using Filebeat. There are two types of log file, say log1 and log2, and I want log1 to go to Pipeline1 and log2 to Pipeline2.
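What I would like to express in filebeat.yml is something like the sketch below (paths are placeholders for my real log locations). As far as I can tell, though, Filebeat only allows a single `output.logstash` block, so I don't see how to direct each input to a different port:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/log1*.log    # placeholder path for log1
    tags: ["log1"]
  - type: log
    paths:
      - /var/log/app/log2*.log    # placeholder path for log2
    tags: ["log2"]

# Only one Logstash output block is allowed, so per-input
# routing to port 5044 vs 5045 doesn't seem possible here:
output.logstash:
  hosts: ["localhost:5044"]
```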
Is this possible with just one Filebeat instance?
Or is there some other workaround for processing different logs from the same host?
Any help or suggestions would be really appreciated. Thanks a lot!