
I want to set up Elasticsearch with Kibana and Fluentd. I have two machines. Machine 1 is the source of the log files: a log is created for every HTTP request, and each request is stored in a separate log file. The log path structure is /path/year/month/day/hour/*.log. I want to use Elasticsearch with Kibana, which is installed on machine 2.

I've read about the Fluentd forwarder and aggregator, and I've also read about the forward output plugin, but I'm still not sure how to set up these machines. Usually a forwarder is used to send a single log file, but I have multiple files.

Can you please point me in the right direction on how to set up Fluentd on both machines, so that I can use Kibana on machine 2 to browse the logs from machine 1?

Format of one log file:

IP Address: xxx.xxx.xxx.xxx
Time: xx:xx:xx xxxx/xx/xx
Data: <xml>....</xml>
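
From reading the in_tail docs, I think this format would need a multiline parser. Here is a sketch of what I imagine the parse section might look like (the field names ip, time, and data are my own guesses, and I have not tested this):

```
# Sketch of a multiline parse for the log format above
# (field names ip/time/data are guesses, untested)
<parse>
  @type multiline
  format_firstline /^IP Address:/
  format1 /^IP Address: (?<ip>.+)$/
  format2 /^Time: (?<time>.+)$/
  format3 /^Data: (?<data>.+)$/
  time_key time
  time_format %H:%M:%S %Y/%m/%d
</parse>
```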

Any help would be much appreciated.

Thanks

What does each line of a log on machine 1 look like? Do the logs themselves have timestamps? – Kiyoto Tamura
Hi Kiyoto, thanks for the reply; I've just added the format of a single log file to the post. – Petr Velký

1 Answer


I am also researching big data and have an idea for your question. I think you can use the in_tail input plugin (http://docs.fluentd.org/articles/in_tail) on machine 1 to tail the logs and push them to machine 2, where you install the Elasticsearch output plugin (https://github.com/uken/fluent-plugin-elasticsearch). Hope it will be helpful!
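
To make this concrete, here is a sketch of what the two configurations might look like. The host name machine2.example.com, the tag requests, and the pos_file path are placeholders you would adapt; this assumes the standard forward protocol on port 24224:

```
# machine 1 (td-agent.conf) -- tail the per-request files and forward them
<source>
  @type tail
  # the * wildcards cover the year/month/day/hour directories
  path /path/*/*/*/*/*.log
  pos_file /var/log/td-agent/requests.pos
  tag requests.access
  <parse>
    @type none   # or a multiline parser matching your log format
  </parse>
</source>

<match requests.**>
  @type forward
  <server>
    host machine2.example.com   # placeholder: machine 2's address
    port 24224
  </server>
</match>
```

```
# machine 2 (td-agent.conf) -- receive events and write them to Elasticsearch
<source>
  @type forward
  port 24224
</source>

<match requests.**>
  @type elasticsearch
  host localhost         # Elasticsearch running on machine 2
  port 9200
  logstash_format true   # daily logstash-YYYY.MM.DD indices, which Kibana expects
</match>
```

With logstash_format enabled, the events land in daily indices that Kibana's default index pattern can pick up.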