We want to collect logs with Logstash and forward them to Kafka. We have written the following conf file for Logstash 1.5.0.beta1 and Kafka kafka_2.9.2-0.8.1.1 (Kafka 0.8.1.1 built for Scala 2.9.2):
input {
  file {
    type => "apache"
    path => ["/var/log/apache2/access.log", "/var/log/apache2/error.log"]
  }
}
output {
  kafka {
    codec => plain {
      format => "%{message}"
    }
    topic_id => "example1"
  }
}
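We left the kafka output's broker settings at their defaults. If we are reading the plugin documentation correctly, spelling them out explicitly would look roughly like this (the broker address localhost:9092 is our assumption about the default, not something we have confirmed):

output {
  kafka {
    codec => plain {
      format => "%{message}"
    }
    topic_id => "example1"
    # assumed default broker address; adjust if the Kafka broker runs elsewhere
    broker_list => "localhost:9092"
  }
}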
After running the following command:

bin/logstash agent -f test.conf --log ex.log

where test.conf is our conf file and ex.log is an empty file we created so that Logstash's own log output is written there, we get the following output:
Sending logstash logs to ex.log.
Using milestone 2 input plugin 'file'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.5.0.beta1/plugin-milestones {:level=>:warn}
Using milestone 1 output plugin 'kafka'. This plugin should work, but would benefit from use by folks like you. Please let us know if you find bugs or have suggestions on how to improve this plugin. For more information on plugin milestones, see http://logstash.net/docs/1.5.0.beta1/plugin-milestones {:level=>:warn}
log4j:WARN No appenders could be found for logger (kafka.utils.VerifiableProperties).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
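For reference, the topic can be checked on the Kafka side with the console consumer bundled with 0.8.1.1 (assuming ZooKeeper is running on its default localhost:2181; adjust otherwise):

bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic example1 --from-beginning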
We tried setting the CLASSPATH in .bashrc (roughly as sketched below), but that did not help. Please tell us where we are going wrong. Thank you in advance!
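To be concrete, by setting the CLASSPATH we mean something along these lines in ~/.bashrc (the Kafka install path below is illustrative only):

# illustrative only: put the jars bundled with Kafka (log4j, slf4j, ...) on the classpath
export CLASSPATH="$CLASSPATH:/opt/kafka_2.9.2-0.8.1.1/libs/*"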