I am currently running the ELK stack in Docker (https://github.com/deviantony/docker-elk) and have a standalone Java application from which I am trying to send logs to Logstash using the log4j SocketAppender. When I view my logs in Kibana, the messages appear to be encoded incorrectly. I am very new to the ELK stack and have tried many of the solutions I've found on here, but nothing seems to work. Thanks in advance for any help.

logstash.conf:

input {
    log4j {
        mode => "server"
        host => "0.0.0.0"
        port => 5000
        type => "log4j"
      }
}

## Add your filters / logstash plugins configuration here

filter {
    # All lines that do not start with %{TIMESTAMP} or ' ' + %{TIMESTAMP} belong to the previous event
    multiline {
       pattern => "(([\s]+)20[0-9]{2}-)|20[0-9]{2}-"
       negate => true
       what => "previous"
    }
}

output {
    elasticsearch {
        hosts => "elasticsearch:9200"
    }
}

log4j.properties:

log4j.rootLogger=info,tcp

log4j.appender.tcp=org.apache.log4j.net.SocketAppender
log4j.appender.tcp.Port=5000
log4j.appender.tcp.RemoteHost=localhost
log4j.appender.tcp.ReconnectionDelay=10000
log4j.appender.tcp.Application=hello-world
# The appender above is named "tcp" ("myappender" referred to no appender)
log4j.appender.tcp.encoding=UTF-8
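
For context, the Java side needs nothing special. Here is a minimal sketch of a client that logs through this appender (assuming log4j 1.x on the classpath; the class name and messages are placeholders, not part of my actual application):

import org.apache.log4j.Logger;
import org.apache.log4j.PropertyConfigurator;

public class HelloWorld {
    private static final Logger LOG = Logger.getLogger(HelloWorld.class);

    public static void main(String[] args) {
        // Load the log4j.properties shown above; log4j would also pick up
        // a log4j.properties found on the classpath without this call.
        PropertyConfigurator.configure("log4j.properties");

        // Each event is serialized by the SocketAppender and shipped to
        // the logstash log4j input listening on localhost:5000.
        LOG.info("Hello from hello-world");
        LOG.error("Sample error with a stack trace", new RuntimeException("boom"));
    }
}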

Kibana log: [screenshot not reproduced here]

1 Answer

It turns out this was related to running in a Windows environment. Running from a Linux environment solved the encoding issue. I'm not sure whether there is a way to solve the encoding problem while on Windows.

The Logstash config, with multiline support over TCP, that worked for me:

input {
    log4j {
        mode => "server"
        host => "0.0.0.0"
        port => 5000
        type => "log4j"
        codec => multiline {
            pattern => "^\s"
            what => "previous"
        }
      }
}

## Add your filters / logstash plugins configuration here

output {
    elasticsearch {
        hosts => "elasticsearch:9200"
    }
}
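
A note on the codec: the "^\s" pattern matches any line that begins with whitespace (for example, the indented "at ..." frames of a Java stack trace), and what => "previous" folds those lines into the preceding event. A hedged sketch for generating such an event from the same application, to check the behaviour in Kibana (the class name is again a placeholder):

import org.apache.log4j.Logger;

public class MultilineDemo {
    private static final Logger LOG = Logger.getLogger(MultilineDemo.class);

    public static void main(String[] args) {
        try {
            throw new IllegalStateException("demo failure");
        } catch (IllegalStateException e) {
            // The stack-trace frames are indented, so the whole trace
            // should show up in Kibana as one event, not one per line.
            LOG.error("Something went wrong", e);
        }
    }
}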