3
votes

I have a Flink job that uses Logback as the logging framework, since the logs need to be sent to Logstash and Logback has a Logstash appender (provided by logstash-logback-encoder). The appender works fine, and I can see the application logs in Logstash when the Flink job is run from an IDE such as Eclipse. The logging configuration file logback.xml is placed in src/main/resources and gets included on the classpath. The logging also works fine when running the job from the command line outside the IDE.
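For context, a minimal logback.xml of the kind described here might look as follows. This is a sketch, not the actual configuration from the question: the destination host and port are placeholders, and LogstashTcpSocketAppender / LogstashEncoder are the classes shipped with logstash-logback-encoder.

<configuration>
    <!-- sends JSON-encoded log events to Logstash over TCP; host/port are placeholders -->
    <appender name="logstash" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>logstash-host:5044</destination>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>
    <root level="INFO">
        <appender-ref ref="logstash"/>
    </root>
</configuration>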

However, when I deploy this job on a Flink cluster (standalone, started using ./start-cluster.bat) through the Flink dashboard, the logback configuration is ignored and the logs are not sent to Logstash.

I read up more on Flink's logging mechanism and came across the documentation on configuring logback. The steps mentioned in this documentation work fine, with some additional steps such as adding the logstash-logback-encoder library to the lib/ folder along with the logback jars.
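Roughly, those steps amount to the following. This is a sketch: jar version numbers are placeholders, and the exact log4j jars shipped in lib/ depend on the Flink version.

# in the Flink distribution directory:
# remove the default log4j/slf4j-log4j bindings from lib/
rm lib/log4j-*.jar lib/slf4j-log4j12-*.jar
# add logback and the log4j bridge, as described in Flink's logback documentation
cp logback-classic-1.x.x.jar logback-core-1.x.x.jar log4j-over-slf4j-1.x.x.jar lib/
# additionally add the encoder used by the Logstash appender
cp logstash-logback-encoder-x.x.jar lib/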

Even though the steps mentioned above work, this is problematic, since the logback configuration in the flink/conf folder applies to the entire Flink setup and all the jobs running on it. Jobs cannot have their own logging configuration. For example, I want job1 to write to file, console, and Logstash, and job2 to write only to a file.

How can each Flink job that is started from the dashboard be supplied with a separate logging configuration? Is there any way a logging configuration can be passed while submitting the job on the dashboard?

Is there some way to force Flink to use the logging configuration on the classpath?


1 Answer

3
votes

Flink currently does not support specifying individual logging configurations per job. The logging configuration is always valid for the whole cluster.

A way to solve this problem is to start the jobs in per-job mode, meaning that you start a dedicated Flink cluster for every Flink job:

bin/flink run -m yarn-cluster -p 2 MyJobJar.jar
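If you go the per-job route, each cluster reads its configuration from the directory pointed to by the FLINK_CONF_DIR environment variable, which Flink's startup scripts honor. One way to give each job its own logging setup, assuming you maintain one configuration directory per job with its own logback.xml (a hypothetical layout, not something from the answer), would be:

# hypothetical per-job configuration directories, each containing its own logback.xml
export FLINK_CONF_DIR=/etc/flink/job1-conf
bin/flink run -m yarn-cluster -p 2 Job1Jar.jar

export FLINK_CONF_DIR=/etc/flink/job2-conf
bin/flink run -m yarn-cluster -p 2 Job2Jar.jar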