I want to log to MapR-DB from a Spark job using log4j. I have written a custom appender, and here is my log4j.properties:
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
log4j.appender.MapRDB=com.datalake.maprdblogger.Appender
log4j.logger.testest=WARN, MapRDB
It is placed in the src/main/resources directory.
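A file under src/main/resources is normally bundled into the application jar by the build. One quick sanity check (a sketch; the jar path below is a placeholder for your actual artifact) is to list the jar contents and confirm the file is really there:

```shell
# List the application jar and check that log4j.properties was packaged
# at the root of the classpath (jar name/path is a placeholder).
jar tf target/myapp-1.0.jar | grep log4j.properties
```

If the file is missing from the jar, log4j cannot find it on the classpath at runtime, regardless of any Spark options.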
This is my main method:
import org.apache.log4j.Logger

object App {
  val log: Logger = org.apache.log4j.LogManager.getLogger(getClass.getName)

  def main(args: Array[String]): Unit = {
    // fill the custom appender's context before logging
    LogHelper.fillLoggerContext("dev", "test", "test", "testest", "")
    log.error("bad record.")
  }
}
When I run spark-submit without any extra configuration, nothing happens; it is as if my log4j.properties were not there.
If I deploy my log4j.properties file manually and add the options:
--conf spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/PATH_TO_FILE/log4j.properties
--conf spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/PATH_TO_FILE/log4j.properties
it works well. Why doesn't it work without these options?
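For reference, a commonly used deployment pattern (a sketch under assumptions: the jar name, class name, and paths are placeholders) is to ship the file with --files so executors get a local copy, and to add -Dlog4j.debug so log4j prints where it actually searches for its configuration:

```shell
# Sketch: ship log4j.properties alongside the job and enable log4j's
# internal debug output to trace which configuration file gets loaded.
spark-submit \
  --files /PATH_TO_FILE/log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/PATH_TO_FILE/log4j.properties -Dlog4j.debug" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties -Dlog4j.debug" \
  --class App \
  myapp.jar
```

On the executors, the file shipped via --files lands in the container's working directory, which is why the executor option can reference it by bare name.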
Comments:

src/main/resources instead? – Jacek Laskowski

log4j.appender.stdout.Target (uppercase) should be log4j.appender.stdout.target (lowercase). – Jacek Laskowski

Use -Dlog4j.debug and see where log4j searches for the properties. – Jacek Laskowski