
I'm trying to use Logback as the logger in Spark Streaming. When I submit the job through spark-submit, I get the exception below.

Exception in thread "main" java.lang.ClassCastException: org.slf4j.impl.Log4jLoggerFactory cannot be cast to ch.qos.logback.classic.LoggerContext
    at consumer.spark.LogBackConfigLoader.<init>(LogBackConfigLoader.java:18)
    at consumer.spark.Sample.main(Sample.java:18)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
    at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:169)
    at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:167)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

My pom.xml is:

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <slf4j.version>1.6.1</slf4j.version>
    <logback.version>1.2.3</logback.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-api</artifactId>
        <version>${slf4j.version}</version>
    </dependency>
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-classic</artifactId>
        <version>${logback.version}</version>
    </dependency>
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-core</artifactId>
        <version>${logback.version}</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>3.8.1</version>
        <scope>test</scope>
    </dependency>
</dependencies>

My Logback configuration code is:

LoggerContext lc = (LoggerContext) LoggerFactory.getILoggerFactory();
JoranConfigurator configurator = new JoranConfigurator();
configurator.setContext(lc);
configurator.doConfigure(externalConfigFileLocation);

My spark-submit command is:

~/spark-2.1.1-bin-hadoop2.6/bin/spark-submit --master yarn --deploy-mode client --driver-memory 4g --executor-memory 2g --executor-cores 4 --class consumer.spark.Sample ~/SparkStreamingJob/log_testing.jar ~/SparkStreamingJob/spark-jobs/config/conf/logback.xml

What are the imported packages for your logback code? It makes it hard to see which implementation you're using exactly. Specifically, which package does LoggerFactory come from? – Alex Savitsky

1 Answer


It appears there are two issues here:

SLF4J is a facade over logging implementations, which means you can switch between logging frameworks without code changes. It also means you should not use the core classes of the underlying implementation directly. SLF4J resolves the logging implementation by itself, and the logger and factory objects it hands out are bound to that implementation (in your case, Logback). All of this means you cannot explicitly cast the logger or factory provided by SLF4J to Logback API types.
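
As an illustration only (a minimal sketch, reusing the Sample class name from your stack trace, not your actual code), logging purely through the SLF4J facade looks like this; only SLF4J types appear, and whichever binding is on the classpath receives the calls at runtime:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class Sample {
    // Only SLF4J types are referenced; the bound implementation
    // (Logback, Log4j, ...) is resolved from the classpath at runtime.
    private static final Logger LOG = LoggerFactory.getLogger(Sample.class);

    public static void main(String[] args) {
        LOG.info("Streaming job starting with {} argument(s)", args.length);
    }
}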

Also, it appears SLF4J is resolving Log4jLoggerFactory instead of Logback's logger context, so the binding between SLF4J and Logback is not succeeding; most likely Spark's own SLF4J-to-Log4j binding on the classpath is being picked up ahead of Logback.
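
If you still want to load an external logback.xml programmatically, a defensive sketch along the lines of your LogBackConfigLoader (assumed structure, not your actual class) would first check which factory SLF4J is actually bound to before casting, so a foreign binding fails with a clear message instead of a ClassCastException:

import org.slf4j.ILoggerFactory;
import org.slf4j.LoggerFactory;

import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.joran.JoranConfigurator;
import ch.qos.logback.core.joran.spi.JoranException;

public class LogBackConfigLoader {

    public static void configure(String externalConfigFileLocation) throws JoranException {
        ILoggerFactory factory = LoggerFactory.getILoggerFactory();
        if (!(factory instanceof LoggerContext)) {
            // A different binding (e.g. org.slf4j.impl.Log4jLoggerFactory from the
            // slf4j-log4j12 jar that Spark ships) answered the SLF4J binding, so a
            // plain cast would fail exactly like the ClassCastException above.
            throw new IllegalStateException(
                    "SLF4J is bound to " + factory.getClass().getName() + ", not Logback");
        }
        LoggerContext lc = (LoggerContext) factory;
        JoranConfigurator configurator = new JoranConfigurator();
        configurator.setContext(lc);
        lc.reset(); // clear any configuration Logback auto-loaded at startup
        configurator.doConfigure(externalConfigFileLocation);
    }
}

Note that on a stock Spark distribution the slf4j-log4j12 binding usually sits in Spark's jars directory, so even with the check above Logback will not be selected until that jar is removed or your application classpath is given precedence (for example via the spark.driver.userClassPathFirst setting); verify both options against your Spark 2.1.1 installation.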