
I'm trying to use Spark Streaming with Scala, but I'm getting errors and can't figure out why.
The StreamingContext line is the one producing the errors:

val sparkConf = new SparkConf().setAppName("App_StreamingConsumer")
val ssc = new StreamingContext(sparkConf, Seconds(2))

These are the 2 errors:

bad symbolic reference. A signature in StreamingContext.class refers to term conf in package org.apache.hadoop which is not available. It may be completely missing from the current classpath, or the version on the classpath might be incompatible with the version used when compiling StreamingContext.class.

And:

missing or invalid dependency detected while loading class file 'StreamingContext.class'. Could not access term conf in package org.apache.hadoop, because it (or its dependencies) are missing. Check your build definition for missing or conflicting dependencies. (Re-run with -Ylog-classpath to see the problematic classpath.) A full rebuild may help if 'StreamingContext.class' was compiled against an incompatible version of org.apache.hadoop.

This question has been asked before: Spark Streaming StreamingContext error. The errors seem to come from a dependency issue, but as far as I can tell my dependencies are all in order.

2 Answers


I just needed to include hadoop-core in my dependencies. I had hadoop-client and didn't realise I also needed hadoop-core.
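
For reference, a minimal build.sbt sketch with both artifacts. The version numbers are assumptions for illustration, not taken from the question — match them to the Spark and Hadoop versions actually on your cluster (note that hadoop-core is a Hadoop 1.x artifact; in Hadoop 2.x it was split into hadoop-common and friends):

```scala
// build.sbt — illustrative sketch; align versions with your installation
name := "App_StreamingConsumer"

scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "1.6.3",
  "org.apache.spark" %% "spark-streaming" % "1.6.3",
  "org.apache.hadoop" % "hadoop-client"   % "2.6.0",
  "org.apache.hadoop" % "hadoop-core"     % "1.2.1"  // hadoop-core only exists as a 1.x artifact
)
```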


I encountered a similar problem; the cause was a missing hadoop dependency on the build path. Here's my solution: in the build.sbt file, add "org.apache.hadoop" % "hadoop-hdfs" % "2.6.0", then run the command sbt eclipse, which automatically adds the hadoop dependency package to the project's build path.

Note: be sure to use the sbt eclipse command, not sbt update.
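
The step above can be sketched as a build.sbt fragment (sbt eclipse comes from the sbteclipse plugin, which is an assumption about this setup — it must be declared in project/plugins.sbt for the command to exist):

```scala
// build.sbt — add the hadoop-hdfs dependency from the answer above
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.6.0"
```

```scala
// project/plugins.sbt — sbteclipse provides the `sbt eclipse` command
// (plugin version is illustrative)
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "5.2.4")
```

After adding both, running sbt eclipse regenerates the Eclipse .classpath so the hadoop jars appear on the project's build path.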