1
vote

I'm new to Spark and downloaded the pre-compiled Spark binaries from Apache (spark-2.1.0-bin-hadoop2.7).

When submitting my Scala (2.11.8) uber jar, the cluster throws an error:

java.lang.IllegalStateException: Library directory '/root/spark/assembly/target/scala-2.10/jars' does not exist; make sure Spark is built

I'm not running Scala 2.10, and as far as I know Spark isn't compiled with Scala 2.10.

Could it be that one of my dependencies is based on Scala 2.10?
Any suggestions as to what could be wrong?
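
(For reference, one way to check whether a Scala standard library ended up inside the uber jar, and which version it was built from, is to look for the library.properties file that scala-library ships at its root; a minimal diagnostic sketch, with a hypothetical jar path:)

    import java.util.jar.JarFile
    import scala.io.Source

    // Minimal sketch: if the uber jar bundled a copy of scala-library,
    // its library.properties records the exact Scala version (e.g.
    // version.number=2.10.x). The jar path is a hypothetical placeholder.
    object CheckScalaVersion {
      def main(args: Array[String]): Unit = {
        val jar = new JarFile("/path/to/my-uber.jar")
        Option(jar.getEntry("library.properties")).foreach { entry =>
          Source.fromInputStream(jar.getInputStream(entry)).getLines().foreach(println)
        }
        jar.close()
      }
    }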

2
Can you share the list of dependencies that you are packaging in your uber jar? - himanshuIIITian
@himanshuIIITian "org.scalatest" %% "scalatest" % "3.0.1", "org.scalaj" %% "scalaj-http" % "2.3.0", "org.apache.spark" %% "spark-core" % "2.2.0" % "provided", "org.apache.spark" %% "spark-sql" % "2.2.0" % "provided", "org.apache.spark" %% "spark-yarn" % "2.2.0", "org.apache.hadoop" % "hadoop-client" % "2.8.1", "org.apache.hadoop" % "hadoop-yarn-client" % "2.8.1", "org.apache.hive" % "hive-jdbc" % "2.3.0" - Y. Eliash
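
(Note: with scalaVersion := "2.11.8" in the build, sbt's %% operator pins each of those artifacts to the _2.11 binary suffix, so none of them can silently resolve a 2.10 build. A minimal sketch of the equivalence, assuming an sbt build:)

    // In build.sbt, with scalaVersion := "2.11.8", these two lines resolve
    // to the same artifact (spark-core_2.11); a _2.10 artifact could only
    // enter the build via a plain % with an explicit _2.10 suffix.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
    libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0" % "provided"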

2 Answers

0
votes

Not sure what is wrong with the pre-built spark-2.1.0, but I've just downloaded Spark 2.2.0 and it is working great.
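
(Worth noting: the dependency list in the comments targets Spark 2.2.0, while the downloaded binaries were 2.1.0, so moving to the 2.2.0 distribution also brings the build and the cluster in line. A minimal build.sbt sketch under that assumption:)

    // build.sbt — minimal sketch, assuming the spark-2.2.0-bin-hadoop2.7
    // distribution from this answer; dependency versions match the cluster
    scalaVersion := "2.11.8"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.2.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "2.2.0" % "provided",
      // marking spark-yarn "provided" as well keeps a second copy of Spark's
      // classes out of the uber jar (the distribution already ships them)
      "org.apache.spark" %% "spark-yarn" % "2.2.0" % "provided"
    )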

0
votes

Try setting the SPARK_HOME environment variable to the location of your Spark installation, either system-wide or in your IDE's run configuration.
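
(If the job is launched programmatically rather than through the spark-submit script, SPARK_HOME can also be set explicitly via the launcher API, which is the component that raises the "Library directory ... does not exist" check. A minimal sketch using org.apache.spark.launcher.SparkLauncher from the "org.apache.spark" %% "spark-launcher" module; paths, class name, and master are hypothetical placeholders:)

    import org.apache.spark.launcher.SparkLauncher

    // Minimal sketch: submit the uber jar with SPARK_HOME set explicitly,
    // pointing at the unpacked binary distribution (hypothetical paths).
    object Submit {
      def main(args: Array[String]): Unit = {
        val process = new SparkLauncher()
          .setSparkHome("/opt/spark-2.2.0-bin-hadoop2.7")
          .setAppResource("/path/to/my-uber.jar")
          .setMainClass("com.example.Main")
          .setMaster("yarn")
          .launch()          // spawns spark-submit as a child process
        process.waitFor()    // block until the submission finishes
      }
    }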