
I am getting an error when trying to run plain Scala code in Spark, similar to the problems described in these posts: this and this

Their problem was that they had compiled their Spark project with the wrong Scala version. As far as I can tell, though, mine is compiled with the correct one.

I have Spark 1.6.0 installed on an AWS EMR cluster to run the program. The project is compiled on my local machine, where Scala 2.11 is installed and 2.11 is listed in all dependencies and build files, with no references to 2.10 anywhere.
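For reference, the relevant part of my build looks roughly like this (trimmed down; the dependency list is abbreviated):

scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.0" % "provided",
  "org.apache.spark" %% "spark-sql" % "1.6.0" % "provided"
)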

This is the exact line that throws the error:

var fieldsSeq: Seq[StructField] = Seq()

And this is the exact error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.ObjectRef.create(Ljava/lang/Object;)Lscala/runtime/ObjectRef;
at com.myproject.MyJob$.main(MyJob.scala:39)
at com.myproject.MyJob.main(MyJob.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Could you show an example? – Alberto Bonsanto
Spark 1.6 is based on Scala 2.10, not 2.11. Did you compile Spark specifically for 2.11? The accepted answer on your second link describes exactly this problem: a binary compiled for 2.11 running on a Spark that was built for 2.10. – Robert Horvick
I just looked that up on the Spark website and you're right -- Spark 1.6 is still based on 2.10. A coworker told me that 1.6 upped the Scala version to 2.11, so that's what I've been using the past couple weeks. My other Spark jobs have worked perfectly fine too. – mcmcmc

1 Answer


Spark 1.6 on EMR is still built with Scala 2.10, so yes, you are hitting the same issue as in the posts you linked. The NoSuchMethodError on scala.runtime.ObjectRef.create is the telltale sign: that factory method exists in the Scala 2.11 runtime but not in 2.10, so bytecode compiled against 2.11 fails as soon as it runs on the 2.10 runtime shipped with the cluster. To use Spark on EMR, you currently must compile your application with Scala 2.10.
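As a sketch, the fix in an sbt build is just pinning the Scala version back to 2.10 (the module list here is illustrative; keep whichever Spark modules you already depend on):

scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.0" % "provided",
  "org.apache.spark" %% "spark-sql" % "1.6.0" % "provided"
)

The %% operator makes sbt resolve the _2.10 variants of the Spark artifacts, so the binary you submit matches the runtime the cluster ships with.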

Spark has upgraded its default Scala version to 2.11 as of Spark 2.0 (to be released within the next several months), so once EMR supports Spark 2.0, we will likely follow that new default and build Spark with Scala 2.11.
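In the meantime, a quick way to confirm which Scala version the cluster's Spark runtime uses is to check from spark-shell on the master node (output below is illustrative; the exact 2.10.x version may differ):

scala> util.Properties.versionString
res0: String = version 2.10.5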