2
votes

Which Scala version works with Spark 2.2.0? I'm getting the following error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;

Is that happening with ScalaTest? – Avishek Bhattacharya
I got this error fixed and now came up with a new one. The error was removed by adding a dependency in build.sbt. – Hussain Asghar
I'm still getting the error. I think it's a version conflict, but I've tried everything and it still doesn't work. – Hussain Asghar
Similar question here. – geo
name := "Scala-Spark" version := "1.0" scalaVersion := "2.11.8" // mvnrepository.com/artifact/org.apache.spark/spark-core_2.10 libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0" copy & paste this in build.sbt. It will definately work.Hussain Asghar

4 Answers

3
votes

From the documentation here:

Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.2.0 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).

1
votes

Spark 2.2.0 is built and distributed to work with Scala 2.11 by default. To write applications in Scala, you will need to use a compatible Scala version (e.g. 2.11.x). Your Scala version might be 2.12.x, which is why it is throwing the exception.
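If you are not sure which Scala version your application actually runs on, a minimal check is to print it from the standard library (a sketch; the object name is just an example):

    // Prints the Scala version at runtime, e.g. "version 2.11.8".
    // If this reports 2.12.x while spark-core_2.11 is on the classpath,
    // that binary mismatch would explain the NoSuchMethodError.
    object ScalaVersionCheck extends App {
      println(scala.util.Properties.versionString)
    }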

1
votes

To select the appropriate Scala version for your Spark application, you can run spark-shell on the target server. The desired Scala version is shown in the welcome message:

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0.2.6.3.0-235
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_152)

It is 2.11.8 in my Spark distribution.
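If the banner has already scrolled away, the same information can be printed from inside the shell (output shown for the 2.11.8 distribution above):

    scala> scala.util.Properties.versionString
    res0: String = version 2.11.8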

There are also pages on the MVN repository that list the Scala version for each Spark artifact:

https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11

https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.12
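The _2.11 / _2.12 suffix in those artifact names is the Scala binary version. In sbt you can let the %% operator append the suffix that matches your scalaVersion, which avoids this kind of mismatch; a sketch assuming Scala 2.11.8:

    scalaVersion := "2.11.8"

    // %% appends the Scala binary version, so this resolves to spark-core_2.11
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"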

0
votes

Spark 2.2.0 needs Java 8+ and Scala 2.11. That's about it for the version info.

But looking at your error, "Exception in thread "main" java.lang.NoSuchMethodError", it seems your Spark is unable to find the driver class.

You should probably be looking in that direction rather than at versions.