Which Scala version works with Spark 2.2.0? I'm getting the following error:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
From the documentation here:
Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.2.0 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).
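If you build with sbt, one way to stay consistent with that requirement is to pin a 2.11.x compiler next to the Spark 2.2.0 dependency. A minimal build.sbt sketch (the project name and the 2.11.8 patch version are just examples; any 2.11.x should work):

    // build.sbt -- minimal sketch, names and patch version are illustrative
    name := "spark-app"

    // Spark 2.2.0 is built against Scala 2.11, so use a 2.11.x compiler
    scalaVersion := "2.11.8"

    // %% appends the Scala binary suffix (_2.11) to the artifact name automatically
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"

The NoSuchMethodError in the question is the typical symptom of mixing binary-incompatible Scala versions (e.g. compiling against 2.12 while Spark 2.2.0 ships 2.11).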
To select the appropriate Scala version for your Spark application, you can run spark-shell on the target server. The Scala version you need is shown in the welcome message:
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/ '_/
       /___/ .__/\_,_/_/ /_/\_\   version 2.2.0.2.6.3.0-235
          /_/

    Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_152)
It is 2.11.8 in my Spark distribution.
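Alternatively, the running Scala version can be queried directly at the spark-shell prompt; this just reads a standard Scala library property, nothing Spark-specific:

    scala> scala.util.Properties.versionString
    res0: String = version 2.11.8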
There are also pages on the MVN repository that list the Scala version for a given Spark distribution:
https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11
https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.12
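The _2.11 / _2.12 suffix in those artifact names is the Scala binary version the jar was compiled against. In sbt you can either let %% add the suffix from your scalaVersion, or spell out the artifact id exactly as it appears on those pages (a sketch; the versions follow the links above):

    // two equivalent ways to pull spark-core built for Scala 2.11
    libraryDependencies += "org.apache.spark" %% "spark-core"      % "2.2.0"  // suffix derived from scalaVersion
    libraryDependencies += "org.apache.spark" %  "spark-core_2.11" % "2.2.0"  // suffix written out explicitly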