
I am trying to do a spark-submit to check compatibility with some simple Scala code:

println("Hi there")

val p = Some("pop")
p match {
  case Some(a) => println("Matched " + a)
  case _ => println("00000009")
}

Scala version: 2.12.5, Spark version: 2.4.6
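For reference, a build.sbt matching these versions and the jar name below would look roughly like this (a sketch reconstructed from the versions above; the spark-core dependency is only needed once actual Spark APIs are used):

name := "compatibility-check"
version := "0.1"
scalaVersion := "2.12.5"
// "provided" because spark-submit supplies the Spark classes at runtime
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.6" % "provided"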

Currently, after building the jar and running it through spark-submit 2.4.7, it gives:

Hi there
Exception in thread "main" java.lang.NoSuchMethodError: scala.Some.value()Ljava/lang/Object;
    at MangoPop$.main(MangoPop.scala:9)
    at MangoPop.main(MangoPop.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
log4j:WARN No appenders could be found for logger (org.apache.spark.util.ShutdownHookManager).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

From Maven, it seems Spark 2.4.6 supports Scala 2.12: https://mvnrepository.com/artifact/org.apache.spark/spark-core


But when running with spark-submit 3.0.2, it runs fine.
What am I missing with Spark 2.4.6?
(I also tried Spark 2.4.7, even though there are no actual Spark dependencies in the code, only Scala.)

Running spark-submit as:

~/Downloads/spark-2.4.7-bin-hadoop2.7/bin$  ./spark-submit --class=Test myprojectLocation..../target/scala-2.12/compatibility-check_2.12-0.1.jar


/spark-2.4.7-bin-hadoop2.7/bin$ ./spark-submit --version
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.7
      /_/
                        
Using Scala version 2.11.12, OpenJDK 64-Bit Server VM, 1.8.0_282
Branch HEAD
Compiled by user prashant on 2020-09-08T05:22:44Z
Revision 14211a19f53bd0f413396582c8970e3e0a74281d
Url https://prashant:Sharma1988%[email protected]/repos/asf/spark.git
Type --help for more information.

I also tried 2.4.6, downloaded from https://archive.apache.org/dist/spark/spark-2.4.6/, but could not find a build for Scala 2.12.


Can we also explicitly specify which Scala version to use when doing spark-submit or spark-shell? From the configuration it seems both are supported, but it used the lower one, i.e. 2.11.
This is the load-spark-env.cmd file:
rem Setting SPARK_SCALA_VERSION if not already set.

set ASSEMBLY_DIR2="%SPARK_HOME%\assembly\target\scala-2.11"
set ASSEMBLY_DIR1="%SPARK_HOME%\assembly\target\scala-2.12"
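Judging from the rem comment, SPARK_SCALA_VERSION is only auto-detected when it is not already set, so presumably it could be forced before launching, something like this (speculation; it would only matter when both assembly directories actually exist, e.g. in a build from source, since the pre-built downloads ship only one):

rem hypothetical: force the Scala 2.12 assembly before launching
set SPARK_SCALA_VERSION=2.12
bin\spark-submit --class=Test target\scala-2.12\compatibility-check_2.12-0.1.jar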
Change the Scala version to 2.11.8, and make sure that your main class is defined in build.sbt - itIsNaz
Yes, Spark 2.4.6 can be cross-compiled with Scala 2.12, but an installation has to pick one exact version, and it is very probable that your installation is using Scala 2.11 - Luis Miguel Mejía Suárez
Hi @NassereddineBelghith, actually I wanted to use Scala 2.12 - supernatural
Hi @LuisMiguelMejíaSuárez, I downloaded Spark 2.4.7 from spark.apache.org/news/spark-2-4-7-released.html, but 2.4.7 also seems to support Scala 2.12. BTW, which installation were you referring to? - supernatural
@supernatural the one you downloaded and installed; run spark-shell and it will show you which exact Scala version it is using. Ideally you should use that exact version, even the bugfix part (the third number). BTW, if you just want to learn Spark, I would recommend just running it from sbt instead of installing it on your computer; that way it will use the Scala version you set in your build.sbt file - Luis Miguel Mejía Suárez

1 Answer


The issue is that the runtime version of Spark is "Using Scala version 2.11.12" while your code (MangoPop$.main(MangoPop.scala:9)) was compiled with Scala 2.12.5.

Scala 2.11 and 2.12 are not binary compatible. In the 2.12 standard library the field of scala.Some is named value, so your 2.12-compiled bytecode calls scala.Some.value(), which does not exist in the Scala 2.11 library shipped with your Spark runtime, hence the NoSuchMethodError.

Make sure the Scala version your code is built with and the Scala version of the Spark runtime are the same.
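For example, with the pre-built spark-2.4.7-bin-hadoop2.7 distribution shown above (Scala 2.11.12 per the --version output), a minimal sketch of the matching sbt setting would be:

// match the Scala binary version of the Spark runtime
scalaVersion := "2.11.12"

sbt package then produces target/scala-2.11/compatibility-check_2.11-0.1.jar, which the Scala 2.11 based spark-submit can run. Alternatively, keep Scala 2.12 and use a distribution built for Scala 2.12 (for example Spark 3.x, which is why spark-submit 3.0.2 works).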