I am trying to do a spark-submit to check compatibility with some simple Scala code:
println("Hi there")
val p = Some("pop")
p match {
  case Some(a) => println("Matched " + a)
  case _ => println("00000009")
}
Scala version: 2.12.5, Spark version: 2.4.6
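For reference, a minimal build.sbt for this setup would be something like the following (project name and version inferred from the jar name used below; there are no library dependencies at all, Spark included):

name := "compatibility-check"
version := "0.1"
scalaVersion := "2.12.5"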
Currently, after building the jar and running it through spark-submit 2.4.7, it gives:
Hi there
Exception in thread "main" java.lang.NoSuchMethodError: scala.Some.value()Ljava/lang/Object;
    at MangoPop$.main(MangoPop.scala:9)
    at MangoPop.main(MangoPop.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
log4j:WARN No appenders could be found for logger (org.apache.spark.util.ShutdownHookManager).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
From Maven, it seems Spark 2.4.6 supports Scala 2.12: https://mvnrepository.com/artifact/org.apache.spark/spark-core
But when running with spark-submit 3.0.2, it runs fine.
What am I missing with Spark 2.4.6? (I also tried Spark 2.4.7, even though there are no actual Spark dependencies or Spark code involved, only Scala.)
From the error it looks like a binary-compatibility issue: scala.Some.value only exists in the Scala 2.12 standard library (in 2.11 that field was called x), so a jar compiled against 2.12 fails at runtime against a 2.11 scala-library.
Running spark-submit as:
~/Downloads/spark-2.4.7-bin-hadoop2.7/bin$  ./spark-submit --class=Test myprojectLocation..../target/scala-2.12/compatibility-check_2.12-0.1.jar
/spark-2.4.7-bin-hadoop2.7/bin$ ./spark-submit --version
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.7
      /_/
                        
Using Scala version 2.11.12, OpenJDK 64-Bit Server VM, 1.8.0_282
Branch HEAD
Compiled by user prashant on 2020-09-08T05:22:44Z
Revision 14211a19f53bd0f413396582c8970e3e0a74281d
Url https://prashant:Sharma1988%[email protected]/repos/asf/spark.git
Type --help for more information.
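To double-check which Scala runtime a prebuilt distribution actually bundles, listing its jars directory should show the scala-library version (consistent with the banner above, I would expect scala-library-2.11.12.jar here):

~/Downloads/spark-2.4.7-bin-hadoop2.7$ ls jars | grep scala-library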
I also tried 2.4.6, downloading it from https://archive.apache.org/dist/spark/spark-2.4.6/, but could not find a build for Scala 2.12.
Can we also explicitly specify which Scala version to use when running spark-submit or spark-shell? The configuration suggests it supports both, but it picked the lower one, i.e. 2.11.
This is from the load-spark-env.cmd file:
rem Setting SPARK_SCALA_VERSION if not already set.
set ASSEMBLY_DIR2="%SPARK_HOME%\assembly\target\scala-2.11"
set ASSEMBLY_DIR1="%SPARK_HOME%\assembly\target\scala-2.12"
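Judging from that rem comment ("Setting SPARK_SCALA_VERSION if not already set"), it should be possible to pre-set the variable before launching. A sketch, which presumably only has an effect if the installation actually contains both assembly directories:

rem Explicitly select the Scala 2.12 assembly, if present in this installation
set SPARK_SCALA_VERSION=2.12
bin\spark-submit --class=Test target\scala-2.12\compatibility-check_2.12-0.1.jar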
    
Comments:
2.4.6 could be cross-compiled with Scala 2.12, but an installation has to pick one exact version, and it is very probable that your installation is using Scala 2.11. - Luis Miguel Mejía Suárez
... build.sbt file ... - Luis Miguel Mejía Suárez