sbt package runs just fine, but after spark-submit I get the error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.rddToPairRDDFunctions(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/math/Ordering;)Lorg/apache/spark/rdd/PairRDDFunctions;
    at SmokeStack$.main(SmokeStack.scala:46)
    at SmokeStack.main(SmokeStack.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Here is the offending line:
val sigCounts = rowData.map(row => (row("Signature"), 1)).countByKey()
rowData is an RDD of Map[String, String], and the "Signature" key exists in every map in the RDD.
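As far as I can tell, countByKey is not a method on RDD itself; it comes from PairRDDFunctions through the rddToPairRDDFunctions implicit that the error names. Spelled out by hand against the Spark 1.x API I compile with, I believe the call is roughly equivalent to this (pairs is just a name I'm using here):

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// The implicit conversion written out explicitly; in Spark 1.x it lives on the
// SparkContext companion object, which is exactly the method in the stack trace.
val pairs: RDD[(String, Int)] = rowData.map(row => (row("Signature"), 1))
val sigCounts = SparkContext.rddToPairRDDFunctions(pairs).countByKey()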
I suspect this may be a build issue. Below is my sbt file:
name := "Example1"
version := "0.1"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
scalacOptions ++= Seq("-feature")
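In case the dependency line matters: my understanding is that %% appends the Scala binary version taken from scalaVersion to the artifact name, so with scalaVersion 2.11.8 the line above should resolve to the same thing as:

// Same dependency with the Scala binary version written out explicitly
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.0"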
I'm new to Scala, so maybe the imports are not correct? I have:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import scala.io.Source
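For completeness, here is a stripped-down sketch of the program that still hits the same call; the input is a hard-coded stand-in for my real data, but the failing line is the same countByKey at SmokeStack.scala:46.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._

object SmokeStack {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("Example1"))

    // Stand-in for the real input: an RDD of Map[String, String]
    // where every map contains a "Signature" key.
    val rowData = sc.parallelize(Seq(
      Map("Signature" -> "A", "Colour" -> "red"),
      Map("Signature" -> "B", "Colour" -> "blue"),
      Map("Signature" -> "A", "Colour" -> "green")
    ))

    // The line that fails at runtime after spark-submit.
    val sigCounts = rowData.map(row => (row("Signature"), 1)).countByKey()
    sigCounts.foreach(println)

    sc.stop()
  }
}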