I've been teaching myself Scala, IntelliJ, and Spark. I downloaded 'org.apache.spark:spark-core_2.10:1.1.1' into my IntelliJ project, and I'm trying to apply some Spark training I've taken. To make things easy on myself, I'm testing ideas in either a Scala worksheet (preferred) or the Scala console.
Code snippet:
import org.apache.spark.SparkContext

// Local mode with 2 worker threads, app name "sc1"
val sc = new SparkContext("local[2]", "sc1")
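For reference, the same setup written with an explicit SparkConf (the master "local[2]" and the app name "sc1" are just placeholder values from my snippet) would look roughly like this:

import org.apache.spark.{SparkConf, SparkContext}

// Equivalent to the two-argument constructor above: run locally with
// 2 threads under the app name "sc1".
val conf = new SparkConf().setMaster("local[2]").setAppName("sc1")
val sc = new SparkContext(conf)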
When I try to create a SparkContext, I get an error:
In the Scala worksheet: com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
In the Scala console: java.lang.NoClassDefFoundError: Could not initialize class akka.actor.ActorCell$
I don't have any experience with Akka, and I wasn't planning to use it. Is it required for Spark? Am I using the wrong Scala library?
I read other questions with the same errors, but so far, the suggestions haven't helped.
I added the following dependency to my build.sbt file:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"
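Beyond that one line, my build.sbt is essentially the skeleton the project setup generated for me. A minimal sketch of it would look something like the following (the name, version, and scalaVersion values here are placeholders, not necessarily what my project actually contains):

name := "spark-sandbox"

version := "0.1"

// Placeholder; I haven't verified this matches the _2.10 Spark artifact mentioned above.
scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"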
I'll be happy to answer further questions. I'm not sure what else to add in order to make this question easier to answer.
Thanks,
David Webb