0 votes

I've been training myself on Scala, IntelliJ, and Spark. I downloaded 'org.apache.spark:spark-core_2.10:1.1.1' into my IntelliJ project, and I'm trying to apply some training that I've taken on Spark. I'm trying to make it easy on myself by using either Scala worksheets (preferred) or Scala console to test out ideas.


Code snippet:

import org.apache.spark._
val sc = new SparkContext("local[2]", "sc1")

When I try to create a SparkContext, I get an error:

In the Scala worksheet: com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'

In the Scala console: java.lang.NoClassDefFoundError: Could not initialize class akka.actor.ActorCell$
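
In case the constructor form matters, I believe the SparkConf-based construction below is equivalent to the one in my snippet (just a sketch, I haven't verified it behaves any differently):

import org.apache.spark.{SparkConf, SparkContext}

// Equivalent, as far as I know, to new SparkContext("local[2]", "sc1")
val conf = new SparkConf().setMaster("local[2]").setAppName("sc1")
val sc = new SparkContext(conf)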

I don't have any experience with Akka, and I didn't intend to use it. Is it required for Spark? Am I using the wrong Scala library?


I read other questions with the same errors, but so far, the suggestions haven't helped.

I added the following dependency to my build.sbt file:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"
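
(If I understand sbt correctly, the %% tells sbt to append the project's Scala binary version to the artifact name, so with a 2.10 project the line above should resolve to the same artifact as the explicit form below; that's my reading of it, not something I've confirmed.)

// What I believe the %% form expands to when scalaVersion is 2.10.x
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.4.0"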

I'll be happy to answer further questions. I'm not sure what else to add in order to make this question easier to answer.

Thanks,

David Webb

1
Have you read this? – Yuval Itzchakov

1 Answer

0 votes

I got past this error by updating my SBT dependencies to:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"

I also downloaded org.apache.spark:spark-core_2.11-1.6.0.jar. IntelliJ was previously finding only spark-core_2.10-1.1.1.jar when it searched Maven.
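
If it helps anyone else, my understanding is that the Scala version in build.sbt has to match the _2.11 suffix of the Spark artifact that %% resolves. A minimal build.sbt along these lines is my best guess at a consistent setup (the exact 2.11.x patch release is a placeholder):

// build.sbt (sketch) - scalaVersion must match the _2.11 suffix that %% appends
scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"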

I'm still unable to create a SparkContext, but I'm getting a different error now. I'll search on the new error and see what I can find before posting another question.
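
For reference, the minimal check I'm trying to get working is essentially the snippet from my question, wrapped in a plain Scala application instead of a worksheet (sketch only; the object name is mine):

import org.apache.spark.SparkContext

// Minimal check: bring up a local SparkContext, run a trivial job, and shut it down.
object SparkSmokeTest {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local[2]", "sc1")
    val sum = sc.parallelize(1 to 10).sum()
    println(s"sum = $sum")
    sc.stop()
  }
}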

David Webb