
version := "1.0"
scalaVersion := "2.11.8"
ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) }
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"

I am trying to add Spark to my development environment. When I try to assemble the jar with sbt, it fails and shows an [error] like the one below:

[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.11;2.1.0: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn]  Note: Unresolved dependencies path:
[warn]      org.apache.spark:spark-core_2.11:2.1.0  (D:\MyDocument\IDEA\Scala\model\build.sbt#L9-10)
[warn]        +- org.apache.spark:spark-catalyst_2.11:2.1.0
[warn]        +- org.apache.spark:spark-sql_2.11:2.1.0 (D:\MyDocument\IDEA\Scala\model\build.sbt#L15-16)
[warn]        +- org.apache.spark:spark-hive_2.11:2.1.0 (D:\MyDocument\IDEA\Scala\model\build.sbt#L11-12)
[warn]        +- default:producttagmodel_2.11:1.0
[trace] Stack trace suppressed: run 'last *:update' for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11;2.1.0: not found

My IntelliJ version is 2016.3.5, my sbt version is 0.13.13, and my Scala version is 2.11.8. I found that sbt has downloaded spark-core.jar successfully (it is in my .ivy/cache directory), but IntelliJ always shows "unknown artifact. Not resolved or indexed". I have refreshed my project index many times, but it didn't help. I also created a new project with the same build.sbt in case the IntelliJ cache was interfering, but that didn't work either. I am totally confused by this problem.
Here are my build.sbt settings:

[screenshot of build.sbt]


1 Answer


How about changing the dependency to:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"

In addition, for a Spark application this dependency is normally marked as "provided": it should not be included in the assembled jar, because when you submit your job the Spark libraries are already installed on the driver and the executors.
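
For reference, here is a minimal build.sbt sketch combining both suggestions (explicit Scala-suffixed artifact name and "provided" scope). It assumes the same sbt 0.13 / Scala 2.11.8 setup shown in the question:

// assumed project settings, matching the question
version := "1.0"
scalaVersion := "2.11.8"

// explicit artifact name with a single %, plus "provided" so the
// Spark jars are not packaged into the assembled jar
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0" % "provided"

// equivalent form: %% appends the Scala binary version (_2.11) automatically
// libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"

With "provided", the dependency is still on the compile classpath, but sbt-assembly leaves it out of the fat jar, which is what you want when running via spark-submit.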