0 votes

I created a simple SBT project in IntelliJ IDEA with the following library dependencies in build.sbt:

import _root_.sbt.Keys._

name := "untitled"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.5.1",
  "org.apache.spark" %% "spark-sql" % "1.5.1" ,
  "org.apache.spark" %% "spark-mllib"  % "1.5.1")

The objective is to import Spark and Spark's MLlib, and then to create a Scala object as explained here.
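For reference, the kind of object I mean looks roughly like this. This is only my own minimal sketch, not the code from the linked guide; the object name and the colStats call are placeholders just to exercise spark-core and spark-mllib:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.stat.Statistics

object MLlibTest {
  def main(args: Array[String]): Unit = {
    // Local SparkContext just to verify that spark-core resolves and starts.
    val conf = new SparkConf().setAppName("MLlibTest").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Tiny MLlib call: column statistics over two dense vectors.
    val data = sc.parallelize(Seq(Vectors.dense(1.0, 2.0), Vectors.dense(3.0, 4.0)))
    println(Statistics.colStats(data).mean)

    sc.stop()
  }
}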

However, the following error occurs on importing:

SBT project import
[warn] Multiple dependencies with the same organization/name but different versions. To avoid conflict, pick one version:
[warn]  * org.scala-lang:scala-compiler:(2.11.0, 2.11.7)
[warn]  * org.apache.commons:commons-lang3:(3.3.2, 3.0)
[warn]  * jline:jline:(0.9.94, 2.12.1)
[warn]  * org.scala-lang.modules:scala-parser-combinators_2.11:(1.0.1, 1.0.4)
[warn]  * org.scala-lang.modules:scala-xml_2.11:(1.0.1, 1.0.4)
[warn]  * org.slf4j:slf4j-api:(1.7.10, 1.7.2)
[warn] [FAILED ] net.sourceforge.f2j#arpack_combined_all;0.1!arpack_combined_all.jar(src): (0ms)
[warn] ==== local: tried
[warn]   C:\Users\Cezar\.ivy2\local\net.sourceforge.f2j\arpack_combined_all\0.1\srcs\arpack_combined_all-sources.jar
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/net/sourceforge/f2j/arpack_combined_all/0.1/arpack_combined_all-0.1-sources.jar
[warn] [FAILED ] javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(doc): (0ms)
[warn] ==== local: tried
[warn]   C:\Users\Cezar\.ivy2\local\javax.xml.bind\jsr173_api\1.0\docs\jsr173_api-javadoc.jar
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/javax/xml/bind/jsr173_api/1.0/jsr173_api-1.0-javadoc.jar
[warn] [FAILED ] javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(src): (0ms)
[warn] ==== local: tried
[warn]   C:\Users\Cezar\.ivy2\local\javax.xml.bind\jsr173_api\1.0\srcs\jsr173_api-sources.jar
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/javax/xml/bind/jsr173_api/1.0/jsr173_api-1.0-sources.jar
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] ::              FAILED DOWNLOADS            ::
[warn] :: ^ see resolution messages for details  ^ ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: net.sourceforge.f2j#arpack_combined_all;0.1!arpack_combined_all.jar(src)
[warn] :: javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(doc)
[warn] :: javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(src)
[warn] ::::::::::::::::::::::::::::::::::::::::::::::

Check which Scala version your Spark installation is using! – eliasah
@eliasah: How do I check it? I can see "Running Spark version 1.5.1" and I'm using Scala 2.11. – Klausos Klausos
Did you use a prebuilt version or did you build it yourself? – eliasah
@eliasah: I use the prebuilt version. – Klausos Klausos
Usually the prebuilt versions support Scala 2.10. You'll have to build it yourself. You can find the procedure in the official documentation. – eliasah

1 Answer

1 vote

The prebuilt Spark 1.5.1 binaries will not work with Scala 2.11: they are built against Scala 2.10, so you need to use a compatible Scala version (see http://spark.apache.org/docs/latest/).
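For example, switching the project to Scala 2.10 in build.sbt would look roughly like this (a sketch of the fix; 2.10.6 is just one 2.10.x patch release you could pick):

name := "untitled"

version := "1.0"

// Prebuilt Spark 1.5.1 artifacts are compiled against Scala 2.10,
// so the project's Scala version must be a 2.10.x release.
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.5.1",
  "org.apache.spark" %% "spark-sql"   % "1.5.1",
  "org.apache.spark" %% "spark-mllib" % "1.5.1")

The %% operator appends the Scala binary version, so these dependencies then resolve to spark-core_2.10, spark-sql_2.10, and spark-mllib_2.10.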

Alternatively, as @eliasah mentioned in the comments, you can build Spark yourself with Scala 2.11 support. Instructions on how to build Spark can be found at http://spark.apache.org/docs/latest/building-spark.html.