
I am trying to build a very basic Scala application with Spark dependencies, but I am not able to build a JAR out of it.

The error generated:

    sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;1.6.0-SNAPSHOT: not found

My build.sbt:

    import Dependencies._

    lazy val root = (project in file(".")).
      settings(
        inThisBuild(List(
          organization := "com.example",
          scalaVersion := "2.12.1",
          version      := "0.1.0-SNAPSHOT"
        )),
        name := "Hello",
        libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0-SNAPSHOT",
        resolvers += Resolver.mavenLocal
      )

My Hello.scala:

    package example

    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._
    import org.apache.spark.SparkConf

    object Hello {
      def main(args: Array[String]): Unit = {
        val logFile = "/Users/dhruvsha/Applications/spark/README.md"
        val conf = new SparkConf().setAppName("Simple Application")
        val sc = new SparkContext(conf)
        val logData = sc.textFile(logFile, 2).cache()
        val numAs = logData.filter(line => line.contains("a")).count()
        val numBs = logData.filter(line => line.contains("b")).count()
        println(s"Lines with a: $numAs, Lines with b: $numBs")
        sc.stop()
      }
    }

My Scala source is at:

    /exampleapp/main/scala/example/Hello.scala

The project name is exampleapp.

- Scala version: 2.12.2
- Spark version: 1.6.0
- sbt version: 0.13.13

Any help would be appreciated, and it would be great if you could point me to resources for learning about sbt and Spark dependencies.

Please note that I am new to Scala, Spark, and sbt.


1 Answer


The libraryDependencies line in build.sbt looks wrong.

With `%%`, sbt appends your Scala binary version to the artifact name, so with scalaVersion := "2.12.1" it tries to resolve spark-core_2.12;1.6.0-SNAPSHOT, which does not exist: Spark 1.6.0 was only published for Scala 2.10 and 2.11, and as a release rather than a SNAPSHOT. It should be:

    libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.0"

With the single `%` the Scala suffix is written out explicitly, so your scalaVersion must match it (a 2.10.x release here).
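For completeness, here is a minimal build.sbt sketch that should resolve cleanly, assuming you stay on Spark 1.6.0 and drop to Scala 2.11 (the newest Scala line that Spark 1.6.0 was published for):

    // Minimal sketch: Spark 1.6.0 with a matching Scala version.
    lazy val root = (project in file(".")).
      settings(
        organization := "com.example",
        name         := "Hello",
        version      := "0.1.0-SNAPSHOT",
        // Spark 1.6.0 artifacts exist only for Scala 2.10 and 2.11.
        scalaVersion := "2.11.8",
        // %% appends the Scala binary version, yielding spark-core_2.11 here.
        libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"
      )

With that in place, `sbt package` builds the JAR under target/scala-2.11/. Also note that sbt expects sources under src/main/scala by default, so Hello.scala would normally live at exampleapp/src/main/scala/example/Hello.scala.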