I'm trying to run a sample Scala program with spark-submit, building it with SBT. This is my Scala code -
import scala.math.random
import org.apache.spark._
/** Computes an approximation to pi */
object SparkPi {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Spark Pi")
    val spark = new SparkContext(conf)
    val slices = if (args.length > 0) args(0).toInt else 2
    val n = 100000 * slices
    val count = spark.parallelize(1 to n, slices).map { i =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x*x + y*y < 1) 1 else 0
    }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count / n)
    spark.stop()
  }
}
And this is my sparksample.sbt file -
name := "Spark Sample"
version := "1.0"
scalaVersion := "2.9.1"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
But when I run SBT with the package command, I'm getting the error below -
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.9.1;2.0.0: not found
My Scala version is 2.9.1 and my Spark version is 2.0.0.
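In case it helps to see where the spark-core_2.9.1 name in the error comes from: as far as I understand, the %% operator in sbt appends the project's Scala version to the artifact name, so my dependency line should expand roughly like the sketch below (the explicit _2.9.1 line is just my illustration of that expansion, not something from my build file).

// Rough sketch of what `%%` does in sbt, assuming scalaVersion := "2.9.1":
// it appends the Scala version to the artifact name, so
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
// should be equivalent to the explicit form
libraryDependencies += "org.apache.spark" % "spark-core_2.9.1" % "2.0.0"
// which matches the artifact name reported in the unresolved-dependency error.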
I'm following the site below for running spark-submit with SBT -