0
votes

I want to work with the Kafka integration for Spark streaming. I use Spark version 2.0.0.

But I get an unresolved dependency error ("unresolved dependency: org.apache.spark#spark-sql-kafka-0-10_2.11;2.0.0: not found").

How can I access this package? Or am I doing something wrong/missing?

My build.sbt file:

name := "Spark Streaming"
version := "0.1"
scalaVersion := "2.11.11"
val sparkVersion = "2.0.0"

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion,
    "org.apache.spark" %% "spark-sql" % sparkVersion,
    "org.apache.spark" %% "spark-streaming" % sparkVersion,
    "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion
)
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.0.0-preview"

Thank you for your help.

1
Did the accepted answer work for you? I am getting the same error again. – Manoj

1 Answer

2
votes

The artifact you want for DStream-based streaming is spark-streaming-kafka-0-10, not spark-sql-kafka-0-10 (the Structured Streaming Kafka artifact is not published for Spark 2.0.0): https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10_2.11

libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.0.0"
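For completeness, a sketch of the full build.sbt with this fix applied (assuming Scala 2.11 and the DStream-based connector; `%%` appends the `_2.11` suffix automatically, so it is left off the artifact name):

```scala
name := "Spark Streaming"
version := "0.1"
scalaVersion := "2.11.11"

val sparkVersion = "2.0.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  // spark-sql-kafka-0-10 (Structured Streaming) does not exist for 2.0.0;
  // spark-streaming-kafka-0-10 is the DStream-based Kafka connector that does
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion
)
```

Note that this also drops the extra `"spark-streaming_2.11" % "2.0.0-preview"` line from the question, which pulled in a second, conflicting Spark version.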