
I'm trying to create a simple Apache Spark application that connects to Cassandra using the DataStax Cassandra connector and performs a simple operation, but I'm getting this error:

Symbol 'type <none>.package.DataFrame' is missing from the classpath.

My build.sbt:

name := "spark-app"
version := "1.0"
scalaVersion := "2.11.11"


libraryDependencies ++= Seq(
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.0",
  "org.apache.spark" %% "spark-core" % "2.1.1" % "provided"
)

resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"

My simple application:

package com.budgetbakers.be.dwh.spark
import com.datastax.spark.connector._
import org.apache.spark.{SparkConf, SparkContext}

object Distinct {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "127.0.0.1")

    val sc = new SparkContext(conf)
    println(sc.cassandraTable("ks", "users").select("gender").distinct().collect().mkString(","))
    sc.stop()
  }
}

When I try to package the project, I get the following compilation error:

[error] /.../Distinct.scala:18: Symbol 'type <none>.package.DataFrame' is missing from the classpath.
[error] This symbol is required by 'value com.datastax.spark.connector.package.dataFrame'.
[error] Make sure that type DataFrame is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
[error] A full rebuild may help if 'package.class' was compiled against an incompatible version of <none>.package.
[error]     println(sc.cassandraTable("ks", "users").select("gender").distinct().collect().mkString(","))
[error]             ^

Am I missing something? Is there perhaps a dependency conflict?

Versions I'm using:

  • cassandra: 3.1
  • apache spark: 2.1.1
  • spark cassandra connector: 2.0.0
  • scala: 2.11
  • sbt: 0.13.15
  • sbt assembly plugin: 0.14.0

1 Answer


Try adding the spark-sql dependency as well as the core library; the compiler error shows the connector's package object referencing DataFrame, which lives in spark-sql. For future reference, there are example build files here.
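A minimal sketch of the updated build.sbt, mirroring the version numbers from the question and assuming the Spark artifacts resolve from the default repositories:

```scala
name := "spark-app"
version := "1.0"
scalaVersion := "2.11.11"

libraryDependencies ++= Seq(
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.0",
  "org.apache.spark" %% "spark-core" % "2.1.1" % "provided",
  // spark-sql provides org.apache.spark.sql.DataFrame, which the
  // connector's package object references at compile time
  "org.apache.spark" %% "spark-sql" % "2.1.1" % "provided"
)

resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"
```

Marking spark-sql as "provided" keeps it out of the assembly jar, since a Spark runtime already ships it; drop the qualifier if you run the application outside spark-submit.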