I am trying to query Cassandra from Spark using the DataStax Spark-Cassandra connector. The Spark code is:

  import org.apache.spark.{SparkConf, SparkContext}
  import com.datastax.spark.connector._ // provides sc.cassandraTable

  val conf = new SparkConf(true)
    .setMaster("local[4]")
    .setAppName("cassandra_query")
    .set("spark.cassandra.connection.host", "mycassandrahost")

  val sc = new SparkContext(conf)

  // limit(10) is pushed down as LIMIT 10 on the CQL query for each token range
  val rdd = sc.cassandraTable("mykeyspace", "mytable").limit(10)

  rdd.foreach(println)
  sc.stop()

So it's just running locally for now, and my build.sbt file looks like:

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.0",
  "org.apache.spark" %% "spark-sql" % "2.0.0",
  "cc.mallet" % "mallet" % "2.0.7",
  "com.amazonaws" % "aws-java-sdk" % "1.11.229",
  "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.0"
)

I create a fat jar using the assembly plugin, and when I submit the Spark job I get the following error:

 Lost task 6.0 in stage 0.0 (TID 6) on executor localhost: java.io.IOException (Exception during preparation of SELECT "pcid", "content" FROM "mykeyspace"."mytable" WHERE token("pcid") > ? AND token("pcid") <= ?  LIMIT 10 ALLOW FILTERING: class java.time.LocalDate in JavaMirror with org.apache.spark.util.MutableURLClassLoader@923288b of type class org.apache.spark.util.MutableURLClassLoader with classpath [file:/root/GenderPrediction-assembly-0.1.jar] and parent being sun.misc.Launcher$AppClassLoader@1e69dff6 of type class sun.misc.Launcher$AppClassLoader with classpath [file:/root/spark/conf/,file:/root/spark/jars/datanucleus-core-3.2.10.jar,...not found.

(Note: there were too many jars listed in the above classpath, so I replaced them with a "...".)

So it looks like it can't find java.time.LocalDate - how can I fix this?

I found another post that looks similar: spark job cassandra error. However, it is a different class that cannot be found, so I'm not sure if it helps.


2 Answers

1 vote

Can you please try this:

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.0.0",
      "org.apache.spark" %% "spark-sql" % "2.0.0",
      "cc.mallet" % "mallet" % "2.0.7",
      "com.amazonaws" % "aws-java-sdk" % "1.11.229",
      "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.0" exclude("joda-time", "joda-time"),
      "joda-time" % "joda-time" % "2.3"
    )
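If the exclusion doesn't seem to take effect, one way to verify which joda-time actually ended up on the classpath is to ask the JVM where it loaded the class from. A minimal sketch, assuming you run it in spark-shell or in the app itself against the assembled jar:

    // Hypothetical sanity check: prints the jar that org.joda.time.DateTime
    // was actually loaded from (getCodeSource can be null for bootstrap classes).
    val source = classOf[org.joda.time.DateTime]
      .getProtectionDomain
      .getCodeSource
      .getLocation
    println(s"joda-time loaded from: $source")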

2 votes

java.time.LocalDate is part of Java 8, and it seems you are running a Java version lower than 8.

spark-cassandra-connector 2.0 requires Java 8. See Spark Cassandra version compatibility.
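If in doubt, a quick way to confirm which runtime the job actually gets is to print the JVM version on the driver and on the executors. A minimal sketch, assuming sc is the SparkContext from the question (the RDD size and partition count are arbitrary):

    // java.time.LocalDate only exists from Java 8 onwards, so both the driver
    // and the executors must be running a Java 8 JVM.
    println(s"Driver JVM: ${System.getProperty("java.version")}")

    val executorVersions = sc.parallelize(1 to 100, 4)  // small throwaway RDD
      .map(_ => System.getProperty("java.version"))     // evaluated on executors
      .distinct()
      .collect()
    println(s"Executor JVMs: ${executorVersions.mkString(", ")}")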