0 votes

I'm new to Scala and Spark and could do with some help regarding the error below. Here is a snippet of my code that is causing issues:

case class Session (user_id: String, creation_date: BigInt, offline: Boolean)
case class User (user_id: String, app_id: Int, vendor_code: String, app_version: String)

val users = sc.cassandraTable[User]("leech_seed", "user").select("user_id", "app_id", "vendor_code", "app_version").where("last_active >=" + (timestamp - 86400000))
val sessions = sc.cassandraTable[Session]("leech_seed", "session").select("user_id", "creation_date", "offline").where("creation_date < " + timestamp + " AND creation_date >=" + (timestamp - 86400000))

When I use this code in the spark-shell it works fine, but when I try to build a jar with sbt I get the following error: could not find implicit value for evidence parameter of type com.datastax.spark.connector.rdd.reader.RowReaderFactory[User]

This has been doing my head in for longer than I'd like to admit, so any help/insight would be greatly appreciated.

Note: I am using the DataStax Cassandra connector for Spark.
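For anyone unsure what an "evidence parameter" is here: cassandraTable[T] has a context bound requiring an implicit RowReaderFactory[T], and the error means the compiler could not find one. A minimal runnable analogue (using a hypothetical Reader typeclass standing in for RowReaderFactory; none of these names are part of the connector's API) shows the mechanism:

```scala
// Hypothetical Reader typeclass standing in for the connector's RowReaderFactory
trait Reader[T] { def read(row: Map[String, String]): T }

case class User(user_id: String, app_id: Int)

object Reader {
  // Instance the compiler finds implicitly, analogous to the RowReaderFactory
  // the connector must resolve for cassandraTable[User]
  implicit val userReader: Reader[User] = new Reader[User] {
    def read(row: Map[String, String]): User =
      User(row("user_id"), row("app_id").toInt)
  }
}

// Like cassandraTable[T], this compiles only if an implicit Reader[T] is in
// scope; otherwise the compiler reports "could not find implicit value for
// evidence parameter of type Reader[T]"
def readRow[T](row: Map[String, String])(implicit ev: Reader[T]): T = ev.read(row)

println(readRow[User](Map("user_id" -> "abc", "app_id" -> "7")))
// → User(abc,7)
```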

Which spark-cassandra connector are you using? – Gillespie
I'm using the Scala 2.10 connector. – Matt Indeedhat Holmes
I should also have asked: which version of Cassandra? – Gillespie
That's this month's update - I'm asking as I have had similar errors from using certain Spark versions with different connector versions. At the moment I am using Spark 1.4.1 with 2.10-1.4.0-M3 with no problems. – Gillespie
Yes, that's a standard one - your case classes need to be declared outside of your main method. – Gillespie

1 Answer

1 vote

Check that your spark-cassandra connector version is compatible with the version of Spark you are using. I have encountered these issues when using connector versions older than 2.10-1.4.0-M3 with Spark 1.4.1.
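For the version-alignment point, a build.sbt sketch (assuming Spark 1.4.1 as above; the exact Spark version and Scala version are assumptions you should adjust to match your cluster):

```scala
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  // Spark itself is "provided" because the cluster supplies it at runtime
  "org.apache.spark" %% "spark-core" % "1.4.1" % "provided",
  // Connector version chosen to match Spark 1.4.x
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.4.0-M3"
)
```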

Also ensure that your case classes are defined outside of your main method - otherwise you will encounter No RowReaderFactory can be found for this type or similar.
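That fix can be sketched as the following job outline (a minimal sketch using the keyspace and table names from the question; it assumes a running Spark/Cassandra setup and is not runnable standalone):

```scala
import com.datastax.spark.connector._
import org.apache.spark.{SparkConf, SparkContext}

// Case classes at top level, NOT inside main: the connector resolves a
// RowReaderFactory for each class implicitly at compile time, and that
// resolution fails for classes defined locally within a method.
case class Session(user_id: String, creation_date: BigInt, offline: Boolean)
case class User(user_id: String, app_id: Int, vendor_code: String, app_version: String)

object LeechSeedJob {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("leech-seed"))

    // cassandraTable[User] now finds the implicit RowReaderFactory[User]
    val users = sc.cassandraTable[User]("leech_seed", "user")
      .select("user_id", "app_id", "vendor_code", "app_version")

    // ... rest of the job as in the question
  }
}
```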