
I tried to connect my Couchbase server to Spark 1.4.1 on EMR, but ran into the following error:

val airlines = sqlContext.read.couchbase(schemaFilter = org.apache.spark.sql.sources.EqualTo("type", "airline"))

<console>:24: error: value couchbase is not a member of org.apache.spark.sql.DataFrameReader

These are the commands that executed successfully before the failing one:

  1. spark-shell --packages com.couchbase.client:spark-connector_2.10:1.0.0
  2. import org.apache.spark.{SparkContext, SparkConf}
  3. val sc = new SparkContext(new SparkConf().setAppName("test").set("com.couchbase.bucket.travel-sample", ""))
  4. val cfg = new SparkConf().setAppName("keyValueExample").setMaster("local[*]").set("com.couchbase.bucket.travel-sample", "")
  5. import org.apache.spark.sql.SQLContext
  6. val sql = new SQLContext(sc)
  7. import com.couchbase.spark._

Do I need to configure anything else? Since I'm using AWS EMR, I assumed I don't have to modify a .sbt file. I thought the package was already imported, both by specifying it when launching spark-shell and with the import in command 7.


1 Answer


The documentation says you have to import both of the following:

scala> import com.couchbase.spark._
import com.couchbase.spark._

scala> import com.couchbase.spark.sql._
import com.couchbase.spark.sql._

Full doc is available here: http://developer.couchbase.com/documentation/server/current/connectors/spark-1.0/spark-shell.html
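For reference, here is a minimal sketch of what the shell session could look like with both imports in place. It assumes the same spark-shell launch command and travel-sample bucket configuration as in the question; the key difference is the extra com.couchbase.spark.sql._ import, which is what makes read.couchbase resolve.

// spark-shell --packages com.couchbase.client:spark-connector_2.10:1.0.0
// (assumes sc was created with "com.couchbase.bucket.travel-sample" set, as in the question)

import org.apache.spark.sql.SQLContext
import com.couchbase.spark._
import com.couchbase.spark.sql._   // adds the couchbase(...) method to DataFrameReader

val sql = new SQLContext(sc)

// With com.couchbase.spark.sql._ in scope, this no longer fails:
val airlines = sql.read.couchbase(
  schemaFilter = org.apache.spark.sql.sources.EqualTo("type", "airline"))

The import of com.couchbase.spark._ alone only brings in the key-value and RDD extensions; the DataFrame/SQL integration lives in the separate com.couchbase.spark.sql._ package, which is why the error says couchbase is not a member of DataFrameReader.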