0 votes

I am using Hadoop 2.7.2, HBase 1.4.9, Spark 2.2.0, Scala 2.11.8, and Java 1.8 on a Hadoop cluster composed of one master and two slaves.

When I run spark-shell after starting the cluster, it works fine. I am trying to connect to HBase from Scala by following this tutorial: [https://www.youtube.com/watch?v=gGwB0kCcdu0][1].

But when I try, as he does, to start spark-shell with those jars passed as an argument, I get this error:

spark-shell --jars "hbase-annotations-1.4.9.jar,hbase-common-1.4.9.jar,hbase-protocol-1.4.9.jar,htrace-core-3.1.0-incubating.jar,zookeeper-3.4.6.jar,hbase-client-1.4.9.jar,hbase-hadoop2-compat-1.4.9.jar,metrics-json-3.1.2.jar,hbase-server-1.4.9.jar"

<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^

After that, even if I log out and run spark-shell again, I still have the same issue. Can anyone please tell me what the cause is and how to fix it?


1 Answer

0 votes

In your import statement, spark should be an object of type SparkSession. That object is normally created for you when spark-shell starts, or you need to create it yourself (see the Spark docs). I didn't watch your tutorial video.
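If spark-shell failed to create it, a minimal sketch of creating one yourself (the appName string here is just a placeholder):

import org.apache.spark.sql.SparkSession

// getOrCreate() reuses an existing session if one is already running
val spark = SparkSession.builder()
  .appName("hbase-shell-test")   // placeholder name, pick your own
  .getOrCreate()

// now the usual shell imports work
import spark.implicits._
import spark.sql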

The point is that it doesn't have to be called spark. It could, for instance, be called sparkSession, and then you can do import sparkSession.implicits._
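For example, a minimal sketch (sparkSession is an arbitrary name here):

import org.apache.spark.sql.SparkSession

val sparkSession = SparkSession.builder().getOrCreate()
// the import is tied to whatever the val is named
import sparkSession.implicits._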