I am trying to read an HBase table using the Spark Scala API.
Sample Code:
conf.set("hbase.master", "localhost:60000")
conf.set("hbase.zookeeper.quorum", "localhost")
conf.set(TableInputFormat.INPUT_TABLE, tableName)
val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat], classOf[ImmutableBytesWritable], classOf[Result])
println("Number of Records found : " + hBaseRDD.count())
How do I add a where clause if I use newAPIHadoopRDD?
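For reference, this is roughly the approach I have been experimenting with: build a Scan with a SingleColumnValueFilter and serialize it into the configuration before calling newAPIHadoopRDD. The column family "cf", qualifier "col1" and the compared value are placeholders I made up, and I am not sure this is the recommended way (it assumes an HBase version where CompareFilter.CompareOp is still available):

import org.apache.hadoop.hbase.client.Scan
import org.apache.hadoop.hbase.filter.{CompareFilter, SingleColumnValueFilter}
import org.apache.hadoop.hbase.mapreduce.{TableInputFormat, TableMapReduceUtil}
import org.apache.hadoop.hbase.util.Bytes

// Keep only rows where cf:col1 equals "someValue" (placeholder values)
val filter = new SingleColumnValueFilter(
  Bytes.toBytes("cf"), Bytes.toBytes("col1"),
  CompareFilter.CompareOp.EQUAL, Bytes.toBytes("someValue"))
filter.setFilterIfMissing(true) // also drop rows that do not have cf:col1 at all

val scan = new Scan()
scan.setFilter(filter)

// Serialize the scan so TableInputFormat picks it up when the RDD is created
conf.set(TableInputFormat.SCAN, TableMapReduceUtil.convertScanToString(scan))
val filteredRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat], classOf[ImmutableBytesWritable], classOf[Result])

If that is basically correct, an answer confirming it (or showing a better way) would help.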
Or do I need to use a Spark HBase connector to achieve this?
I looked at the Spark HBase connector below, but I don't see any example code with a where clause.
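To make it concrete, this is the kind of example I was hoping to find. The load part below follows the Hortonworks shc connector's README as far as I understand it (the connector linked below may expose a different API), and the catalog, the table name "myTable" and the column "col1" are placeholders I made up:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog

val spark = SparkSession.builder().appName("HBaseRead").getOrCreate()

// Map the HBase table and its columns onto a DataFrame schema
val catalog =
  """{
    |  "table":{"namespace":"default", "name":"myTable"},
    |  "rowkey":"key",
    |  "columns":{
    |    "rowkey":{"cf":"rowkey", "col":"key", "type":"string"},
    |    "col1":{"cf":"cf", "col":"col1", "type":"string"}
    |  }
    |}""".stripMargin

val df = spark.read
  .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
  .format("org.apache.spark.sql.execution.datasources.hbase")
  .load()

// The "where clause" would then just be a DataFrame filter
df.filter(col("col1") === "someValue").show()

If the connector pushes that filter down to HBase (rather than filtering on the Spark side), that would be exactly what I need.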