I am unable to write to HBase using the example in the README. Here is a simple snippet that illustrates my approach and the error I encounter:
import org.apache.spark.sql.execution.datasources.hbase._
val input = Seq(
  ("a:1", "null", "null", "3", "4", "5", "6"),
  ("b:1", "2", "3", "null", "null", "5", "6")
)
val df = input.toDF
val TIMELINE_TABLE = "test_timeline"
val timelineCatalog =
s"""
"table":{"namespace":"default", "name":""".stripMargin+ TIMELINE_TABLE +"""", "tableCoder":"PrimitiveType"},
|"rowkey":"key",
|"columns":{
|"col0":{"cf":"rowkey", "col":"key", "type":"string"},
|"col1":{"cf":"main", "col":"kx", "type":"string"},
|"col2":{"cf":"main", "col":"ky", "type":"string"},
|"col3":{"cf":"main", "col":"rx", "type":"string"},
|"col4":{"cf":"main", "col":"ry", "type":"string"},
|"col5":{"cf":"main", "col":"wx", "type":"string"},
|"col6":{"cf":"main", "col":"wy", "type":"string"}
|}
|}""".stripMargin
val HBASE_CONF_FILE = "/etc/hbase/conf/hbase-site.xml"
df.write.options(Map(HBaseTableCatalog.tableCatalog -> timelineCatalog)).format("org.apache.spark.sql.execution.datasources.hbase").save()
java.lang.ClassCastException: org.json4s.JsonAST$JString cannot be cast to org.json4s.JsonAST$JObject
at org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog$.apply(HBaseTableCatalog.scala:257)
at org.apache.spark.sql.execution.datasources.hbase.HBaseRelation.<init>(HBaseRelation.scala:77)
at org.apache.spark.sql.execution.datasources.hbase.DefaultSource.createRelation(HBaseRelation.scala:59)
at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:518)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:215)
... 50 elided
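Looking at the trace, the cast inside HBaseTableCatalog.apply suggests the catalog string itself may be the problem rather than the data: as concatenated above, the string never gets an opening `{`, so json4s would parse the bare top-level `"table"` token as a JString, and the cast to JObject fails. A sketch of the catalog built with plain string interpolation instead of concatenation (same table and column names as above, keeping the opening brace) would be:

```scala
val TIMELINE_TABLE = "test_timeline"

// Interpolate the table name with s"..." instead of splicing with +,
// and keep the opening '{' so json4s sees a JSON object at the top level.
val timelineCatalog =
  s"""{
     |"table":{"namespace":"default", "name":"$TIMELINE_TABLE", "tableCoder":"PrimitiveType"},
     |"rowkey":"key",
     |"columns":{
     |"col0":{"cf":"rowkey", "col":"key", "type":"string"},
     |"col1":{"cf":"main", "col":"kx", "type":"string"},
     |"col2":{"cf":"main", "col":"ky", "type":"string"},
     |"col3":{"cf":"main", "col":"rx", "type":"string"},
     |"col4":{"cf":"main", "col":"ry", "type":"string"},
     |"col5":{"cf":"main", "col":"wx", "type":"string"},
     |"col6":{"cf":"main", "col":"wy", "type":"string"}
     |}
     |}""".stripMargin
```

With `stripMargin`, each `|` marks the start of a line, so the leading whitespace is removed and the result is one contiguous JSON object.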
My Scala version is 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_131), and I am on Spark 2.1.0.2.6.0.10-29 and HBase 1.1.2.
I am using the hortonworks-spark/shc connector, and I am unable to find anything wrong with my data; the above is actually a cleaned-up version of it. Ideally I would want the string "null" to be an actual null, but I couldn't get that to work, so I thought writing literal strings would at least get things working. Any help would be highly appreciated. I have also raised an issue on GitHub: https://github.com/hortonworks-spark/shc/issues/172
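On the side question of turning the literal "null" strings back into real nulls before writing, the replacement itself can be done column by column with `when`/`otherwise` (a sketch; whether shc then writes null cells successfully is a separate matter):

```scala
import org.apache.spark.sql.functions.{col, when}

// Replace the literal string "null" with a real SQL NULL in every column.
// Whether shc handles null cells on write is a separate question.
val withNulls = df.columns.foldLeft(df) { (acc, name) =>
  acc.withColumn(name, when(col(name) === "null", null).otherwise(col(name)))
}
```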