
I am using Hadoop 2.7.0, Hive 1.2.0, and HBase 1.0.1.1.

I have created a simple table in HBase:

hbase(main):021:0> create 'hbasetohive', 'colFamily'
0 row(s) in 0.2680 seconds

=> Hbase::Table - hbasetohive
hbase(main):022:0> put 'hbasetohive', '1s', 'colFamily:val','1strowval'
0 row(s) in 0.0280 seconds

hbase(main):023:0> scan 'hbasetohive'
ROW                                  COLUMN+CELL                                                                                               
 1s                                  column=colFamily:val, timestamp=1434644858733, value=1strowval                                            
1 row(s) in 0.0170 seconds

Now I have tried to access this HBase table through a Hive external table, but selecting from the external table gives the error below.

hive (default)> CREATE EXTERNAL TABLE hbase_hivetable_k(key string, value string)
              > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
              > WITH SERDEPROPERTIES ("hbase.columns.mapping" = "colFamily:val")
              > TBLPROPERTIES("hbase.table.name" = "hbasetohive");
OK
Time taken: 1.688 seconds
hive (default)> Select * from hbase_hivetable_k;
OK
hbase_hivetable_k.key	hbase_hivetable_k.value
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html#release for an explanation.
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setCaching(I)V
	at org.apache.hadoop.hive.hbase.HiveHBaseInputFormatUtil.getScan(HiveHBaseInputFormatUtil.java:123)
	at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getRecordReader(HiveHBaseTableInputFormat.java:99)
	at org.apache.hadoop.hive.ql.exec.FetchOperator$FetchInputFormatSplit.getRecordReader(FetchOperator.java:673)
	at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:323)
	at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:445)
	at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:414)
	at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:140)
	at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1667)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:601)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

It throws me out of the Hive prompt entirely.
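The NoSuchMethodError names the exact method the JVM failed to link: Scan.setCaching(I)V, i.e. a setCaching that takes an int and returns void. Hive 1.2.0 was compiled against that signature (HBase 0.98.x); in HBase 1.x the Scan setters were made fluent and setCaching returns Scan instead of void, so the (I)V descriptor no longer exists in the jar and linking fails at runtime. A tiny decoder for primitive-only JVM descriptors, just to make the notation readable (illustrative only, not part of Hive or HBase):

```python
# Minimal decoder for JVM method descriptors containing only primitive types.
# Object return types such as Lorg/apache/hadoop/hbase/client/Scan; (what
# HBase 1.x's setCaching actually returns) are out of scope for this sketch.
PRIMITIVES = {"I": "int", "V": "void", "Z": "boolean",
              "J": "long", "F": "float", "D": "double"}

def decode(descriptor):
    """Turn a descriptor like '(I)V' into a readable signature."""
    params, ret = descriptor[1:].split(")")
    args = ", ".join(PRIMITIVES[c] for c in params)
    return f"({args}) -> {PRIMITIVES[ret]}"

print(decode("(I)V"))  # (int) -> void: the signature Hive 1.2.0 expects
```

Because the runtime jar only offers a setCaching with a different return type, the lookup for the void-returning variant fails the moment the fetch operator builds its Scan.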

Can someone please tell me what the issue is here?

I am also using the below .hiverc from the hive/conf directory:

SET hive.cli.print.header=true;
set hive.cli.print.current.db=true;
set hive.auto.convert.join=true;
SET hbase.scan.cacheblock=0;
SET hbase.scan.cache=10000;
SET hbase.client.scanner.cache=10000;
add JAR /usr/lib/hive/auxlib/zookeeper-3.4.6.jar;
add JAR /usr/lib/hive/auxlib/hive-hbase-handler-1.2.0.jar;
add JAR /usr/lib/hive/auxlib/guava-14.0.1.jar;
add JAR /usr/lib/hive/auxlib/hbase-common-1.0.1.1.jar;
add JAR /usr/lib/hive/auxlib/hbase-client-1.0.1.1.jar;
add JAR /usr/lib/hive/auxlib/hbase-hadoop2-compat-1.0.1.1.jar;
add JAR /usr/lib/hive/auxlib/hbase-hadoop-compat-1.0.1.1.jar;
add JAR /usr/lib/hive/auxlib/commons-configuration-1.6.jar;
add JAR /usr/lib/hive/auxlib/hadoop-common-2.7.0.jar;
add JAR /usr/lib/hive/auxlib/hbase-annotations-1.0.1.1.jar;
add JAR /usr/lib/hive/auxlib/hbase-it-1.0.1.1.jar;
add JAR /usr/lib/hive/auxlib/hbase-prefix-tree-1.0.1.1.jar;
add JAR /usr/lib/hive/auxlib/hbase-protocol-1.0.1.1.jar;
add JAR /usr/lib/hive/auxlib/hbase-rest-1.0.1.1.jar;
add JAR /usr/lib/hive/auxlib/hbase-server-1.0.1.1.jar;
add JAR /usr/lib/hive/auxlib/hbase-shell-1.0.1.1.jar;
add JAR /usr/lib/hive/auxlib/hbase-thrift-1.0.1.1.jar;
add JAR /usr/lib/hive/auxlib/high-scale-lib-1.1.1.jar;
add JAR /usr/lib/hive/auxlib/hive-serde-1.2.0.jar;
add JAR /usr/lib/hbase/lib/commons-beanutils-1.7.0.jar;
add JAR /usr/lib/hbase/lib/commons-beanutils-core-1.8.0.jar;
add JAR /usr/lib/hbase/lib/commons-cli-1.2.jar;
add JAR /usr/lib/hbase/lib/commons-codec-1.9.jar;
add JAR /usr/lib/hbase/lib/commons-collections-3.2.1.jar;
add JAR /usr/lib/hbase/lib/commons-compress-1.4.1.jar;
add JAR /usr/lib/hbase/lib/commons-digester-1.8.jar;
add JAR /usr/lib/hbase/lib/commons-el-1.0.jar;
add JAR /usr/lib/hbase/lib/commons-io-2.4.jar;
add JAR /usr/lib/hbase/lib/htrace-core-3.1.0-incubating.jar;
add JAR /usr/local/src/spark/lib/spark-assembly-1.3.1-hadoop2.6.0.jar;
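The .hiverc above puts HBase 1.0.1.1 jars on the classpath next to a hive-hbase-handler built for Hive 1.2.0. A quick sketch (jar names copied from the list above; the regex and helper name are mine, purely illustrative) to flag HBase jars whose major version is 1 or higher on a Hive 1.x classpath:

```python
import re

# A few jar names from the .hiverc above (illustrative subset).
jars = [
    "hive-hbase-handler-1.2.0.jar",
    "hbase-common-1.0.1.1.jar",
    "hbase-client-1.0.1.1.jar",
    "hbase-server-1.0.1.1.jar",
]

def hbase_major(name):
    """Return the major version of an hbase-* jar, or None for other jars."""
    m = re.match(r"hbase-[a-z0-9-]+-(\d+)\.", name)
    return int(m.group(1)) if m else None

for jar in jars:
    major = hbase_major(jar)
    if major is not None and major >= 1:
        print(f"{jar}: HBase {major}.x jar on a Hive 1.x classpath")
```

Every hbase-* jar in the list is 1.0.1.1, which is exactly the mismatch the answer below describes.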
This error started appearing after migrating HBase from 1.0.1 to 1.0.1.1; it occurred with my prior version of Hive as well, i.e. 1.1.0.

Koushik Chandra

1 Answer


I was having the same issue. It is actually because Hive 1.2.0 is not compatible with HBase 1.x.

As mentioned in HBaseIntegration:

Version information: As of Hive 0.9.0, the HBase integration requires at least HBase 0.92; earlier versions of Hive worked with HBase 0.89/0.90.

Version information Hive 1.x will remain compatible with HBase 0.98.x and lower versions. Hive 2.x will be compatible with HBase 1.x and higher. (See HIVE-10990 for details.) Consumers wanting to work with HBase 1.x using Hive 1.x will need to compile Hive 1.x stream code themselves.
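The quoted note boils down to a simple rule of thumb. A hypothetical helper (the function name is mine, not from Hive) encoding it:

```python
def hive_hbase_compatible(hive_major, hbase_major):
    """Rule of thumb from the HBaseIntegration note quoted above:
    Hive 1.x pairs with HBase 0.98.x and lower; Hive 2.x targets
    HBase 1.x and higher (HIVE-10990)."""
    if hive_major <= 1:
        return hbase_major < 1
    return hbase_major >= 1

# The combination from the question: Hive 1.2 with HBase 1.0.1.1
print(hive_hbase_compatible(1, 1))  # False: unsupported without a rebuilt handler
```

The question's stack (Hive 1.2.0 against HBase 1.0.1.1) falls squarely on the unsupported side, which is why the earlier HBase 1.0.1 setup with Hive 1.1.0 failed the same way.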

So, to make Hive 1.x work with HBase 1.x, you have to download the source code of the Hive 2.0 branch from the Hive repository on GitHub and build it. After building, replace the hive-hbase-handler jar file with the newer version, and it will work.