
I am new to the Hadoop, HBase and Hive environment. I have configured the Hadoop and HBase nodes in fully distributed mode and inserted some sample data into a Hive table.

I have created a Hive table on top of an HBase table so I can access the HBase table data through Hive and read data from the Hive table (and vice versa), like this:

    create external table testing
    (key string
    , name string
    , age int
    , year int
    , salary int)
    stored by 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    with serdeproperties
    ('hbase.columns.mapping' = ':key,cf1:name,cf1:id,cf2:year,cf2:salary')
    tblproperties ('hbase.table.name' = 'hbasetablename');

In HBase I created the table with two column families:

    create 'hbasetablename', 'cf1', 'cf2'

Everything works fine, but when I overwrite data from Hive into the HBase table I get an error, and I don't know why.

I am trying to overwrite the data like this:

    hive> insert overwrite table sample select concat(name, ':', age) as key, name, age, year, salary from empdetail;

While overwriting the table, I get the following error:

    Query ID = master_20150411092222_f7cf8d94-8449-48f3-8aff-06ef8213f776
    Total jobs = 1
    Launching Job 1 out of 1
    Number of reduce tasks is set to 0 since there's no reduce operator
    Starting Job = job_1428713373268_0001, Tracking URL = http://master:8088/proxy/application_1428713373268_0001/
    Kill Command = /home/master/hadoop/bin/hadoop job -kill job_1428713373268_0001
    Hadoop job information for Stage-0: number of mappers: 1; number of reducers: 0
    2015-04-11 09:23:40,198 Stage-0 map = 0%, reduce = 0%
    2015-04-11 09:24:07,628 Stage-0 map = 100%, reduce = 0%, Cumulative CPU 3.25 sec
    2015-04-11 09:24:08,674 Stage-0 map = 0%, reduce = 0%
    2015-04-11 09:24:41,742 Stage-0 map = 100%, reduce = 0%
    MapReduce Total cumulative CPU time: 3 seconds 250 msec
    Ended Job = job_1428713373268_0001 with errors
    Error during job, obtaining debugging information...
    Job Tracking URL: http://master:8088/proxy/application_1428713373268_0001/
    Examining task ID: task_1428713373268_0001_m_000000 (and more) from job job_1428713373268_0001
    Task with the most failures(4):
    Task ID: task_1428713373268_0001_m_000000
    URL: http://master:8088/taskdetails.jsp?jobid=job_1428713373268_0001&tipid=task_1428713373268_0001_m_000000
    Diagnostic Messages for this Task:
    Error: java.lang.RuntimeException: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Put.setDurability(Lorg/apache/hadoop/hbase/client/Durability;)V
        at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:179)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
    Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Put.setDurability(Lorg/apache/hadoop/hbase/client/Durability;)V
        at org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat$MyRecordWriter.write(HiveHBaseTableOutputFormat.java:142)
        at org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat$MyRecordWriter.write(HiveHBaseTableOutputFormat.java:117)
        at org.apache.hadoop.hive.ql.io.HivePassThroughRecordWriter.write(HivePassThroughRecordWriter.java:40)
        at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:714)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
        at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
        at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:95)
        at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:157)
        at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:493)
        at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:170)
        ... 8 more
    FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
    MapReduce Jobs Launched:
    Stage-Stage-0: Map: 1   Cumulative CPU: 3.25 sec   HDFS Read: 0 HDFS Write: 0 FAIL
    Total MapReduce CPU Time Spent: 3 seconds 250 msec

I don't know how to resolve this error. In Hive I added some jars to the classpath like this:

    add jar ${env:HIVE_HOME}/lib/hbase-hadoop2-compat-1.0.0.jar;
    add jar ${env:HIVE_HOME}/lib/hbase-client-1.0.0.jar;
    add jar ${env:HIVE_HOME}/lib/zookeeper-3.4.6.jar;
    add jar ${env:HIVE_HOME}/lib/hive-hbase-handler-1.1.0.jar;
    add jar ${env:HIVE_HOME}/lib/guava-14.0.1.jar;
    add jar ${env:HIVE_HOME}/lib/guava-11.0.2.jar;
    add jar ${env:HIVE_HOME}/lib/hbase-common-1.0.0.jar;
    add jar ${env:HIVE_HOME}/lib/protobuf-java-2.5.0.jar;
    add jar ${env:HIVE_HOME}/lib/hbase-protocol-1.0.0.jar;
    add jar ${env:HIVE_HOME}/lib/hbase-server-1.0.0.jar;
    add jar ${env:HIVE_HOME}/lib/htrace-core-3.1.0-incubating.jar;
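As a side note, the list above adds two different Guava versions (14.0.1 and 11.0.2), and mixing library versions on one classpath is a common cause of NoSuchMethodError. A small sketch (the jar paths here are just examples) that reports library names appearing more than once among a set of jars:

```shell
# Sketch: print library names that occur more than once among jar paths,
# e.g. two different guava versions added to the same session classpath.
find_dupes() {
  for j in "$@"; do
    # strip the directory and ".jar", then drop the trailing "-<version>" part
    basename "$j" .jar | sed 's/-[0-9][0-9.]*.*$//'
  done | sort | uniq -d
}

find_dupes /hive/lib/guava-14.0.1.jar /hive/lib/guava-11.0.2.jar \
           /hive/lib/hbase-client-1.0.0.jar
# prints: guava
```

Running it over everything on Hive's effective classpath (session jars plus hive.aux.jars.path) helps rule out duplicate libraries before chasing deeper version problems.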

After that I also set these jars in the Hive aux jars path (hive.aux.jars.path) in the hive-site.xml file, but it is not working.

I also enabled dynamic partitioning in the Hive shell:

    set hive.exec.dynamic.partition=true;
    set hive.exec.dynamic.partition.mode=nonstrict;

But it's still not working. Apologies for any formatting or spelling mistakes. I don't know where the mistake is. Please help me.

Thanks in advance.

Please, anyone, help me. Formatting suggestions are not important right now. - Dharmaraja.k
Do you want SQL querying on HBase data? If yes, then I would recommend Phoenix (phoenix.apache.org). It works far better with HBase. - Anil Gupta
Hi Anil Gupta, thanks for the comment. Right now I have configured Hive and HBase. Is there any way to fix this error in Hive? I will check out Phoenix, but I need to work with Hive only. Can you please help me resolve the Hive bug? Thanks in advance. - Dharmaraja.k

1 Answer


Unfortunately, I believe that you will have to upgrade your Hadoop version so that the stack matches what Hive was built against.

It appears that Hive is trying to call the setDurability method of the org.apache.hadoop.hbase.client.Put class, and the HBase client classes actually loaded at runtime do not have that method. A java.lang.NoSuchMethodError thrown at runtime (rather than a compile error) almost always means mismatched library versions on the classpath: Hive's HBase handler was compiled against a newer HBase client API than the one it found.
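One way to confirm this kind of mismatch is to ask javap whether the class Hive actually loads declares the method from the stack trace. A minimal sketch (the jar path in the commented example is an assumption; adjust it to your installation):

```shell
# Sketch: check whether a class reachable from a given classpath declares
# a method with the given name. Exit status 0 means the method was found.
has_method() {  # usage: has_method <classpath> <fully.qualified.Class> <method>
  javap -classpath "$1" "$2" 2>/dev/null | grep -q "$3"
}

# Against the hbase-client jar Hive loads (path is an example):
# has_method "$HIVE_HOME/lib/hbase-client-1.0.0.jar" \
#            org.apache.hadoop.hbase.client.Put setDurability \
#   && echo "setDurability present" \
#   || echo "missing -> an older HBase client is on the classpath"
```

If the check reports the method as missing, some older HBase jar is shadowing the one you added, and aligning the HBase/Hive/Hadoop versions (or removing the stale jar) should clear the NoSuchMethodError.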