
I have successfully exported a table in HBase 0.98 using the following command (via the MapReduce API):

./hbase -Dhbase.export.version=0.98 org.apache.hadoop.hbase.mapreduce.Export testtable file:///home/test/test/hbase-0.98.4-hadoop2/bin/2

It created a folder named '2' with the MapReduce output: the part file and the _SUCCESS marker were generated successfully, and when I open the part-m-00000 file it has the data of testtable.
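
Listing the output directory from the command above shows the expected files (one part file per map task plus the success marker; this run produced a single map output):

ls /home/test/test/hbase-0.98.4-hadoop2/bin/2
# _SUCCESS  part-m-00000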

But when I try to import using the Import command,

./hbase -Dhbase.import.version=0.98 org.apache.hadoop.hbase.mapreduce.Import testtable file:///home/test/test/hbase-0.98.4-hadoop2/bin/2

it runs successfully.

Output:

2015-04-10 09:50:18,429 INFO [LocalJobRunner Map Task Executor #0] mapreduce.TableOutputFormat: Created table instance for testtable
2015-04-10 09:50:18,446 INFO [LocalJobRunner Map Task Executor #0] mapred.Task: Using ResourceCalculatorProcessTree : [ ]
2015-04-10 09:50:18,453 INFO [LocalJobRunner Map Task Executor #0] mapred.MapTask: Processing split: file:/home/test/test/hbase-0.98.4-hadoop2/bin/2/part-m-00000:0+907
2015-04-10 09:50:18,504 DEBUG [LocalJobRunner Map Task Executor #0] mapreduce.Import: No configured filter class, accepting all keyvalues.
2015-04-10 09:50:18,507 INFO [LocalJobRunner Map Task Executor #0] zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=1200000 watcher=attempt_local1700239962_0001_m_000000_0, quorum=localhost:2181, baseZNode=/hbase
2015-04-10 09:50:18,509 INFO [LocalJobRunner Map Task Executor #0] zookeeper.RecoverableZooKeeper: Process identifier=attempt_local1700239962_0001_m_000000_0 connecting to ZooKeeper ensemble=localhost:2181
2015-04-10 09:50:18,509 INFO [LocalJobRunner Map Task Executor #0-SendThread(localhost:2181)] zookeeper.ClientCnxn: Opening socket connection to server localhost/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (unknown error)
2015-04-10 09:50:18,511 INFO [LocalJobRunner Map Task Executor #0-SendThread(localhost:2181)] zookeeper.ClientCnxn: Socket connection established to localhost/0:0:0:0:0:0:0:1:2181, initiating session
2015-04-10 09:50:18,520 INFO [LocalJobRunner Map Task Executor #0-SendThread(localhost:2181)] zookeeper.ClientCnxn: Session establishment complete on server localhost/0:0:0:0:0:0:0:1:2181, sessionid = 0x14c73d562cc0043, negotiated timeout = 1200000
2015-04-10 09:50:18,532 INFO [LocalJobRunner Map Task Executor #0] zookeeper.ZooKeeper: Session: 0x14c73d562cc0043 closed
2015-04-10 09:50:18,532 INFO [LocalJobRunner Map Task Executor #0-EventThread] zookeeper.ClientCnxn: EventThread shut down
2015-04-10 09:50:18,568 INFO [LocalJobRunner Map Task Executor #0] mapred.LocalJobRunner:
2015-04-10 09:50:19,080 INFO [LocalJobRunner Map Task Executor #0] mapred.Task: Task:attempt_local1700239962_0001_m_000000_0 is done. And is in the process of committing
2015-04-10 09:50:19,096 INFO [LocalJobRunner Map Task Executor #0] mapred.LocalJobRunner: map
2015-04-10 09:50:19,096 INFO [LocalJobRunner Map Task Executor #0] mapred.Task: Task 'attempt_local1700239962_0001_m_000000_0' done.
2015-04-10 09:50:19,096 INFO [LocalJobRunner Map Task Executor #0] mapred.LocalJobRunner: Finishing task: attempt_local1700239962_0001_m_000000_0
2015-04-10 09:50:19,097 INFO [Thread-32] mapred.LocalJobRunner: Map task executor complete.
2015-04-10 09:50:19,170 INFO [main] mapreduce.Job: Job job_local1700239962_0001 running in uber mode : false
2015-04-10 09:50:19,172 INFO [main] mapreduce.Job: map 100% reduce 0%
2015-04-10 09:50:19,174 INFO [main] mapreduce.Job: Job job_local1700239962_0001 completed successfully
2015-04-10 09:50:19,195 INFO [main] mapreduce.Job: Counters: 23
    File System Counters
        FILE: Number of bytes read=20250927
        FILE: Number of bytes written=20680389
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=0
        HDFS: Number of bytes written=0
        HDFS: Number of read operations=0
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=0
    Map-Reduce Framework
        Map input records=2
        Map output records=2
        Input split bytes=126
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=0
        CPU time spent (ms)=0
        Physical memory (bytes) snapshot=0
        Virtual memory (bytes) snapshot=0
        Total committed heap usage (bytes)=252706816
    File Input Format Counters
        Bytes Read=923
    File Output Format Counters
        Bytes Written=0
2015-04-10 09:50:19,195 INFO [main] mapreduce.Job: Running job: job_local1700239962_0001
2015-04-10 09:50:19,195 INFO [main] mapreduce.Job: Job job_local1700239962_0001 running in uber mode : false
2015-04-10 09:50:19,196 INFO [main] mapreduce.Job: map 100% reduce 0%
2015-04-10 09:50:19,196 INFO [main] mapreduce.Job: Job job_local1700239962_0001 completed successfully
2015-04-10 09:50:19,200 INFO [main] mapreduce.Job: Counters: 23
    File System Counters
        FILE: Number of bytes read=20250927
        FILE: Number of bytes written=20680389
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=0
        HDFS: Number of bytes written=0
        HDFS: Number of read operations=0
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=0
    Map-Reduce Framework
        Map input records=2
        Map output records=2
        Input split bytes=126
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=0
        CPU time spent (ms)=0
        Physical memory (bytes) snapshot=0
        Virtual memory (bytes) snapshot=0
        Total committed heap usage (bytes)=252706816
    File Input Format Counters
        Bytes Read=923
    File Output Format Counters
        Bytes Written=0

But the testtable does not have any entries in it!
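
This is what the empty table looks like when checked from the HBase shell (timings illustrative):

scan 'testtable'
# ROW                 COLUMN+CELL
# 0 row(s) in 0.0500 seconds

count 'testtable'
# 0 row(s) in 0.0200 seconds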

What mistake am I making here? Is there any other way of doing an export and import?

Regards, Harry

Is this your full output log? By the way, your export and import commands export and import the data fine for me. – Rajesh N
Export is working, whereas import is not working, Rajesh. – Harry
Is there any other way to export and import the HBase table? Do you have the HBase code for export and import? – Harry
I did the same thing, and it is not working... I attached the logs too... Is any other way available for export and import? – Harry

1 Answer


First create a new HBase table, with the same column families as the table you exported, and then run the Import utility. Import writes into an existing table; it does not create the table or its schema for you.

Credit goes to Rajesh N's suggestion!
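
A minimal end-to-end sketch of that suggestion, assuming the exported table had a single column family named 'cf1' (substitute your actual family names):

# in the HBase shell: create the target table first
create 'testtable', 'cf1'

# from the command line: rerun the Import command from the question
./hbase -Dhbase.import.version=0.98 org.apache.hadoop.hbase.mapreduce.Import testtable file:///home/test/test/hbase-0.98.4-hadoop2/bin/2

# back in the HBase shell: confirm the rows arrived
count 'testtable'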