
I'm running HBase 1.0.1 standalone on an Apache Hadoop 2.7 cluster. I'm getting the exception below while running a simple MapReduce job against HBase.

Exception in thread "main" java.io.FileNotFoundException: File does not exist: hdfs://hdmaster:9000/usr/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar
        at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1309)
        at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1301)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
        ...

My Hadoop and HBase setups seem fine, since I can run the wordcount example and also Put/Get against HBase tables without any issues.

I don't seem to be missing any settings for running MapReduce on HBase. I have also tried:

export HADOOP_CLASSPATH="$HBASE_HOME/lib/hadoop-client-2.5.1.jar:$HBASE_HOME/lib/hbase-common-1.0.1.jar:$HBASE_HOME/lib/protobuf-java-2.5.0.jar:$HBASE_HOME/lib/guava-12.0.1.jar:$HBASE_HOME/lib/zookeeper-3.4.6.jar:$HBASE_HOME/lib/hbase-protocol-1.0.1.jar"


1 Answer


Resolved it; posting in case anyone else runs into the same issue.

The version of the Hadoop jars bundled in $HBASE_HOME/lib did not match the jars of the Hadoop installation, so I replaced them with the matching versions and restarted HDFS, YARN, and HBase.
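The swap itself can be sketched as below. This is a demo in a temporary sandbox (the paths and the 2.5.1/2.7.0 version numbers are illustrative, not from my actual cluster): it simulates an HBase lib directory bundling an older hadoop-common jar than the installed Hadoop, then replaces the stale bundled jars with the installed ones.

```shell
# Sandbox to stand in for the real installs -- on a real cluster,
# point HBASE_HOME and HADOOP_HOME at your actual directories instead.
sandbox=$(mktemp -d)
HBASE_HOME="$sandbox/hbase"
HADOOP_HOME="$sandbox/hadoop"
mkdir -p "$HBASE_HOME/lib" "$HADOOP_HOME/share/hadoop/common"

# Simulated mismatch: HBase bundles 2.5.1, the installed Hadoop is 2.7.0.
touch "$HBASE_HOME/lib/hadoop-common-2.5.1.jar"
touch "$HADOOP_HOME/share/hadoop/common/hadoop-common-2.7.0.jar"

# The actual fix: drop the stale bundled Hadoop jars, then copy in the
# jars from the Hadoop installation tree.
rm -f "$HBASE_HOME"/lib/hadoop-*.jar
find "$HADOOP_HOME/share/hadoop" -name 'hadoop-*.jar' \
    -exec cp {} "$HBASE_HOME/lib/" \;

ls "$HBASE_HOME/lib"
```

On a real install you would back up the removed jars first, and the `find` would pick up jars from all of the common/hdfs/mapreduce/yarn subtrees.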

Then I ran the MR job with:

$ HADOOP_CLASSPATH=$(hbase classpath) hadoop jar MyJob.jar MyJobMainClass
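Note the `VAR=value command` form: it sets `HADOOP_CLASSPATH` only for that single `hadoop jar` invocation, so `hbase classpath` (which prints HBase's full dependency classpath) is injected without polluting the shell environment. A minimal demo of that mechanism, with no Hadoop required (`/tmp/demo.jar` is just a placeholder value):

```shell
# The child process sees the variable...
HADOOP_CLASSPATH="/tmp/demo.jar" sh -c 'echo "classpath=$HADOOP_CLASSPATH"'

# ...but the current shell never had it set.
echo "after: ${HADOOP_CLASSPATH:-unset}"
```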