0 votes

My Scala code, which connects to an HBase database, works perfectly when I run it in my local IDE. But when I run the same code on the Hadoop cluster, I get the following error: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration.

Please help me with this.
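For context, the failing code is along these lines (a minimal sketch assuming the HBase 1.x client API that ships with HDP 2.6.4; the table name and row key are placeholders, not from the actual project):

import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Get}
import org.apache.hadoop.hbase.util.Bytes

object processing {
  def main(args: Array[String]): Unit = {
    // HBaseConfiguration.create() is the first HBase class the JVM loads,
    // so missing HBase jars on the cluster fail right here with
    // NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
    val conf = HBaseConfiguration.create()
    val connection = ConnectionFactory.createConnection(conf)
    try {
      val table = connection.getTable(TableName.valueOf("my_table")) // placeholder table
      val result = table.get(new Get(Bytes.toBytes("row1")))         // placeholder row key
      println(Bytes.toString(result.value()))
    } finally {
      connection.close()
    }
  }
}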

I'm only getting the issue when I run the jar file with spark-submit. - sangeeth sasidharan
When I explicitly mention the required jars, it works: - sangeeth sasidharan
spark-submit --class "processing" --master "local[*]" --jars /usr/hdp/2.6.4.0-91/hbase/lib/guava-12.0.1.jar,/usr/hdp/2.6.4.0-91/hbase/lib/hbase-protocol-1.1.2.2.6.4.0-91.jar,/usr/hdp/2.6.4.0-91/hbase/lib/hbase-client.jar,/usr/hdp/2.6.4.0-91/hbase/lib/hbase-common.jar "/home/uadmin/Sangeeth/sqlscala_2.10-1.0.jar" - sangeeth sasidharan
But why doesn't it work in the normal case, without --jars? - sangeeth sasidharan

1 Answer

1 vote

Add all the HBase library jars to HADOOP_CLASSPATH:

export HBASE_HOME="YOUR_HBASE_HOME_PATH"
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$HBASE_HOME/lib/*"

You can append any external jars your application needs to HADOOP_CLASSPATH, so you don't have to pass them explicitly with --jars on the spark-submit command; all dependent jars will be loaded and provided to your Spark application. That is also why it fails "in the normal case": the IDE puts the HBase jars on the runtime classpath for you, but your application jar doesn't bundle them, so on the cluster the JVM can't find HBaseConfiguration at runtime.
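For example, with the HDP 2.6.4 paths from the comments above, the submit command no longer needs the --jars list (a sketch, assuming your spark-env picks up HADOOP_CLASSPATH; hadoop classpath prints the effective classpath so you can verify the HBase jars are visible):

export HBASE_HOME=/usr/hdp/2.6.4.0-91/hbase
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$HBASE_HOME/lib/*"

hadoop classpath | tr ':' '\n' | grep hbase

spark-submit --class "processing" --master "local[*]" "/home/uadmin/Sangeeth/sqlscala_2.10-1.0.jar"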