21
votes

I followed "http://codesfusion.blogspot.com/2013/10/setup-hadoop-2x-220-on-ubuntu.html" to install hadoop on ubuntu. But, upon checking the hadoop version I get the following error:

Error: Could not find or load main class org.apache.hadoop.util.VersionInfo

Also, when I try: hdfs namenode -format

I get the following error:

Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode

The java version used is:

java version "1.7.0_25"
OpenJDK Runtime Environment (IcedTea 2.3.10) (7u25-2.3.10-1ubuntu0.12.04.2)
OpenJDK 64-Bit Server VM (build 23.7-b01, mixed mode)
My path is set. I can't figure out what's wrong. – usb
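For reference, a quick way to see what the wrapper actually resolves (a diagnostic sketch; the /usr/local/hadoop path is the tutorial's layout, adjust to yours):

echo $HADOOP_HOME    # should point at the install, e.g. /usr/local/hadoop
hadoop classpath     # prints the classpath the wrapper hands to the JVM; if this is empty or wrong, the missing-class errors follow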
Don't they have prepackaged binaries? That is usually the way to go. – yǝsʞǝla
@AlekseyIzmailov – it isn't with Java applications. Certainly not these days. – Stephen C
I don't have Ubuntu here, but I have these packages on Fedora: $ yum search hadoop gives: hadoop-client.noarch, hadoop-common.noarch, hadoop-hdfs.noarch, hadoop-mapreduce.noarch, and a bunch of other things. – yǝsʞǝla

10 Answers

17
votes

It is a problem of environment variable setup. I couldn't find a setup that worked until now; I was trying this on 2.6.4. Here is what we should do:

# Adjust HADOOP_HOME to wherever your distribution is unpacked
export HADOOP_HOME=/home/centos/HADOOP/hadoop-2.6.4
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_PREFIX=$HADOOP_HOME
export HADOOP_LIBEXEC_DIR=$HADOOP_HOME/libexec
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH
# The config dir must point at etc/hadoop inside the install
export HADOOP_CONF_DIR=$HADOOP_PREFIX/etc/hadoop

Add these to your ~/.bashrc, and don't forget to run:

source ~/.bashrc

I think your problem will be solved, as mine was.
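A minimal check that the variables landed in the current shell (assuming the block above is already in ~/.bashrc):

source ~/.bashrc
echo $HADOOP_HOME $HADOOP_CONF_DIR    # both should print the paths set above
hadoop version                        # should now print the version banner instead of the missing-class error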

7
votes

You probably did not follow the instructions correctly. Here are some things to try that will help us (and you) diagnose this:

  • In the shell that you ran hadoop version, run export and show us the list of relevant environment variables.

  • Show us what you put in the /usr/local/hadoop/etc/hadoop/hadoop-env.sh file.

  • If neither of the above gives any clues, find the hadoop wrapper shell script and (temporarily) modify it in a text editor: add the line "set -xv" somewhere near the beginning. Then run hadoop version and show us what it produces (see the sketch after this list).
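The three steps above as shell commands, for convenience (a sketch; the wrapper path assumes a standard tarball layout under $HADOOP_HOME):

export -p | grep -i -e hadoop -e java    # step 1: the relevant environment variables
cat /usr/local/hadoop/etc/hadoop/hadoop-env.sh    # step 2: the env file contents
# step 3: add "set -xv" near the top of $HADOOP_HOME/bin/hadoop, then trace it:
hadoop version 2>&1 | less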

3
votes

Adding this line to ~/.bash_profile worked for me.

export HADOOP_PREFIX=/<wherever you installed hadoop>/hadoop

So just:

  1. $ open ~/.bash_profile (or any text editor), then add the line above
  2. $ source ~/.bash_profile
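To confirm it took effect (a quick sketch using the placeholder path above):

echo $HADOOP_PREFIX                  # should print your install directory
$HADOOP_PREFIX/bin/hadoop version    # should print the version banner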

Hope this helps (:

3
votes

I was facing the same issue. It may seem simple, but it took two hours of my time. I tried all the things above, but nothing helped.

I just exited the shell I was in and logged into the system again. Then things worked!
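Logging out and back in works because a fresh login shell re-reads the profile files. A shortcut with the same effect in bash:

exec bash -l    # replace the current shell with a fresh login shell, re-reading ~/.bash_profile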

1
votes

Try to check:

  • JAVA_HOME and all PATH-related variables in the Hadoop config
  • run: . ~/.bashrc (note the dot in front) to make those variables available in your environment; the guide seems not to mention this (see the sketch below).
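Both checks as commands (a sketch; the hadoop-env.sh path assumes the guide's /usr/local/hadoop layout):

echo $JAVA_HOME    # should point at your JDK
grep JAVA_HOME /usr/local/hadoop/etc/hadoop/hadoop-env.sh    # should be set to a real path, not left unexpanded
. ~/.bashrc    # note the dot: sources the file into the current shell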
1
votes

I got the same problem with Hadoop 2.7.2. After I applied the trick shown below, I was able to start HDFS, but later I discovered that the tar archive I was using was missing some important pieces. After downloading 2.7.3, everything worked as it was supposed to.

My first suggestion is to re-download the tar.gz of the same version, or a newer one.

If you are still reading... this is how I solved the problem. After a fresh install, hadoop was not able to find the jars, so I did this small trick:

  1. I located where the jars were.
  2. I made a symbolic link from that folder into $HADOOP_HOME/share/hadoop/common:

# link the kms webapp's lib folder (which holds hadoop-common-*.jar) into share/hadoop/common
ln -s $HADOOP_HOME/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib $HADOOP_HOME/share/hadoop/common

For the version command you need hadoop-common-2.7.2.jar; this is what helped me find where the jars were stored (see the find command below).
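If you need to do the same search, a plain find over the install tree will list every copy of the jar (nothing assumed beyond $HADOOP_HOME):

find $HADOOP_HOME -name 'hadoop-common-*.jar'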

After that...

$ bin/hadoop version 
Hadoop 2.7.2
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r b165c4fe8a74265c792ce23f546c64604acf0e41
Compiled by jenkins on 2016-01-26T00:08Z
Compiled with protoc 2.5.0
From source with checksum d0fda26633fa762bff87ec759ebe689c
This command was run using /opt/hadoop-2.7.2/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib/hadoop-common-2.7.2.jar

Of course any hadoop / hdfs command works now.

I'm a happy man again. I know this is not a clean solution, but at least it works for me.

1
votes

I got that error, and I fixed it by editing ~/.bashrc as follows:

export HADOOP_HOME=/usr/local/hadoop
export PATH=$HADOOP_HOME/bin:$PATH

then open a terminal and run this command:

source ~/.bashrc

then check

hadoop version
1
votes

Here is how it works for Windows 10 Git Bash (mingw64):

export HADOOP_HOME="/PATH-TO/hadoop-3.3.0"
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_CLASSPATH=$(cygpath -pw $(hadoop classpath)):$HADOOP_CLASSPATH
hadoop version

I also copied slf4j-api-1.6.1.jar into hadoop-3.3.0\share\hadoop\common.
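For reference, this is the kind of conversion cygpath -pw performs, turning a POSIX path list into the Windows form the JVM expects (illustrative paths only):

cygpath -pw "/c/hadoop/a.jar:/c/hadoop/b.jar"
# prints: C:\hadoop\a.jar;C:\hadoop\b.jar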

0
votes

I added the environment variables described above, but it still didn't work. Setting HADOOP_CLASSPATH as follows in my ~/.bashrc worked for me:

export HADOOP_CLASSPATH=$(hadoop classpath):$HADOOP_CLASSPATH
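Note that $(hadoop classpath) can only expand if the hadoop wrapper is already on the PATH at that point in ~/.bashrc, so the order of the lines matters (a sketch, assuming a /usr/local/hadoop install):

export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin    # must come first...
export HADOOP_CLASSPATH=$(hadoop classpath):$HADOOP_CLASSPATH    # ...so this expansion can find hadoop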

-2
votes

I used

export PATH=$HADOOP_HOME/bin:$PATH

Instead of

export PATH=$PATH:$HADOOP_HOME/bin

Then it worked for me!
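If prepending fixes it, another hadoop executable earlier on the PATH was most likely shadowing the right one. You can check with a bash built-in:

type -a hadoop    # lists every hadoop on the PATH, in resolution order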