
I'm trying to install and run Hadoop. I'm sure I've installed Hadoop and formatted the namenode successfully. However, when I try to run start-dfs.sh, I get the error below:

localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-wenruo-namenode-linux.out
localhost: /usr/local/hadoop/bin/hdfs: line 304: /usr/local/hadoop/usr/lib/jvm/java-8-oracle/bin/java: No such file or directory

My JAVA_HOME is below:

echo $JAVA_HOME
/usr/lib/jvm/java-7-openjdk-amd64

My hadoop-env.sh file:

export JAVA_HOME=${JAVA_HOME}

How can Hadoop still be looking for JDK 8 when I have already set JAVA_HOME to JDK 7?

Thank you very much.

Hard to say without seeing the files (which Hadoop version?). Check where the path '/usr/local/hadoop/usr/lib/jvm/java-8-oracle/bin/java' comes from by running grep in the Hadoop directory (a concrete sketch follows these comments), then update your .bashrc to export the correct Java home and refresh your shells, so you have it set globally. Also add an echo of JAVA_HOME just after the export in the script. – Nicolas Fontenele
@Nicolas Fontenele Hi Nicolas, as you can see, when I echo JAVA_HOME I get JDK 7, not JDK 8. I just have no idea why Hadoop is still looking for JDK 8. – wenruo
That proves JAVA_HOME is correct in your current environment. But if, for example, you run Hadoop from a different shell, that shell may have exported another Java home. Did you take my suggestion to grep for java-8 and update .bashrc? Also, is this a single-node setup? Because if you set JAVA_HOME locally but not on the other nodes, it will fail, as it should. – Nicolas Fontenele
@Nicolas Fontenele I found the file that reports the error: /usr/local/hadoop/bin/hdfs, line 304: exec "$JAVA" -Dproc_$COMMAND $JAVA_HEAP_MAX $HADOOP_OPTS $CLASS "$@" – wenruo
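
To make the grep and echo suggestions from the comments concrete, here is a sketch; the Hadoop paths are taken from the question, while the config directory locations are assumptions that vary by Hadoop version:

# Find where the stray JDK 8 path comes from (config dirs are assumptions)
grep -R "java-8-oracle" /usr/local/hadoop/etc/hadoop /usr/local/hadoop/conf 2>/dev/null

# Hypothetical debug line to paste just above the failing exec on line 304 of
# /usr/local/hadoop/bin/hdfs; remove it once the wrong value is found.
echo "JAVA=$JAVA JAVA_HOME=$JAVA_HOME" >&2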

2 Answers


In general, each Hadoop distribution/version has a few basic script files that set the JAVA_HOME environment variable, such as the yarn-env.sh file if you have YARN.
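
For instance, yarn-env.sh typically carries a line like the one below; the path shown here is illustrative, not taken from the question:

# yarn-env.sh — a JDK path hardcoded here overrides whatever your shell exports
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64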

Also, depending on your Hadoop version, you might have the path in your *-site.xml files, such as hdfs-site.xml, core-site.xml, yarn-site.xml, and mapred-site.xml, plus a few others depending on which services you run. Your update to hadoop-env.sh likely did not regenerate the client configuration files, unless you made the change through a cluster-manager application and then redeployed the client configuration files.

I also sometimes find these get set to use the system's bin/java executable. You can use the following commands to find out which java your OS has on your bin/ path:

readlink -f /usr/bin/java
/usr/bin/java -version
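
If those commands show the OS-level java pointing at the wrong JDK, and your distribution uses the alternatives system (Debian and Ubuntu do), you can switch it interactively:

# Pick the system-wide java among the installed JDKs (Debian/Ubuntu)
sudo update-alternatives --config java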

Did you also update hadoop-env.sh on each node and then restart all services to make sure the change is picked up?
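
If not, something like the following on each node should do it; the sbin/ location assumes a Hadoop 2.x layout, while older releases keep these scripts in bin/:

# Restart HDFS so the daemons re-read hadoop-env.sh (Hadoop 2.x layout assumed)
/usr/local/hadoop/sbin/stop-dfs.sh
/usr/local/hadoop/sbin/start-dfs.sh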


Leave it; the problem is resolved. In hadoop-env.sh, I changed export JAVA_HOME=${JAVA_HOME} to export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64, the path that echo $JAVA_HOME prints. It looks like the indirection through ${JAVA_HOME} doesn't work.
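
For anyone hitting the same error, the working line in my hadoop-env.sh now looks like this; the JDK path is the one on my machine, so adjust it to your install:

# hadoop-env.sh — hardcode the JDK path; ${JAVA_HOME} apparently expands to empty
# or a stale value in the shell the start scripts launch the daemons from.
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64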