
Today I tried to install Hadoop on my Mac (OS X Lion), following the instructions at Setting up Hadoop 2.4 and Pig 0.12 on OSX locally.

I have correctly set

JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_71.jdk/Contents/Home

in both ~/.bash_profile and ~/.bashrc.
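
A quick sanity check that the shell actually picks this up (plain shell commands, nothing Hadoop-specific):

echo $JAVA_HOME
$JAVA_HOME/bin/java -version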

I also successfully installed the latest version of Hadoop (2.6.0) with brew and edited the four configuration files (hdfs-site.xml, core-site.xml, mapred-site.xml, yarn-site.xml) accordingly.
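
For reference, with the brew install those configuration files live under the libexec directory (the same path that appears in the log below); assuming the default Cellar layout, they can be listed with:

ls /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/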

But running:

./bin/hdfs namenode -format

gives:

15/01/29 17:42:01 INFO namenode.NameNode: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = Venuses-Mac-mini.local/192.168.1.51
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 2.6.0
STARTUP_MSG:   classpath = /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop:/usr/local/Cellar/hadoop/2.6.0/libexec/share/hadoop/common/lib/activation-1.1.jar <TRUNCATED - big chunk of .jar filenames>
STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on 2014-11-13T21:10Z
STARTUP_MSG:   java = 1.6.0_29
<TRUNCATED - big chunk of .jar filenames>
************************************************************/
15/01/29 17:42:01 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
15/01/29 17:42:01 INFO namenode.NameNode: createNameNode [-format]
2015-01-29 17:42:02.551 java[1016:1903] Unable to load realm info from SCDynamicStore
15/01/29 17:42:02 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Formatting using clusterid: CID-aaa7a5a6-3e82-4166-8039-16046f1b4761
<TRUNCATED>
15/01/29 17:42:03 ERROR namenode.FSNamesystem: FSNamesystem initialization failed.
org.apache.hadoop.HadoopIllegalArgumentException: An XAttr name must be prefixed with user/trusted/security/system/raw, followed by a '.'
at org.apache.hadoop.hdfs.XAttrHelper.buildXAttr(XAttrHelper.java:72)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.<init>(FSDirectory.java:137)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:894)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:755)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:934)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1379)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1504)
15/01/29 17:42:03 INFO namenode.FSNamesystem: Stopping services started for active state
15/01/29 17:42:03 INFO namenode.FSNamesystem: Stopping services started for standby state
15/01/29 17:42:03 FATAL namenode.NameNode: Failed to start namenode.
<TRUNCATED>
15/01/29 17:42:03 INFO util.ExitUtil: Exiting with status 1
15/01/29 17:42:03 INFO namenode.NameNode: SHUTDOWN_MSG: 

Two versions of Java are installed on my Mac, and Hadoop picks up the older one, 1.6.0_29, instead of the current one, 1.7.0_72. I don't know how to make Hadoop use the current Java version.
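
One way to list the installed JDKs and see which one the system treats as the default (standard OS X tools, not Hadoop-specific):

/usr/libexec/java_home -V
java -version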

NOTE: I've searched Google extensively and could not find a solution for this particular error.

Thanks.


1 Answer


What do you get when you run the following in a Terminal shell?

/usr/libexec/java_home
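
You can also ask java_home explicitly for the 1.7 JDK and compare the two paths it prints:

/usr/libexec/java_home -v 1.7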

If the bare command returns your 1.6 JDK, then it could be that somewhere Hadoop is using that command to determine which Java to use. For example, one place where that might be happening is the file '/usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/hadoop-env.sh'. Line 25 is:

export JAVA_HOME="$(/usr/libexec/java_home)"

Try changing that to:

export JAVA_HOME="$(/usr/libexec/java_home -v 1.7)"

That sets JAVA_HOME to your 1.7 JDK. It could also be that some other Hadoop file is doing a similar thing to find Java.
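
To verify the change takes effect, one rough check (assuming the brew path above) is to confirm the edited line and re-run the format command, looking at the "java =" line in the STARTUP_MSG banner:

grep JAVA_HOME /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/hadoop-env.sh
./bin/hdfs namenode -format 2>&1 | grep 'java ='

It should now report a 1.7 version instead of 1.6.0_29.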