2 votes

I am trying to upgrade HDFS from 1.2.1 to version 2.6. However, whenever I run the start-dfs.sh -upgrade command, I get the error below:

hduser@Cluster1-NN:/usr/local/hadoop2/hadoop-2.6.0/etc_bkp/hadoop$ $HADOOP_NEW_HOME/sbin/start-dfs.sh -upgrade

15/05/17 12:45:58 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Starting namenodes on [nn]

Error: Please specify one of --hosts or --hostnames options and not both.

nn: starting datanode, logging to /var/hadoop/logs/hadoop-hduser-datanode-Cluster1-NN.out

dn1: starting datanode, logging to /var/hadoop/logs/hadoop-hduser-datanode-Cluster1-DN1.out

dn2: starting datanode, logging to /var/hadoop/logs/hadoop-hduser-datanode-Cluster1-DN2.out

Starting secondary namenodes [0.0.0.0]

Error: Please specify one of --hosts or --hostnames options and not both.

Please let me know if any of you experts have come across such an error.


2 Answers

2 votes

I ran into the same problem on Arch Linux with a freshly installed Hadoop 2.7.1. I'm not sure whether my case is the same as yours, but my experience may help. I just commented out the line HADOOP_SLAVES=/etc/hadoop/slaves in /etc/profile.d/hadoop.sh and logged in again. After that, both accessing HDFS and running streaming jobs worked for me.
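Concretely, the fix is just commenting out one line (a sketch of the Arch file; your copy may differ), plus clearing the inherited value in any already-open shell:

# In /etc/profile.d/hadoop.sh, comment out this line:
# HADOOP_SLAVES=/etc/hadoop/slaves

# In the current session, also clear the inherited value:
unset HADOOP_SLAVES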

The cause is that the Arch-specific script /etc/profile.d/hadoop.sh sets the $HADOOP_SLAVES environment variable, while start-dfs.sh invokes hadoop-daemons.sh with a --hostnames argument. With both set, libexec/hadoop-config.sh sees the equivalent of both --hosts and --hostnames at once and aborts with the error above.
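For context, libexec/hadoop-config.sh contains a guard roughly like the following (a simplified paraphrase of the 2.x script, not an exact quote):

# --hosts sets HADOOP_SLAVES and --hostnames sets HADOOP_SLAVE_NAMES;
# a HADOOP_SLAVES value exported by the environment counts the same as --hosts,
# so combining it with the --hostnames flag from start-dfs.sh trips this check.
if [[ -n "$HADOOP_SLAVES" && -n "$HADOOP_SLAVE_NAMES" ]]; then
  echo "Error: Please specify one of --hosts or --hostnames options and not both."
  exit 1
fi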

You may want to run echo $HADOOP_SLAVES as the hadoop user. If the output is non-empty, check your .bashrc and/or other shell startup scripts. Hope that helps :)
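A quick check (assuming the usual shell startup files; adjust the paths to your setup):

echo "$HADOOP_SLAVES"
# If the output is non-empty, find where the variable is being set:
grep -n HADOOP_SLAVES ~/.bashrc ~/.profile /etc/profile.d/*.sh 2>/dev/null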

0 votes

Maybe it is missing some Hadoop library. Can you show the detailed information from the namenode logs?
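For example, assuming the log directory shown in the question's output (/var/hadoop/logs; the exact file name depends on your user and hostname):

# Show the last lines of the namenode log:
tail -n 100 /var/hadoop/logs/hadoop-hduser-namenode-Cluster1-NN.log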