
I am new to Hadoop and am trying to install it on a multi-node cluster running Ubuntu 14.04 Server in VMs. Everything goes well until I try to list the files in HDFS using hadoop fs -ls /

I keep getting an error:

ls: unknown host: Hadoop-Master.

Initially I thought I had made a mistake in assigning the hostname, but I cross-checked /etc/hosts and /etc/hostname: the hostname is listed correctly as Hadoop-Master. I also tried removing the hostname altogether, leaving only the IP address.

Another post here suggested adding two lines to .bashrc:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

I tried that, but I am still getting the same error. Please find the relevant steps below, along with edits based on the information asked for.

  • Check the IP address of the master with ifconfig.
  • Add the host name to /etc/hosts and edit /etc/hostname to match.
  • Add the relevant details to the masters and slaves files (a sketch of these follows the list).
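
For reference, in Hadoop 1.x the conf/masters file names the host that runs the secondary NameNode and conf/slaves names the DataNode/TaskTracker hosts. On the two-node setup described here they would typically look like this (hostnames taken from the post; whether the master also runs a DataNode is an assumption):

conf/masters:
Hadoop-Master

conf/slaves:
Hadoop-Master
Hadoop-Slave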

.bashrc File

export HADOOP_INSTALL=/usr/local/hadoop
export PIG_HOME=/usr/local/pig
export HIVE_HOME=/usr/local/Hive

export PATH=$PATH:$HADOOP_INSTALL/bin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
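
The edited .bashrc only takes effect in a new shell. To apply and sanity-check it in the current session (a quick check added here, not part of the original post):

source ~/.bashrc
echo $HADOOP_INSTALL
hadoop version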

Java path:

export JAVA_HOME='/usr/lib/jvm/java-7-oracle'
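
To confirm that this JVM path is valid (again, a suggested check rather than part of the post):

$JAVA_HOME/bin/java -version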

core-site.xml

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs:Hadoop-Master:9001</value>
  </property>
</configuration>

hadoop-env.sh

export JAVA_HOME='/usr/lib/jvm/java-7-oracle'

Edit mapred-site.xml to include the hostname and change the value to the number of nodes present.

mapred-site.xml

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>2</value>
  </property>
</configuration>

Edit hdfs-site.xml, changing the value to the number of data nodes present.

hdfs-site.xml

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/home/hduser/mydata/hdfs/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/home/hduser/mydata/hdfs/datanode</value>
  </property>
</configuration>
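
For these paths to work, the directories must exist and be writable by the Hadoop user before the NameNode is formatted. A minimal sketch using the paths from the config above (the format step is standard first-run practice, not something stated in the post, and it erases any existing HDFS data):

mkdir -p /home/hduser/mydata/hdfs/namenode
mkdir -p /home/hduser/mydata/hdfs/datanode
hadoop namenode -format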

whoami

simplilearn

/etc/hosts

127.0.0.1 localhost
192.168.207.132 Hadoop-Master
192.168.207.140 Hadoop-Slave

/etc/hostname

Hadoop-Master
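
Since the error is a host-resolution failure, it is worth confirming that the name actually resolves on the node where the command runs (standard tools, suggested here rather than taken from the post):

getent hosts Hadoop-Master
ping -c 1 Hadoop-Master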

Comments:

Try removing the - from Hadoop-Master in /etc/hosts. – Rajesh N
I did that, but I get the same error: Warning: $HADOOP_HOME is deprecated. ls: Unknown host: Hadoop-Master – Vaibhav Bhatnagar
Could you post the error you get now for hadoop fs -ls /? – Rajesh N
@RajeshN There is no change at all: "$HADOOP_HOME is deprecated. ls: Unknown host: Hadoop-Master" – Vaibhav Bhatnagar

1 Answer


Changes to be made:

1. /etc/hosts file:

Change Hadoop-Master to HadoopMaster

2. /etc/hostname file:

Change Hadoop-Master to HadoopMaster

3. core-site.xml:

Change this

hdfs:Hadoop-Master:9001

to this

hdfs://HadoopMaster:9001

NOTE: Change Hadoop-Master to HadoopMaster in the configuration on all nodes that point to your IP, and update the masters and slaves files as well.
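
After making these changes, the daemons have to be restarted for them to take effect. A minimal sequence for a Hadoop 1.x setup like this one (suggested here for completeness; the version is inferred from the "$HADOOP_HOME is deprecated" warning in the comments, not stated in the answer):

stop-all.sh
start-all.sh
hadoop fs -ls /

If the listing still fails, running getent hosts HadoopMaster on each node confirms whether the new name resolves.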