
I have a standalone Hadoop setup running in VMware Workstation. I can start Hadoop using the following commands:

    ./start-dfs.sh
    ./start-yarn.sh
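
As a sanity check, jps should list the HDFS and YARN daemons after those scripts run; on a healthy single-node setup the output looks roughly like this (the PIDs below are just examples and will differ):

    hduser@ubuntu:/usr/local/hadoop/sbin$ jps
    2481 NameNode
    2617 DataNode
    2805 SecondaryNameNode
    2963 ResourceManager
    3087 NodeManager
    3152 Jps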

Here is my hdfs-site.xml:

 <property>
  <name>dfs.namenode.name.dir</name>
  <value>file:/home/hduser/mydata/hdfs/namenode</value>
 </property>
 <property>
  <name>dfs.datanode.data.dir</name>
  <value>file:/home/hduser/mydata/hdfs/datanode</value>
 </property>

Now I have been given a separate dfs directory that I want to point to. Note that this dfs directory contains directories such as namenode, datanode and secondarydatanode. I'm using this article as a reference: https://community.hortonworks.com/articles/2308/how-to-move-or-change-the-hdfs-datanode-directorie.html
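
As I understand the article, the daemons should be stopped and the existing contents copied into the new location before repointing the config. A rough sketch under that assumption, using the paths from the configs above:

    # stop HDFS before moving anything
    ./stop-dfs.sh
    # copy existing NameNode metadata and DataNode blocks to the new location
    cp -a /home/hduser/mydata/hdfs/namenode/. /home/hduser/mynewdata/hdfs/namenode/
    cp -a /home/hduser/mydata/hdfs/datanode/. /home/hduser/mynewdata/hdfs/datanode/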

But after configuring hdfs-site.xml to point at the new directories (under mynewdata), as follows:

 <property>
  <name>dfs.namenode.name.dir</name>
  <value>file:/home/hduser/mynewdata/hdfs/namenode</value>
 </property>
 <property>
  <name>dfs.datanode.data.dir</name>
  <value>file:/home/hduser/mynewdata/hdfs/datanode</value>
 </property>
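
For reference, the new directories must exist and be owned by the user running the daemons, or the NameNode/DataNode will refuse to start. A quick check (the hadoop group name here is an assumption; substitute whatever group hduser belongs to):

    # directories must exist and be writable by the HDFS user
    mkdir -p /home/hduser/mynewdata/hdfs/namenode /home/hduser/mynewdata/hdfs/datanode
    sudo chown -R hduser:hadoop /home/hduser/mynewdata/hdfs
    ls -ld /home/hduser/mynewdata/hdfs/namenode /home/hduser/mynewdata/hdfs/datanode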

I started Hadoop with the above config, but I see a connection refused error:

    hduser@ubuntu:/usr/local/hadoop/sbin$ hdfs dfs -ls /wordcount/
    ls: Call From ubuntu/192.168.52.143 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
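
From what I've read, connection refused on localhost:9000 usually means the NameNode process never came up. These are the checks I know of (the log file name is my assumption, based on the default hadoop-<user>-namenode-<host>.log naming):

    # is anything listening on the NameNode RPC port?
    netstat -ltn | grep 9000
    # what did the NameNode log on startup?
    tail -n 50 /usr/local/hadoop/logs/hadoop-hduser-namenode-ubuntu.log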

Any pointers please? Much appreciated.

Have you configured passwordless SSH? – BruceWayne

1 Answer


Formatting the NameNode fixes this problem. Here is the command:

    hdfs namenode -format

(This is the current form of the older hadoop namenode -format.) Be aware that formatting creates a fresh, empty filesystem, so any data referenced by the previous NameNode metadata is no longer accessible through HDFS.
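
For completeness, a minimal sketch of the whole sequence, assuming HADOOP_HOME is /usr/local/hadoop as in the question:

    # stop the daemons before formatting
    /usr/local/hadoop/sbin/stop-yarn.sh
    /usr/local/hadoop/sbin/stop-dfs.sh
    # initialize the new dfs.namenode.name.dir (wipes existing HDFS metadata)
    hdfs namenode -format
    # start again and retest
    /usr/local/hadoop/sbin/start-dfs.sh
    /usr/local/hadoop/sbin/start-yarn.sh
    hdfs dfs -ls /

Note that formatting only initializes the NameNode directory; the data that lived under the old mydata directories is not carried over, which is why copying the old contents first (as in the article linked in the question) is the safer route when the data matters.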