
I'm trying to install Hadoop on my laptop. I followed this guide: https://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/

When I try to run start-all.sh I get this:

vava@vava-ThinkPad:/usr/local/hadoop-3.1.1/sbin$ bash start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as vava in 10 seconds.

WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
pdsh@vava-ThinkPad: localhost: rcmd: socket: Permission denied
Starting datanodes
pdsh@vava-ThinkPad: localhost: rcmd: socket: Permission denied
Starting secondary namenodes [vava-ThinkPad]
pdsh@vava-ThinkPad: vava-ThinkPad: rcmd: socket: Permission denied
Starting resourcemanager
resourcemanager is running as process 3748.  Stop it first.
Starting nodemanagers
pdsh@vava-ThinkPad: localhost: rcmd: socket: Permission denied

I tried to follow these questions, but nothing changed:

starting hadoop process using start-all.sh runs into issues

Hadoop permission issue

EDIT: After trying all the options, the only one that seems to work is export PDSH_RCMD_TYPE=ssh. Now the problem is with the namenode and datanode: they don't start properly:

vava@vava-ThinkPad:/usr/local/hadoop-3.1.1$ sbin/start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as vava in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
pdsh@vava-ThinkPad: localhost: ssh exited with exit code 1
Starting datanodes
localhost: ERROR: Cannot set priority of datanode process 10937
pdsh@vava-ThinkPad: localhost: ssh exited with exit code 1
Starting secondary namenodes [vava-ThinkPad]
Starting resourcemanager
Starting nodemanagers
Did you try with sudo? sudo start-all.sh - DeshDeep Singh
If you ssh localhost, does it work without a password prompt? Secondly, that tutorial is super old. Please, please follow the official Hadoop documentation for your specific version - OneCricketeer
@DeshDeepSingh yes, nothing changed. - Luca Vavassori
If you have a new problem, then you can accept the answer that got you here, then create a new post - OneCricketeer
According to this, whatever configurations you took from that tutorial are not going to work with Hadoop3, as I mentioned before stackoverflow.com/questions/46283634/… - OneCricketeer

3 Answers


Create a new file

/etc/pdsh/rcmd_default

Write "ssh" to it, then save and quit. Make sure the file ends with a newline (a return character at the end of the line); otherwise you will still get "ssh exited with exit code 1".

echo "ssh" > /etc/pdsh/rcmd_default
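
If you cannot write to /etc/pdsh (it requires root), a per-user alternative with the same effect is to export the variable from your shell profile, as the question's edit already found; a minimal sketch:

```shell
# same effect without root: tell pdsh to use ssh for this user only
echo 'export PDSH_RCMD_TYPE=ssh' >> ~/.bashrc
# reload the profile so the current shell picks it up
source ~/.bashrc
```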

I would check:

  • export PDSH_RCMD_TYPE=ssh in your terminal
  • Local Firewall settings
  • Running the script as root: sudo bash /usr/local/hadoop-3.1.1/sbin/start-all.sh
  • chmod -R 755 /usr/local/hadoop-3.1.1

For your additional question:

  • Set JAVA_HOME in hadoop-env.sh and make sure all the other options in that file are correct
  • Change your user: the warning "Attempting to start all Apache Hadoop daemons as vava" suggests "vava" is the wrong user; try su -l hdfs and then run the script
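
A typical JAVA_HOME entry in hadoop-env.sh might look like this (the JDK path below is only an example for OpenJDK 8 on Ubuntu; find yours with readlink -f "$(which java)"):

```shell
# in $HADOOP_HOME/etc/hadoop/hadoop-env.sh
# example path -- adjust to your own JDK installation
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
```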

In my case, you need to make sure the RSA public key is copied to localhost:

ssh-copy-id -i /home/hadoop/.ssh/id_rsa.pub hadoop@localhost

This assumes you are logged in to the node-master as user "hadoop".
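
If no key pair exists yet, a minimal sketch of the full sequence (assuming the user is "hadoop" and an SSH server is running locally) is:

```shell
# generate an RSA key pair with an empty passphrase, unless one already exists
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa -q
# authorize the public key for password-less logins to localhost
ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@localhost
# verify: this should log in and exit without prompting for a password
ssh localhost exit
```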