11
votes

I have installed Spark in standalone mode on a set of machines, and I tried to launch the cluster through the cluster launch scripts. I added the slaves' IP addresses to the conf/slaves file, and the master connects to all slaves via password-less SSH. After running the ./bin/start-slaves.sh script, I get the following message:

starting org.apache.spark.deploy.worker.Worker, logging to /root/spark-0.8.0-incubating/bin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-jbosstest2.out

However, the web UI of the master (localhost:8080) does not show any information about the workers. Yet when I add a localhost entry to my conf/slaves file, the worker info for localhost is shown.

There are no error messages; the terminal says the worker has started, but the web UI does not show any workers.

5
Look at the worker's logs; they will tell you why it cannot connect to the master. – Daniel Darabos

5 Answers

6
votes

I had the same problem. I noticed that I could not telnet to master:port from the slaves. In my /etc/hosts file (on the master) I had a 127.0.0.1 master entry (before my 192.168.0.x master entry). When I removed the 127.0.0.1 entry from /etc/hosts, I could telnet, and when I ran start-slaves.sh (from the master) my slaves connected.
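The fix above can be sketched as follows. This is a hypothetical illustration: the hostnames "master"/"slave1" and the 192.168.0.x addresses are placeholders, and a sample /etc/hosts is written to a temp file so the sketch is self-contained.

```shell
# A problematic /etc/hosts on the master, reproduced as a sample file
# ("master", "slave1" and the addresses are placeholders):
hosts=$(mktemp)
cat > "$hosts" <<'EOF'
127.0.0.1   localhost
127.0.0.1   master
192.168.0.10 master
192.168.0.11 slave1
EOF

# Drop the loopback entry for "master" so the hostname resolves to the
# real LAN address; the plain localhost entry is kept untouched.
grep -v '^127\.0\.0\.1[[:space:]]*master' "$hosts" > "$hosts.fixed"
cat "$hosts.fixed"

# On a real cluster you would then verify connectivity from a slave, e.g.:
#   telnet 192.168.0.10 7077
```

With the loopback alias gone, the master binds to its LAN address and the slaves can reach it.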

2
votes

When you run the cluster, run the jps command on the worker nodes to check whether the Worker process came up correctly, and check its log file using the worker's PID.
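The jps check might look like the sketch below. Since jps requires a JDK, a sample of its output is inlined here so the parsing is self-contained; the real commands are shown in the trailing comments (the log path is a typical default, not exact).

```shell
# "jps" lists running JVMs; on a healthy worker node you should see a
# line like "12345 Worker". This parses such output to get the PID.
sample_jps_output='9001 Jps
12345 Worker
23456 Master'
pid=$(printf '%s\n' "$sample_jps_output" | awk '/ Worker$/ {print $1}')
echo "Worker PID: $pid"

# On a real worker node:
#   pid=$(jps | awk '/ Worker$/ {print $1}')
#   less "$SPARK_HOME"/logs/spark-*-Worker-*.out   # inspect the worker log
```

If jps shows no Worker line, the daemon died on startup and the log will say why (typically a failure to connect to the master).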

or

set the web UI ports explicitly, run the cluster, and check whether your configured ports are up:

export SPARK_MASTER_WEBUI_PORT=5050
export SPARK_WORKER_WEBUI_PORT=4040
0
votes

Check your /etc/hosts and look at the bindings for the master.

If your master is bound to localhost as well as to its IP address (e.g. 192.168.x.x), remove the localhost entry. If you leave the localhost entry intact, the master will be mapped to localhost, which won't allow slaves to connect to the master's IP address.

0
votes

You can use ./start-master.sh --host 192.168.x.x instead of changing the /etc/hosts file.

0
votes

I met the same issue and finally solved it by adding the following line to $SPARK_HOME/conf/spark-env.sh:

SPARK_MASTER_HOST=your_master_ip_address
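A minimal spark-env.sh fragment along these lines might look like the following; the 192.168.0.10 address is a placeholder for your master's LAN IP.

```shell
# $SPARK_HOME/conf/spark-env.sh -- bind the master to its real address
# (192.168.0.10 is a placeholder; use your master's LAN IP)
SPARK_MASTER_HOST=192.168.0.10
# Older Spark releases (pre-2.0, like the 0.8.0 in the question) used
# SPARK_MASTER_IP instead:
# SPARK_MASTER_IP=192.168.0.10
```

After editing the file, stop and restart the cluster (sbin/stop-all.sh, then sbin/start-all.sh) so the master picks up the new bind address.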