
I created a virtual machine running Ubuntu Server 16.04 and have already installed Spark along with all of its dependencies and prerequisites. My Spark cluster runs on the VM, and the master and all workers can be started with start-all.sh.
Now I'm trying to submit SparkR jobs to this cluster from RStudio on my local computer. I created the Spark context with master="spark://192.168.0.105:7077" to connect to the cluster, which is clearly running, since the master web UI responds at IP:8080. Is there any configuration that needs to be set so that the master can be reached from another device that is not (yet) part of the cluster?
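For reference, the connection is made roughly like this (a minimal sketch, assuming Spark 2.x and a standalone master; the SPARK_HOME path and the app name are assumptions, and on Spark 1.x you would use sparkR.init instead of sparkR.session):

```r
# Minimal SparkR connection sketch from a local RStudio session.
# SPARK_HOME is an assumption: point it at the local Spark install,
# whose version should match the cluster's.
Sys.setenv(SPARK_HOME = "/opt/spark")
library(SparkR, lib.loc = file.path(Sys.getenv("SPARK_HOME"), "R", "lib"))

# Connect to the standalone master running on the VM.
sparkR.session(
  master = "spark://192.168.0.105:7077",
  appName = "remote-sparkR-test"
)

# A trivial job to verify that executors actually run tasks.
df <- as.DataFrame(faithful)
head(df)

sparkR.session.stop()
```

This requires a reachable, running cluster, so it is a sketch of the setup described above rather than something that runs standalone.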

The error in R is:

Error in handleErrors(returnStatus, conn) : java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem

Have you installed sparklyr on your local computer? – Steven Black
I have got sparklyr and SparkR, and neither is executing my jobs. – Jens Englert
8080 is usually Ambari, which usually means HDP & YARN... if so, try master="yarn://192.168.0.105:7077". – Steven Black
I think it's master="yarn-client://192.168.0.105:7077" for Spark versions < 2. – Steven Black
spark://192.168.0.105:7077 did work to connect to the Spark master; the cluster manager is distributing the jobs, but they never get executed. – Jens Englert

1 Answer