2 votes

I can't use SparkR in RStudio because I'm getting this error: Error in sparkR.sparkContext(master, appName, sparkHome, sparkConfigMap, :

JVM is not ready after 10 seconds

I have tried to search for a solution but can't find one. Here is how I have tried to set up SparkR:

Sys.setenv(SPARK_HOME="C/Users/alibaba555/Downloads/spark")  # The path to your spark installation 

.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths())) 

library("SparkR", lib.loc="C/Users/alibaba555/Downloads/spark/R") # The path to the lib folder in the spark location 

library(SparkR) 

sparkR.session(master="local[*]", sparkConfig=list(spark.driver.memory="2g"))

Execution starts with this message:

Launching java with spark-submit command C/Users/alibaba555/Downloads/spark/bin/spark-submit2.cmd
sparkr-shell C:\Users\ALIBAB~1\AppData\Local\Temp\Rtmp00FFkx\backend_port1b90491e4622

And finally, after a few minutes, it returns an error message:

Error in sparkR.sparkContext(master, appName, sparkHome, sparkConfigMap, : JVM is not ready after 10 seconds

Thanks!

Did you ever end up with a solution? On one Windows server I am able to run it without error using a domain admin account, but as a regular user the error occurs even if that user is also an admin of the box and RStudio is run as Administrator. – pmarsh

2 Answers

1 vote

It looks like the path to your SparkR library is wrong. It should point at the lib folder, something like: library("SparkR", lib.loc="C/Users/alibaba555/Downloads/spark/R/lib")
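As a sketch, assuming Spark is installed at the path from the question, the corrected setup would look like this (note that lib.loc ends in R/lib, not R, and on Windows the drive letter normally needs a colon):

```r
# Point SPARK_HOME at the Spark installation root (path assumed from the question)
Sys.setenv(SPARK_HOME = "C:/Users/alibaba555/Downloads/spark")

# Add Spark's bundled R library directory to the library search path
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))

# Load SparkR from that directory -- lib.loc must be the R/lib folder
library(SparkR, lib.loc = file.path(Sys.getenv("SPARK_HOME"), "R", "lib"))

sparkR.session(master = "local[*]",
               sparkConfig = list(spark.driver.memory = "2g"))
```

This is only a configuration sketch; it assumes a working Java installation and a Spark build whose R/lib directory actually exists at that location.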

I'm not sure if that will fix your problem, but it could help. Also, what versions of Spark/SparkR and Scala are you using? Did you build from source?

1 vote

In my case the issue boiled down to the users' working directory being a mapped network drive.

Changing the working directory fixed the issue.
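A quick way to check whether this applies to you is to inspect the working directory before starting SparkR and, if it is a network path, move it to a local drive (the local path below is just an example):

```r
getwd()                   # a UNC or mapped network path here may be the culprit
setwd("C:/Users/Public")  # any local, writable directory (example path)
# ...then retry sparkR.session()
```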

If by chance you are also using databricks-connect, make sure the .databricks-connect file is copied into %HOME% for each user who will be running RStudio, or set up databricks-connect for each of them.
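For the databricks-connect case, copying the config file into a user's home directory can be sketched from R like this (the source path is illustrative; on Windows, %HOME% usually corresponds to the USERPROFILE environment variable):

```r
# Copy an existing .databricks-connect config into the current user's home
# directory; the source path is a placeholder, not a real location.
home <- Sys.getenv("USERPROFILE", unset = Sys.getenv("HOME"))
file.copy(from = "C:/path/to/.databricks-connect",
          to   = file.path(home, ".databricks-connect"))
```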