2 votes

I configured SparkR as the tutorials describe, and everything was working. I was able to read the database with read.df, but suddenly nothing else works, and the following error appears:

Error in sparkR.init(master = "local") : JVM is not ready after 10 seconds

Why does it suddenly appear now? I've read about other users with the same problem, but the solutions they were given did not work. Below is my code:

Sys.setenv(SPARK_HOME= "C:/Spark")
Sys.setenv(HADOOP_HOME = "C:/Hadoop")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library(SparkR)

# initialize the SparkR environment
Sys.setenv('SPARKR_SUBMIT_ARGS'='"--packages" "com.databricks:spark-csv_2.11:1.2.0" "sparkr-shell"')
Sys.setenv(SPARK_MEM="4g")

# create a Spark context and a SQL context
sc <- sparkR.init(master = "local")
sqlContext <- sparkRSQL.init(sc)
Comment (2 votes): Assuming your setup is correct on Windows, this is a common phenomenon when many other programs are running at the same time: the JVM does not get enough memory on its first attempt and starts displaying the error above. If you re-run the context initialization, it will work fine. Make sure at least 500 MB of memory is free; otherwise, close some of the programs that are consuming the most memory. – pmavuluri
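
In the spirit of that comment, here is a minimal sketch of re-running the initialization a few times; the 3 attempts and the 5-second pause are illustrative values, not from the comment:

sc <- NULL
for (attempt in 1:3) {
  # try to start the JVM; return NULL instead of stopping on failure
  sc <- tryCatch(sparkR.init(master = "local"), error = function(e) NULL)
  if (!is.null(sc)) break  # the JVM came up, stop retrying
  Sys.sleep(5)             # give the system a moment to free memory
}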

2 Answers

0 votes

Try the few things below (a short R sketch for the first two checks follows the list):

  1. Check that C:/Windows/System32/ is on the PATH.

  2. Check that spark-submit.cmd has the proper execute permissions.

  3. If both of the above are true and the same error still appears, delete the Spark directory and create a fresh one by unzipping the Spark gzip file again.
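
For reference, the first two checks can be run from an R session like this (a sketch: the SPARK_HOME location comes from the question, and the exact PATH entry may differ on your machine):

# 1. Is C:/Windows/System32 listed on the PATH?
path_entries <- strsplit(Sys.getenv("PATH"), ";", fixed = TRUE)[[1]]
any(grepl("system32", path_entries, ignore.case = TRUE))

# 2. Does spark-submit.cmd exist where SparkR expects it?
file.exists(file.path(Sys.getenv("SPARK_HOME"), "bin", "spark-submit.cmd"))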

0 votes

I'm a beginner with R, and I solved the same "JVM is not ready after 10 seconds" problem by installing the JDK (version 7+) before installing SparkR on my Mac. It works well now. Hope this helps with your problem.
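
As a quick sanity check of the same fix, you can confirm from R that a JDK is visible before loading SparkR (note that java -version prints to stderr, not stdout):

# should report a Java version of 1.7 or higher if the JDK install worked
system("java -version")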