
Error in force(code) : Failed while connecting to sparklyr to port (8880) for sessionid (2044): Gateway in port (8880) did not respond.
Path: C:\Users\user1\AppData\Local\rstudio\spark\Cache\spark-1.6.2-bin-hadoop2.6\bin\spark-submit2.cmd
Parameters: --class, sparklyr.Backend, --packages, "com.databricks:spark-csv_2.11:1.3.0", "D:\Users\user1\R\R-3.3.1\library\sparklyr\java\sparklyr-1.6-2.10.jar", 8880, 2044

Traceback:
shell_connection(master = master, spark_home = spark_home, app_name = app_name, version = version, hadoop_version = hadoop_version, shell_args = shell_args, config = config, service = FALSE, extensions = extensions)
start_shell(master = master, spark_home = spark_home, spark_version = version, app_name = app_name, config = config, jars = spark_config_value(config, "spark.jars.default", list()), packages = spark_config_value(config, "sparklyr.defaultPackages"), extensions = extensions, environment = environment, shell_args = shell_args, service = service)
tryCatch({ gatewayInfo <- spark_connect_gateway(gatewayAddress, gatewayPort, sessionId, config = config, isStarting = TRUE) }, error = function(e) { abort_shell(paste("Failed while connecting to sparklyr to port (", gatewayPort, ") for sessionid (", sessionId, "): ", e$message, sep = ""), spark_submit_path, shell_args, output_file, error_file) })
tryCatchList(expr, classes, parentenv, handlers)
tryCatchOne(expr, names, parentenv, handlers[[1]])
value[3]
abort_shell(paste("Failed while connecting to sparklyr to port (", gatewayPort, ") for sessionid (", sessionId, "): ", e$message, sep = ""), spark_submit_path, shell_args, output_file, error_file)

---- Output Log ----
The system cannot find the path specified.

---- Error Log ----


1 Answer


The above error was solved by changing JAVA_HOME. Spark does not recognise JAVA_HOME if \bin is appended to the end of it. So I removed the \bin from JAVA_HOME and it started working.
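A minimal sketch of the fix in R, assuming a hypothetical JDK install path (substitute your own JDK location): point JAVA_HOME at the JDK root, not its bin subdirectory, before loading sparklyr.

```r
# Set JAVA_HOME to the JDK root directory, NOT its bin subdirectory.
# Wrong:  C:\Program Files\Java\jdk1.8.0_101\bin
# Right:  C:\Program Files\Java\jdk1.8.0_101
# (the path below is hypothetical; use your own JDK location)
Sys.setenv(JAVA_HOME = "C:\\Program Files\\Java\\jdk1.8.0_101")

library(sparklyr)
sc <- spark_connect(master = "local")
```

Setting the variable from R only affects the current session; to make the change permanent on Windows, edit JAVA_HOME under System Properties > Environment Variables and restart RStudio.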