
I set up a Spark EC2 cluster using the bin/spark-ec2 script. When I SSH to the master node and run spark-submit on an example program, I see the following error from every executor, and each executor is marked FAILED:

(java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/4"): error=2, No such file or directory)

The strange part is why Spark is looking for java-1.7.0-openjdk-1.7.0.85.x86_64. I have JAVA_HOME set to /usr/lib/jvm/jre-1.8.0-openjdk, and I even grepped recursively for openjdk-1.7.0.85 and came up with nothing. So why is spark-submit trying to use a seemingly random version of Java that isn't even installed on the master or the slaves?
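
For reference, this is roughly the search I ran on the master and on each slave (paths are the stock spark-ec2 layout); it returned no matches:

# Look for the mystery Java path anywhere in the Spark install or profile scripts
grep -r "openjdk-1.7.0.85" /root/spark /etc/profile.d 2>/dev/null
# Confirm what the shell itself thinks JAVA_HOME is
echo $JAVA_HOME    # prints /usr/lib/jvm/jre-1.8.0-openjdk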

The full output follows:

[ec2-user@ip-172-31-35-149 spark]$ sudo ./bin/spark-submit --class org.apache.spark.examples.mllib.LinearRegression lib/spark-examples-1.4.1-hadoop1.0.4.jar  data/mllib/sample_linear_regression_data.txt
15/08/18 18:26:46 INFO spark.SparkContext: Running Spark version 1.4.1
15/08/18 18:26:46 INFO spark.SecurityManager: Changing view acls to: root
15/08/18 18:26:46 INFO spark.SecurityManager: Changing modify acls to: root
15/08/18 18:26:46 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/08/18 18:26:47 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/08/18 18:26:47 INFO Remoting: Starting remoting
15/08/18 18:26:47 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://[email protected]:35948]
15/08/18 18:26:47 INFO util.Utils: Successfully started service 'sparkDriver' on port 35948.
15/08/18 18:26:47 INFO spark.SparkEnv: Registering MapOutputTracker
15/08/18 18:26:47 INFO spark.SparkEnv: Registering BlockManagerMaster
15/08/18 18:26:47 INFO storage.DiskBlockManager: Created local directory at /mnt/spark/spark-7d250ff1-595c-4217-98e1-e8be34868789/blockmgr-24b5de9d-8496-44e8-8806-b091ece651f0
15/08/18 18:26:47 INFO storage.DiskBlockManager: Created local directory at /mnt2/spark/spark-350c9f91-21f0-49f6-a1c1-bc32befdb3e8/blockmgr-b9fc6955-b7c4-46aa-8309-23d51082ef51
15/08/18 18:26:47 INFO storage.MemoryStore: MemoryStore started with capacity 265.1 MB
15/08/18 18:26:47 INFO spark.HttpFileServer: HTTP File server directory is /mnt/spark/spark-7d250ff1-595c-4217-98e1-e8be34868789/httpd-664cbb8f-3599-49b3-8f83-fceb00b5ca7e
15/08/18 18:26:47 INFO spark.HttpServer: Starting HTTP Server
15/08/18 18:26:47 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/08/18 18:26:47 INFO server.AbstractConnector: Started [email protected]:43864
15/08/18 18:26:47 INFO util.Utils: Successfully started service 'HTTP file server' on port 43864.
15/08/18 18:26:47 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/08/18 18:26:48 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/08/18 18:26:48 INFO server.AbstractConnector: Started [email protected]:4040
15/08/18 18:26:48 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
15/08/18 18:26:48 INFO ui.SparkUI: Started SparkUI at http://ec2-54-187-197-56.us-west-2.compute.amazonaws.com:4040
15/08/18 18:26:48 INFO spark.SparkContext: Added JAR file:/root/spark/lib/spark-examples-1.4.1-hadoop1.0.4.jar at http://172.31.35.149:43864/jars/spark-examples-1.4.1-hadoop1.0.4.jar with timestamp 1439922408595
15/08/18 18:26:48 INFO client.AppClient$ClientActor: Connecting to master akka.tcp://[email protected]:7077/user/Master...
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150818182649-0015
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/0 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/0 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/1 on worker-20150812175308-172.31.33.90-48856 (172.31.33.90:48856) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/1 on hostPort 172.31.33.90:48856 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/0 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/1 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/0 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/0"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/0 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/0"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 0
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/2 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/2 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/1 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/1"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/1 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/1"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 1
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/3 on worker-20150812175308-172.31.33.90-48856 (172.31.33.90:48856) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/3 on hostPort 172.31.33.90:48856 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/2 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/3 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/2 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/2"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/2 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/2"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 2
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/4 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/4 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/3 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/3"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/3 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/3"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 3
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/5 on worker-20150812175308-172.31.33.90-48856 (172.31.33.90:48856) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/5 on hostPort 172.31.33.90:48856 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/4 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/4 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/4"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/4 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/4"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 4
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/6 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/6 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/5 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/5 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/5"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/5 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/5"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 5
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/7 on worker-20150812175308-172.31.33.90-48856 (172.31.33.90:48856) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/7 on hostPort 172.31.33.90:48856 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/6 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/6 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/6"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/6 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/6"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 6
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/8 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/8 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/7 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/8 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/7 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/7"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/7 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/7"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 7
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/9 on worker-20150812175308-172.31.33.90-48856 (172.31.33.90:48856) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/9 on hostPort 172.31.33.90:48856 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/8 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/8"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/8 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/8"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 8
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor added: app-20150818182649-0015/10 on worker-20150812175308-172.31.33.91-56792 (172.31.33.91:56792) with 2 cores
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150818182649-0015/10 on hostPort 172.31.33.91:56792 with 2 cores, 6.0 GB RAM
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/9 is now LOADING
15/08/18 18:26:49 INFO client.AppClient$ClientActor: Executor updated: app-20150818182649-0015/9 is now FAILED (java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/9"): error=2, No such file or directory)
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Executor app-20150818182649-0015/9 removed: java.io.IOException: Cannot run program "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/bin/java" (in directory "/root/spark/work/app-20150818182649-0015/9"): error=2, No such file or directory
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Asked to remove non-existent executor 9
15/08/18 18:26:49 ERROR cluster.SparkDeploySchedulerBackend: Application has been killed. Reason: Master removed our application: FAILED
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
15/08/18 18:26:49 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
15/08/18 18:26:49 INFO ui.SparkUI: Stopped Spark web UI at http://ec2-54-187-197-56.us-west-2.compute.amazonaws.com:4040
15/08/18 18:26:49 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Shutting down all executors
15/08/18 18:26:49 INFO cluster.SparkDeploySchedulerBackend: Asking each executor to shut down
15/08/18 18:26:49 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34521.
15/08/18 18:26:49 INFO netty.NettyBlockTransferService: Server created on 34521
15/08/18 18:26:49 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/08/18 18:26:49 INFO storage.BlockManagerMasterEndpoint: Registering block manager 172.31.35.149:34521 with 265.1 MB RAM, BlockManagerId(driver, 172.31.35.149, 34521)
15/08/18 18:26:49 INFO storage.BlockManagerMaster: Registered BlockManager
15/08/18 18:26:49 ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext
    at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:103)
    at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1503)
    at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2007)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:543)
    at org.apache.spark.examples.mllib.LinearRegression$.run(LinearRegression.scala:92)
    at org.apache.spark.examples.mllib.LinearRegression$$anonfun$main$1.apply(LinearRegression.scala:84)
    at org.apache.spark.examples.mllib.LinearRegression$$anonfun$main$1.apply(LinearRegression.scala:83)
    at scala.Option.map(Option.scala:145)
    at org.apache.spark.examples.mllib.LinearRegression$.main(LinearRegression.scala:83)
    at org.apache.spark.examples.mllib.LinearRegression.main(LinearRegression.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/08/18 18:26:49 INFO spark.SparkContext: SparkContext already stopped.
Exception in thread "main" java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext
    at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:103)
    at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1503)
    at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2007)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:543)
    at org.apache.spark.examples.mllib.LinearRegression$.run(LinearRegression.scala:92)
    at org.apache.spark.examples.mllib.LinearRegression$$anonfun$main$1.apply(LinearRegression.scala:84)
    at org.apache.spark.examples.mllib.LinearRegression$$anonfun$main$1.apply(LinearRegression.scala:83)
    at scala.Option.map(Option.scala:145)
    at org.apache.spark.examples.mllib.LinearRegression$.main(LinearRegression.scala:83)
    at org.apache.spark.examples.mllib.LinearRegression.main(LinearRegression.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/08/18 18:26:49 INFO storage.DiskBlockManager: Shutdown hook called
15/08/18 18:26:49 INFO util.Utils: path = /mnt/spark/spark-7d250ff1-595c-4217-98e1-e8be34868789/blockmgr-24b5de9d-8496-44e8-8806-b091ece651f0, already present as root for deletion.
15/08/18 18:26:49 INFO util.Utils: path = /mnt2/spark/spark-350c9f91-21f0-49f6-a1c1-bc32befdb3e8/blockmgr-b9fc6955-b7c4-46aa-8309-23d51082ef51, already present as root for deletion.
15/08/18 18:26:49 INFO util.Utils: Shutdown hook called
15/08/18 18:26:49 INFO util.Utils: Deleting directory /mnt/spark/spark-7d250ff1-595c-4217-98e1-e8be34868789/httpd-664cbb8f-3599-49b3-8f83-fceb00b5ca7e
15/08/18 18:26:49 INFO util.Utils: Deleting directory /mnt2/spark/spark-350c9f91-21f0-49f6-a1c1-bc32befdb3e8
15/08/18 18:26:49 INFO util.Utils: Deleting directory /mnt/spark/spark-7d250ff1-595c-4217-98e1-e8be34868789
How did you compile your class? Maybe the compiler has Java 1.8 as target? – jimijazz

1 Answer


I had upgraded Java from java-1.7.0-openjdk-1.7.0.85.x86_64 to 1.8 but forgot to bounce (restart) my Spark workers. The workers therefore still carried the environment they were started with, from before the upgrade to 1.8, including the old Java path. Restarting the workers fixed it.
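
A minimal sketch of the check and the fix, assuming the stock spark-ec2 layout with Spark at /root/spark (the PID shown is illustrative):

# On a worker node: the running Worker process still carries the stale
# environment, because /proc/<pid>/environ is frozen at process start.
pgrep -f org.apache.spark.deploy.worker.Worker    # e.g. prints 1234
sudo tr '\0' '\n' < /proc/1234/environ | grep JAVA_HOME

# On the master: restart the standalone daemons so every worker re-reads
# the current environment, including the upgraded JAVA_HOME.
/root/spark/sbin/stop-all.sh
/root/spark/sbin/start-all.sh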