
I am using Spark 1.5.2 running on Yarn.

Sometimes, the Spark web UI for an application becomes empty after it finishes. Why does this happen, and how can I resolve it?

For example, today I launched a PySpark app and followed its progress on the console and the Spark web UI, so I know it ran correctly, as you can see in the screenshot below:

But after the process finished, the history became empty:

When the script finished, I got the following warning on the console (I don't know whether it is related to the issue):

16/05/02 17:36:49 WARN AkkaRpcEndpointRef: Error sending message [message = RemoveExecutor(2,Yarn deallocated the executor 2 (container container_1460361870585_2312_01_000004))] in 1 attempts
org.apache.spark.rpc.RpcTimeoutException: Recipient[Actor[akka://sparkDriver/user/CoarseGrainedScheduler#313705968]] had already been terminated.. This timeout is controlled by spark.rpc.askTimeout
[...]
16/05/02 17:36:51 WARN ReliableDeliverySupervisor: Association with remote system [...] has failed, address is now gated for [5000] ms. Reason: [Disassociated]
16/05/02 17:36:51 WARN ReliableDeliverySupervisor: Association with remote system [...] has failed, address is now gated for [5000] ms. Reason: [Disassociated]

I don't know why it actually happens, but you can always save the event log and then visualize it (as it looked during computation) using the history server. – Hlib
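Regarding the event log suggestion above, here is a minimal PySpark sketch (not from the original post) of enabling event logging when constructing the SparkContext; the app name and HDFS log directory are placeholders you would adapt to your cluster:

    # Sketch: enable event logging so the history server can rebuild
    # the finished application's UI from the saved events.
    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
            .setAppName("my_app")                                    # placeholder app name
            .set("spark.eventLog.enabled", "true")                   # write event logs
            .set("spark.eventLog.dir", "hdfs:///spark-event-logs"))  # placeholder directory
    sc = SparkContext(conf=conf)

Then point spark.history.fs.logDirectory at the same directory, start the history server with sbin/start-history-server.sh, and open it (port 18080 by default) to replay the UI after the application has finished.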

1 Answer


The second screenshot shows the Jobs page. You need to click the job ID link in order to see the job's DAG.