
I am running jobs on Databricks clusters. While a cluster is running, I can find the executor logs by going to the Spark Cluster UI, opening the Master dropdown, selecting a worker, and going through its stderr logs. However, once the job finishes and the cluster terminates, I can no longer see those logs. Instead, I get the screen below:

[Screenshot: Databricks cluster page after completion]

I am unable to access the Spark UI (the last tab). Is there any way to get the executor logs after the cluster is terminated, just as we can download the driver logs?

1 Answer


Hope this helps. To see the driver logs:

  1. Click on Jobs

  2. Click the job you want to see logs for

  3. Click "Logs". This will show you the driver logs. (See the sketch after this list for keeping logs available after the cluster terminates.)
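These UI steps only work while the cluster page still exists. To keep driver and executor logs around after termination, you can set `cluster_log_conf` when you create the cluster, and Databricks delivers the logs to DBFS periodically. Below is a minimal sketch using the Clusters REST API; the workspace URL, token, runtime version, and node type are placeholders you would replace with your own values:

```python
import requests

# Placeholders (assumptions): your workspace URL and a personal access token.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

# cluster_log_conf tells Databricks to deliver driver and executor logs
# to this DBFS path; they remain there after the cluster terminates.
cluster_spec = {
    "cluster_name": "job-cluster-with-log-delivery",
    "spark_version": "13.3.x-scala2.12",  # example runtime version
    "node_type_id": "i3.xlarge",          # example node type
    "num_workers": 2,
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/cluster-logs"}
    },
}

resp = requests.post(
    f"{HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=cluster_spec,
)
resp.raise_for_status()
print(resp.json()["cluster_id"])
```

With this in place, driver logs land under dbfs:/cluster-logs/<cluster-id>/driver and executor logs under dbfs:/cluster-logs/<cluster-id>/executor. The same setting is also available in a job's new-cluster definition, or in the cluster UI under Advanced Options > Logging.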

For executor logs, the process is a bit more involved:

  1. Click on Clusters

  2. Choose the cluster in the list corresponding to the job

  3. Click Spark UI

  4. Choose the worker whose logs you want to see: open the nodes list (on the far right, next to "Apps"), then click stdout or stderr to view that worker's logs. Note that this only works while the cluster still exists; see the sketch below for reading executor logs after termination.
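Once the cluster is terminated, the Spark UI route above is gone, which is exactly the situation in the question. If cluster log delivery was enabled (as in the earlier sketch), the executor logs remain in DBFS and can be read back through the DBFS REST API. Again, the host, token, and cluster ID are placeholders:

```python
import base64
import requests

# Placeholders (assumptions): workspace URL, token, and the terminated cluster's ID.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
CLUSTER_ID = "<cluster-id>"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# List the executor log files delivered for this cluster.
listing = requests.get(
    f"{HOST}/api/2.0/dbfs/list",
    headers=HEADERS,
    params={"path": f"dbfs:/cluster-logs/{CLUSTER_ID}/executor"},
)
listing.raise_for_status()

for entry in listing.json().get("files", []):
    if entry["is_dir"]:
        continue  # descend manually if logs are nested per executor
    # dbfs/read returns at most 1 MB per call, base64-encoded;
    # loop over offset/length for larger files.
    chunk = requests.get(
        f"{HOST}/api/2.0/dbfs/read",
        headers=HEADERS,
        params={"path": entry["path"], "offset": 0, "length": 1024 * 1024},
    )
    chunk.raise_for_status()
    print(entry["path"])
    print(base64.b64decode(chunk.json()["data"]).decode("utf-8", errors="replace"))
```

Equivalently, the Databricks CLI can pull the whole directory down in one step (e.g. databricks fs cp -r dbfs:/cluster-logs/<cluster-id> ./logs).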