
I wrote a shell script that calls spark-submit to submit an application to YARN in yarn-cluster mode. Once the application starts up, will the client process (the process that submitted the application) be terminated?

That is, once the application is running, if I shut down the machine on which the client process runs, will the running application be unaffected and keep running?


1 Answer


If --deploy-mode is cluster, the driver runs inside the YARN cluster, so the Spark job keeps running even if the client machine that submitted it goes down.

If --deploy-mode is client, the driver runs on the client machine that submitted the job, so the Spark job is killed if that machine goes down.

In both cases, the client process that submitted the application is not terminated automatically once the application starts; it simply keeps printing logs to the console.
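To illustrate, a submit script for cluster mode might look like the sketch below. The class name, jar path, and resource settings are placeholders, not taken from the question. The spark.yarn.submit.waitAppCompletion property, when set to false, makes the client exit right after submission instead of polling the application status, which is useful when the client machine may go away:

```shell
#!/usr/bin/env bash
# Sketch: submit a Spark application to YARN in cluster mode.
# com.example.MyApp and /path/to/my-app.jar are hypothetical placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \
  --num-executors 4 \
  --executor-memory 2g \
  --conf spark.yarn.submit.waitAppCompletion=false \
  /path/to/my-app.jar arg1 arg2
```

With waitAppCompletion=false the client process returns as soon as YARN accepts the application, so shutting down the client machine afterwards has no effect on the running job.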