I am running a Spark Structured Streaming application. I have assigned 10 GB to the driver. The program runs fine for 8 hours, then it gives an error like the following. The executor finished its tasks and sent the results to the driver, and then the driver commanded a shutdown. WHY? How much memory does the driver need?

20/04/18 19:25:24 INFO CoarseGrainedExecutorBackend: Got assigned task 489524
20/04/18 19:25:24 INFO Executor: Running task 1000.0 in stage 477.0 (TID 489524)
20/04/18 19:25:25 INFO Executor: Finished task 938.0 in stage 477.0 (TID 489492). 4153 bytes result sent to driver
20/04/18 19:25:25 INFO Executor: Finished task 953.0 in stage 477.0 (TID 489499). 3687 bytes result sent to driver
20/04/18 19:25:28 INFO Executor: Finished task 1000.0 in stage 477.0 (TID 489524). 3898 bytes result sent to driver
20/04/18 19:25:29 INFO CoarseGrainedExecutorBackend: Driver commanded a shutdown
20/04/18 19:25:29 INFO MemoryStore: MemoryStore cleared
20/04/18 19:25:29 INFO BlockManager: BlockManager stopped
20/04/18 19:25:29 INFO ShutdownHookManager: Shutdown hook called


1 Answer


There is no fixed memory limit for the driver. The driver can take up to about 40 GB of memory; beyond that, JVM garbage collection causes it to slow down.

In your case, it looks like the driver is being overwhelmed by the results that all the executors are sending back to it.

There are a few things you can try:

  1. Please ensure there are no collect() operations on the driver. Pulling result sets back with collect() will definitely overwhelm the driver (see the foreachBatch sketch after this list).

  2. Try giving the driver more memory, for example 18 GB (see the spark-submit example below).

  3. Increase spark.yarn.driver.memoryOverhead to 2 GB: this is the amount of off-heap memory (in MiB when no unit is given) allocated per driver. On Spark 2.3 and later the property is named spark.driver.memoryOverhead.
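
On point 1: if the application currently pulls micro-batch results back with collect(), writing each batch out from the executors with foreachBatch avoids shipping rows to the driver at all. A minimal sketch, assuming a Kafka source and a Parquet sink (the broker, topic, and paths below are placeholders, and foreachBatch requires Spark 2.4+):

    import org.apache.spark.sql.{DataFrame, SparkSession}

    object StreamingApp {                                   // hypothetical app
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("streaming-app")
          .getOrCreate()

        val input = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092") // placeholder broker
          .option("subscribe", "events")                    // placeholder topic
          .load()

        // Each micro-batch is written directly by the executors;
        // nothing is collect()-ed back to the driver.
        val query = input.writeStream
          .foreachBatch { (batch: DataFrame, batchId: Long) =>
            batch.write.mode("append").parquet("/data/out") // placeholder path
          }
          .option("checkpointLocation", "/data/checkpoint") // placeholder path
          .start()

        query.awaitTermination()
      }
    }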
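
For points 2 and 3, note that driver memory must be set at submission time (the driver JVM is already running by the time application code executes), so set it on the spark-submit command line or in spark-defaults.conf rather than in code. A sketch, with the class and JAR names as placeholders:

    # 18 GB driver heap plus 2 GB (2048 MiB) of off-heap overhead
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --driver-memory 18g \
      --conf spark.yarn.driver.memoryOverhead=2048 \
      --class com.example.StreamingApp \
      streaming-app.jar

In YARN client mode, the overhead for the application master is controlled by spark.yarn.am.memoryOverhead instead.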