How can I utilize all the cores and memory on the Spark standalone cluster below?
Node 1: 4 cores, 8 GB memory
Node 2: 4 cores, 16 GB memory
Currently I can allocate either:
A) 8 cores and 14 GB of memory, by setting:
.config('spark.executor.memory', '7g')
.config('spark.executor.cores', '4')
Cores | Memory
----------------------------------
4 (4 Used) | 15.0 GiB (7.0 GiB Used)
4 (4 Used) | 7.0 GiB (7.0 GiB Used)
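For reference, the two config lines above come from a SparkSession builder roughly like the following; this is a minimal sketch, and the master URL and app name are placeholders I have assumed:

from pyspark.sql import SparkSession

# Minimal sketch of the session setup for scenario A.
# 'spark://master-host:7077' and the app name are assumed placeholders.
spark = (
    SparkSession.builder
    .master('spark://master-host:7077')
    .appName('allocation-test')
    .config('spark.executor.memory', '7g')
    .config('spark.executor.cores', '4')
    .getOrCreate()
)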
B) 6 cores and 21 GB of memory, by setting:
.config('spark.executor.memory', '7g')
.config('spark.executor.cores', '2')
Cores | Memory
----------------------------------
4 (4 Used) | 15.0 GiB (14.0 GiB Used)
4 (2 Used) | 7.0 GiB (7.0 GiB Used)
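As a rough cross-check from inside the application (this relies on the standalone default where defaultParallelism is derived from the total cores granted to the app, minimum 2):

# Rough check of how many cores the application actually received.
# On standalone, defaultParallelism defaults to the total granted cores.
print(spark.sparkContext.defaultParallelism)  # expected: 6 with setting B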
Expected output:
8 cores and 21 GB of memory:
Cores | Memory
----------------------------------
4 (4 Used) | 15.0 GiB (14.0 GiB Used)
4 (4 Used) | 7.0 GiB (7.0 GiB Used)
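To make the gap concrete, here is a small model of how I understand the standalone master to place executors: executor size is uniform per application, and each worker hosts as many whole executors as its cores and memory allow. The packing rule is my assumption about the scheduler's behavior, not something taken from the docs:

# Hypothetical model: standalone executors are uniform per application,
# so each worker hosts floor(min(cores/exec_cores, mem/exec_mem)) of them.
def packed(workers, exec_cores, exec_mem_gb):
    total_cores = total_mem = 0
    for wcores, wmem in workers:
        n = min(wcores // exec_cores, int(wmem // exec_mem_gb))
        total_cores += n * exec_cores
        total_mem += n * exec_mem_gb
    return total_cores, total_mem

workers = [(4, 15.0), (4, 7.0)]  # the two workers shown above
print(packed(workers, 4, 7))     # (8, 14) -> scenario A
print(packed(workers, 2, 7))     # (6, 21) -> scenario B

Under this model, no single uniform executor size reproduces both rows of the expected table at once, which is why I am stuck.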