
Why does the Spark UI show only 6 cores available per worker (not the number of cores used), while I have 16 on each of my 3 machines (8 sockets * 2 cores/socket), or even 32 if I take into account the number of threads per core (2)? I tried to set SPARK_WORKER_CORES in the spark-env.sh file (the line is shown below), but it changed nothing (I made the change on all 3 workers). I also commented the line out to see if that changed anything: the number of cores available is always stuck at 6.
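For reference, the line I set in conf/spark-env.sh on each worker looks like this (a minimal example; 16 is the physical core count reported by lscpu):

export SPARK_WORKER_CORES=16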

I'm using Spark 2.2.0 in a standalone cluster:

pyspark --master spark://myurl:7077 

(screenshot: Spark master UI showing 6 cores available per worker)

Result of the lscpu command: (screenshot of lscpu output)


1 Answer


I found that I simply had to stop the master and the slaves and restart them so that the SPARK_WORKER_CORES parameter is re-read.
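In practice that means running, from the master node (a sketch assuming the standard sbin scripts of a Spark 2.2 standalone install, with $SPARK_HOME pointing to it and the workers listed in conf/slaves):

$SPARK_HOME/sbin/stop-all.sh
$SPARK_HOME/sbin/start-all.sh

Restarting only the workers (stop-slaves.sh / start-slaves.sh) should also be enough, since spark-env.sh is read when the worker daemon starts.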