Here are my Spark cluster details: 29.3 GB of memory and 10 cores.
Now I submit this job:

spark-submit --master spark://hadoop-master:7077 --executor-memory 1g --executor-cores 2 /home/hduser/ratings-counter.py
but when I click on the completed application in the web UI, I see that 5 executors were used.
How does Spark determine that it should launch 5 executors?
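My guess is that, in standalone mode, the application grabs all available cores by default (since spark.cores.max / --total-executor-cores is not set), so with 10 cluster cores and --executor-cores 2 it ends up with 10 / 2 = 5 executors. If that is right, capping the total cores should change the executor count; here is how I would test that hypothesis (same script as above):

spark-submit --master spark://hadoop-master:7077 --executor-memory 1g --executor-cores 2 --total-executor-cores 4 /home/hduser/ratings-counter.py

With this cap I would expect 4 / 2 = 2 executors, but I would like someone to confirm whether this is actually how the standalone scheduler decides.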