I am trying to set up a small Spark cluster on my local Mac machine: one master and two or more workers. The Spark 2.0.0 documentation lists a property SPARK_WORKER_INSTANCES, which states:
Number of worker instances to run on each machine (default: 1). You can make this more than 1 if you have very large machines and would like multiple Spark worker processes. If you do set this, make sure to also set SPARK_WORKER_CORES explicitly to limit the cores per worker, or else each worker will try to use all the cores.
However, this property is missing from the Spark 2.4 documentation.
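For reference, this is roughly the conf/spark-env.sh I had been using under 2.0.0; the values are just placeholders for my local setup, not a recommendation:

```
# conf/spark-env.sh (local standalone cluster; values are placeholders)
export SPARK_MASTER_HOST=localhost
export SPARK_WORKER_INSTANCES=2   # run two worker processes on this machine
export SPARK_WORKER_CORES=2       # limit cores per worker, per the note in the docs
export SPARK_WORKER_MEMORY=2g     # memory per worker process
```

and I start the cluster with the standalone scripts:

```
sbin/start-master.sh
sbin/start-slave.sh spark://localhost:7077
```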