I'm setting up a cluster with Hortonworks HDP 2.4. It's a 4-node cluster, each node with 16 GB RAM and 8 CPUs. Spark is also installed, with the Zeppelin Notebook so I can use Python (PySpark).
My problem: I started with a 3-node configuration and later added a fourth node, but the number of Spark executors remains 3.
I've read on the web that the number of executors can be set via SPARK_EXECUTOR_INSTANCES, but this parameter appears only in the spark-env template on the Spark config page of the Ambari UI. It seems the decision about executors is delegated to YARN, but I haven't found any related setting in YARN.
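For reference, this is the kind of fragment I mean in the spark-env template (a sketch; the exact values in my template may differ):

```shell
# In Ambari UI: Spark > Configs > Advanced spark-env > spark-env template
# Default number of executors requested when Spark runs on YARN
export SPARK_EXECUTOR_INSTANCES=3

# The equivalent Spark property, which could instead go under
# "Custom spark-defaults" (spark-defaults.conf):
# spark.executor.instances  3
```

I'm not sure whether changing this value here is the right approach, or whether YARN overrides it.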
To sum up: how can I increase the number of executors in my Hortonworks Hadoop cluster using Ambari?