I am new to Spark and am learning the architecture. I understand that Spark supports three cluster managers: YARN, Standalone, and Mesos.
In YARN cluster mode, the Spark driver runs inside the ApplicationMaster (which YARN launches in a container on a NodeManager) and the executors run in YARN containers on the NodeManagers.
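For example, this is how I would submit an application in that mode (the class and jar names are just placeholders):

```
# Submit in YARN cluster mode: the driver runs inside the YARN cluster,
# not on the machine that runs spark-submit.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \
  my-app.jar
```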
In Standalone cluster mode, the Spark driver runs on one of the Worker nodes and the executors run inside the Worker processes.
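And in Standalone cluster mode I would submit like this (the master URL is a placeholder for my cluster's Master):

```
# Submit in Standalone cluster mode: the driver is launched on one of the
# Worker nodes instead of the machine running spark-submit.
spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  --class com.example.MyApp \
  my-app.jar
```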
If my understanding is correct, is it required to install Spark on all the NodeManager nodes of a YARN cluster and on all the Worker nodes of a Standalone cluster?