Spark Standalone:
In this mode, I understand that you run your master and worker nodes on your local machine. Does that mean there is an instance of YARN running on my local machine? When I installed Spark it came bundled with Hadoop, and YARN usually ships with Hadoop as well, correct? And in this mode I can essentially simulate a smaller version of a full-blown cluster, as in the sketch below.
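For context, here is a minimal sketch of how I understand connecting to a standalone master would look. I am assuming the master was started with sbin/start-master.sh and is on its default port; spark://localhost:7077 is just a placeholder for wherever it actually runs:

    import org.apache.spark.{SparkConf, SparkContext}

    // Assumes a standalone master started via sbin/start-master.sh,
    // listening on the default port 7077 (host is a placeholder).
    val conf = new SparkConf()
      .setAppName("StandaloneExample")
      .setMaster("spark://localhost:7077")
    val sc = new SparkContext(conf)

    // Trivial job, just to confirm the workers actually execute tasks.
    val sum = sc.parallelize(1 to 100).reduce(_ + _)
    println(s"sum = $sum")
    sc.stop()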
Spark Local Mode:
This is the part I am also confused about. To run in this mode, I do:

    val conf = new SparkConf().setMaster("local[2]")

In this mode, it doesn't use any resource manager (like YARN), correct? It simply runs the Spark job in the number of threads you pass to "local[2]"?
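For reference, here is the full, self-contained version of what I am running in local mode (the app name and the tiny dataset are just placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    // local[2]: tasks run in 2 threads inside this single driver JVM,
    // with no external cluster manager involved.
    val conf = new SparkConf()
      .setAppName("LocalModeExample")
      .setMaster("local[2]")
    val sc = new SparkContext(conf)

    // A small word-count job; its 2 partitions are processed
    // by the 2 local threads.
    val counts = sc.parallelize(Seq("a", "b", "a"), 2)
      .map(w => (w, 1))
      .reduceByKey(_ + _)
      .collect()
    counts.foreach(println)
    sc.stop()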