4
votes

The documentation on spark-submit says the following:

The spark-submit script in Spark’s bin directory is used to launch applications on a cluster.

Regarding the pyspark it says the following:

You can also use bin/pyspark to launch an interactive Python shell.

This question may sound stupid, but when I am running the commands through pyspark, they also run on the "cluster", right? They do not run on the master node only, right?


4 Answers

9
votes

There is no practical difference between the two. If not configured otherwise, both execute code in local mode. If a master is configured (either with the --master command-line parameter or the spark.master configuration property), the corresponding cluster will be used to execute the program.
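As a minimal sketch of this (the host and script names are hypothetical), both commands accept the same master setting:

```shell
# Hypothetical host and script names. Without --master (or spark.master),
# both commands fall back to local mode.
spark-submit --master spark://master-host:7077 my_app.py   # batch application on the cluster
pyspark --master spark://master-host:7077                  # interactive shell on the same cluster
```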

6
votes

If you are using EMR, there are three ways to run an application:

  1. using pyspark (or spark-shell)
  2. using spark-submit without the --master and --deploy-mode options
  3. using spark-submit with the --master and --deploy-mode options

Although all three will run the application on the Spark cluster, there is a difference in how the driver program works:

  • In the 1st and 2nd, the driver runs in client mode, whereas in the 3rd the driver also runs inside the cluster.
  • In the 1st and 2nd, you have to wait until one application completes before running another, but in the 3rd you can run multiple applications in parallel.
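A sketch of the 3rd option (the job file name is hypothetical); on EMR the master is YARN:

```shell
# Hypothetical sketch: with --deploy-mode cluster the driver runs inside
# the YARN cluster, so the submitting machine is free to launch further
# applications in parallel.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  my_job.py
```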

4
votes

Just adding a clarification that others have not addressed (you may already know this, but it was unclear from the wording of your question):

...when I am running the commands through pyspark they also run on the "cluster", right? They do not run on the master node only, right?

As with spark-submit, standard Python code will run only on the driver. When you call operations through the various pyspark APIs, you will trigger transformations or actions that will be registered/executed on the cluster.

As others have pointed out, spark-submit can also launch jobs in cluster mode. In this case, the driver still executes standard Python code, but it runs on a different machine from the one you call spark-submit from.
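A sketch of that difference (the script name is hypothetical), submitting the same job in the two deploy modes:

```shell
# Client mode: the driver (and your standard Python code) runs on the
# machine where spark-submit is invoked.
spark-submit --master yarn --deploy-mode client my_app.py

# Cluster mode: the driver runs on a node inside the cluster, so e.g.
# print() output appears in the driver's logs there, not in your terminal.
spark-submit --master yarn --deploy-mode cluster my_app.py
```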

1
votes
  1. PySpark has significant differences compared to Scala Spark and Java Spark; for Python Spark, only YARN is supported for scheduling the cluster.

  2. If you are running Python Spark on a local machine, you can use pyspark. If on a cluster, use spark-submit.

  3. If you have any dependencies in your Python Spark job, you need to package them into a zip file for submission.
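A minimal sketch of that (the package and file names are hypothetical), shipping dependencies via spark-submit's --py-files option:

```shell
# Package local Python dependencies into a zip (hypothetical package name),
# then ship them to the executors with --py-files.
zip -r deps.zip mypackage/
spark-submit --master yarn --py-files deps.zip my_job.py
```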