When I try to run spark-submit I get "Failed to find Spark assembly JAR. You need to build Spark before running this program." When I try to run spark-shell I get the same error. What do I have to do in this situation?
Your Spark package doesn't include compiled Spark code; that's why the spark-submit and spark-shell scripts print this error.
You have to download one of the pre-built versions from the "Choose a package type" section on the Spark download page.
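For example, a minimal sketch on Linux/macOS; the version and mirror URL here are just examples, so use whichever release the download page currently lists:

    # Fetch a pre-built Spark package (version/URL are examples; check the download page)
    wget https://archive.apache.org/dist/spark/spark-3.4.1/spark-3.4.1-bin-hadoop3.tgz
    tar -xzf spark-3.4.1-bin-hadoop3.tgz
    cd spark-3.4.1-bin-hadoop3

    # The pre-built package already ships the compiled jars, so the shell starts immediately
    ./bin/spark-shell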
1. Go to SPARK_HOME. Note that your SPARK_HOME variable should not include /bin at the end. Keep that in mind when you're adding it to your PATH, like this: export PATH=$SPARK_HOME/bin:$PATH
2. Run export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=1g" to allot more memory to Maven.
3. Run ./build/mvn -DskipTests clean package and be patient. It took my system 1 hour and 17 minutes to finish this.
4. Run ./dev/make-distribution.sh --name custom-spark --pip. This is just for Python/PySpark; you can add more flags for Hive, Kubernetes, etc.
Running pyspark or spark-shell will now start PySpark and the Spark shell, respectively.
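As a quick smoke test after the build, assuming your source checkout lives at /path/to/spark (a placeholder), something like:

    # Point SPARK_HOME at the freshly built tree and put its bin/ on PATH
    export SPARK_HOME=/path/to/spark   # placeholder; use your actual checkout
    export PATH=$SPARK_HOME/bin:$PATH

    # Print the version banner to confirm the build you just made is the one being picked up
    spark-submit --version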
Spark Installation:
For Windows machines:
1. Download spark-2.1.1-bin-hadoop2.7.tgz from https://spark.apache.org/downloads.html
2. Unzip it, move the Spark folder to the C:\ drive, and set the environment variable.
3. If you don't have Hadoop, create a Hadoop folder with a bin folder inside it, download the winutils archive from https://codeload.github.com/gvreddy1210/64bit/zip/master, paste winutils.exe into Hadoop\bin, and set the environment variable for C:\hadoop\bin.
4. Create a temp\hive folder on the C:\ drive and give it full permissions, like: C:\Windows\system32> C:\hadoop\bin\winutils.exe chmod 777 /tmp/hive
5. Open a command prompt, first run C:\hadoop\bin> winutils.exe, then navigate to C:\spark\bin> and run spark-shell.
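Put together, the command-prompt session looks roughly like this; the paths and Spark version are the ones from the steps above, so adjust them to your own layout:

    rem Set environment variables for the current session (set them permanently via System Properties)
    set SPARK_HOME=C:\spark
    set HADOOP_HOME=C:\hadoop
    set PATH=%SPARK_HOME%\bin;%HADOOP_HOME%\bin;%PATH%

    rem Give Hive's scratch directory full permissions via winutils
    C:\hadoop\bin\winutils.exe chmod 777 /tmp/hive

    rem Start the Spark shell; it should come up without the assembly JAR error
    cd C:\spark\bin
    spark-shell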
