
I tried installing Apache Spark on my 64-bit Windows 7 machine.

I used these guides:

  1. Installing Spark on Windows 10

  2. How to run Apache Spark on Windows 7

  3. Installing Apache Spark on Windows 7 environment

This is what I did:

  1. Installed Scala. Set the environment variable SCALA_HOME and added %SCALA_HOME%\bin to Path. Result: the scala command works in the command prompt.

  2. Unpacked the pre-built Spark. Set the environment variable SPARK_HOME and added %SPARK_HOME%\bin to Path.

  3. Downloaded winutils.exe and placed it under C:\hadoop\bin. Set the environment variable HADOOP_HOME and added %HADOOP_HOME%\bin to Path.

I already have JDK 8 installed.
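
For what it's worth, the variables can be double-checked from a fresh command prompt like this (plain cmd commands; where winutils.exe should resolve to C:\hadoop\bin if Path is set up as above):

    echo %JAVA_HOME%
    echo %SCALA_HOME%
    echo %SPARK_HOME%
    echo %HADOOP_HOME%
    where winutils.exe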

Now, the problem is, when I run spark-shell from C:\spark-2.1.1-bin-hadoop2.7\bin, I get this:

"C:\Program Files\Java\jdk1.8.0_131\bin\java" -cp "C:\spark-2.1.1-bin-hadoop2.7\conf\;C:\spark-2.1.1-bin-hadoop2.7\jars\*" "-Dscala.usejavacp=true" -Xmx1g org spark.repl.Main --name "Spark shell" spark-shell

Is it an error? Am I doing something wrong?

Thanks!


1 Answer


I had the same issue when trying to install Spark locally on Windows 7. Please make sure the paths below are correct, and I am sure it will work for you.

  • Create the JAVA_HOME variable: C:\Program Files\Java\jdk1.8.0_181
  • Add the following part to your path: ;%JAVA_HOME%\bin
  • Create the SPARK_HOME variable: C:\spark-2.3.0-bin-hadoop2.7
  • Add the following part to your path: ;%SPARK_HOME%\bin
  • The most important part: the Hadoop path should include a bin folder containing winutils.exe, i.e. C:\Hadoop\bin. Make sure winutils.exe is located inside this path.
  • Create the HADOOP_HOME variable: C:\Hadoop
  • Add the following part to your path: ;%HADOOP_HOME%\bin
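
If you prefer the command line to the System Properties dialog, a minimal sketch using the built-in setx command can create the same variables (user scope; adjust the paths to match your versions):

    REM Sketch only: set the three variables for the current user.
    setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_181"
    setx SPARK_HOME "C:\spark-2.3.0-bin-hadoop2.7"
    setx HADOOP_HOME "C:\Hadoop"
    REM setx only affects NEW command prompts, so open a fresh cmd afterwards.
    REM The three \bin folders still need to be appended to Path as listed above.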

Now you can open cmd, run spark-shell, and it will work.
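
For example, a quick smoke test (the banner lines below are what Spark 2.x prints; your app id will differ):

    REM Run from a NEW command prompt so the variables are picked up.
    spark-shell
    REM Expected: the Spark banner, then lines such as
    REM   Spark context available as 'sc' (master = local[*], app id = ...).
    REM   Spark session available as 'spark'.
    REM At the scala> prompt, a one-liner like spark.range(10).count()
    REM should return res0: Long = 10.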