3
votes

I am trying to run the spark-shell command from the command prompt on Windows 7. I have installed Hadoop (winutils) under C:\winutils\hadoop-common-2.2.0-bin-master\bin and Spark under C:\Spark\spark-2.2.1-bin-hadoop2.7\bin.

While executing spark-shell, I am getting the following error:

C:\Spark\spark-2.2.1-bin-hadoop2.7\bin>spark-shell
The system cannot find the path specified.

Below are my environment variables:

HADOOP_HOME C:\winutils

JAVA_HOME   C:\Program Files\IBM\Java80\jre

PATH        C:\Users\IBM_ADMIN\AppData\Local\Programs\Python\Python36-32;C:\IBM\InformationServer\Clients\Classic;C:\Program Files\IBM\Java80\jre;C:\Windows\system32

SCALA_HOME  C:\Program Files (x86)\scala\
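
For reference, the same variables can be inspected from the command prompt to confirm what the shell actually sees (a diagnostic sketch; the paths are the ones listed above):

```shell
:: Confirm each variable is visible to this cmd session
echo %HADOOP_HOME%
echo %JAVA_HOME%
echo %SCALA_HOME%
echo %PATH%
```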

Screenshot

Comments:

where did you install spark? I see on your path – Assaf Mendelson

C:\Spark\spark-2.2.1-bin-hadoop2.7\bin --> Apache Spark path – Pallavi

C:\Spark\spark-2.2.1-bin-hadoop2.7\bin>spark-shell The system cannot find the path specified. – Pallavi

you don't have "." in the path either. Try C:\Spark\spark-2.2.1-bin-hadoop2.7\bin\spark-shell – Assaf Mendelson

where to put "."? Under the bin folder I have spark-shell.cmd – Pallavi

3 Answers

3
votes

Your JAVA_HOME is set to the JRE; please make sure it points to your JDK folder instead (it should be located next to the JRE).
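
For example, assuming the JDK is installed at C:\Program Files\Java\jdk1.8.0_181 (adjust to your actual install location):

```shell
:: Point JAVA_HOME at the JDK root, not the JRE
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_181"

:: Verify in a NEW cmd window (setx does not affect the current session)
"%JAVA_HOME%\bin\java" -version
```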

3
votes

I had the same issue when trying to install Spark locally on Windows 7. Please make sure the paths below are correct, and it should work for you.

  1. Create JAVA_HOME variable: C:\Program Files\Java\jdk1.8.0_181
  2. Add the following part to your path: ;%JAVA_HOME%\bin
  3. Create SPARK_HOME variable: C:\spark-2.3.0-bin-hadoop2.7
  4. Add the following part to your path: ;%SPARK_HOME%\bin
  5. Most importantly, the Hadoop path should point to the bin folder that contains winutils.exe, i.e. C:\Hadoop\bin. Make sure winutils.exe is located inside this path.
  6. Create HADOOP_HOME Variable: C:\Hadoop
  7. Add the following part to your path: ;%HADOOP_HOME%\bin

Now you can open cmd, run spark-shell, and it will work.
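
The steps above can be sketched as a one-time setup from a cmd prompt; the version numbers and install folders are the ones used in this answer, so adjust them to your own:

```shell
:: Steps 1-2: JDK
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_181"

:: Steps 3-4: Spark
setx SPARK_HOME "C:\spark-2.3.0-bin-hadoop2.7"

:: Steps 5-7: Hadoop (winutils.exe must live in C:\Hadoop\bin)
setx HADOOP_HOME "C:\Hadoop"

:: Append the bin folders to the user PATH
:: Note: setx truncates values longer than 1024 characters; for a long PATH,
:: use the Environment Variables dialog instead
setx PATH "%PATH%;%JAVA_HOME%\bin;%SPARK_HOME%\bin;%HADOOP_HOME%\bin"
```

Remember that setx only affects newly opened cmd windows, not the one you ran it in.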

1
votes

I had the same issue when using Apache Spark on Windows 10 Pro.

NB:

  1. Uninstall any Java JDK newer than 8 (keep jdk1.8.0_181); JDK 11-16 caused the problem.

  2. Verify the downloaded Spark archive with 'certutil -hashfile C:\Users\username\Downloads\spark-2.4.5-bin-hadoop2.7.tgz SHA512'. Remember to replace 'username' with your own, for instance: certutil -hashfile C:\Users\datamind\Downloads\spark-2.4.5-bin-hadoop2.7.tgz SHA512

  3. Search for 'Edit Environment Variables'.

  4. Create the JAVA_HOME variable: C:\Program Files\Java\jdk1.8.0_181

  5. Under 'User variables', add %JAVA_HOME%\bin to Path.

  6. Repeat steps 4 and 5 for HADOOP_HOME and SPARK_HOME.

Kindly follow this link and do everything step by step. https://phoenixnap.com/kb/install-spark-on-windows-10
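
Once the variables are in place, a quick sanity check from a fresh cmd window (a sketch; the exact version output will vary with your install):

```shell
:: None of these should print "The system cannot find the path specified."
echo %JAVA_HOME%
echo %HADOOP_HOME%
"%JAVA_HOME%\bin\java" -version
where winutils.exe
spark-shell --version
```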