I am very new to Spark and need your help. I want to run Spark 2.0.2 (Hadoop 2.7) on Windows 8. I have defined these system variables and values:

Java_Home    C:\Progra~1\Spark Ecosystem\JDK\jdk1.8.0_111
Hadoop_Home  C:\Program Files\Spark ecosystem\winutils
Spark_Home   C:\Program Files\Spark ecosystem\Spark\bin
Path         %Java_Home%\bin;%Hadoop_Home%;%Spark_Home%;
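
In case it matters, I set the first three with setx from an elevated cmd prompt, roughly like this (/M writes machine-wide variables):

setx Java_Home "C:\Progra~1\Spark Ecosystem\JDK\jdk1.8.0_111" /M
setx Hadoop_Home "C:\Program Files\Spark ecosystem\winutils" /M
setx Spark_Home "C:\Program Files\Spark ecosystem\Spark\bin" /M

As far as I understand, setx only takes effect in newly opened cmd windows.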

I have installed Eclipse and its exe runs fine. java -version works OK, but the spark-shell command is not recognized...

1 Answer

You must not override the path. Just append to it.

cmd is the Windows "shell", located in %WINDIR%\system32. You need to leave that directory in PATH (it's as if you removed /bin from the PATH on Linux: not much would work after that).
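
A quick sanity check, since where.exe ships with Windows and searches PATH:

where cmd

If PATH is intact, this prints C:\Windows\System32\cmd.exe; with System32 stripped from PATH, even where itself stops resolving.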

For instance, do this (append the original value of PATH in your startup script):

set PATH=%Java_Home%\bin;%Hadoop_Home%;%Spark_Home%;%PATH%

A minimalist version:

set PATH=%WINDIR%\system32;%Java_Home%\bin;%Hadoop_Home%;%Spark_Home%
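
Either way, open a new cmd window and check that the launcher now resolves (assuming, as in your setup, that Spark_Home points at Spark's bin directory, which contains spark-shell.cmd):

where spark-shell
spark-shell

The first command should list C:\Program Files\Spark ecosystem\Spark\bin\spark-shell.cmd; the second should start the Spark REPL.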