6
votes

I am trying to install Spark on Windows 7 and am getting an error.

These are my environment settings:

SPARK_HOME : C:\spark (this is where I have unzipped the spark files)

JAVA_HOME : C:\Program Files\Java\jdk1.7.0_71;

SCALA_HOME: C:\Program Files (x86)\scala

PATH : C:\Program Files\Java\jdk1.7.0_71\bin;C:\app\Sampad\product\11.2.0\dbhome_1\bin;C:\Python27;C:\Program Files\Python27\;C:\Program Files\Python27\Scripts;C:\Program Files (x86)\Intel\iCLS Client\;C:\Program Files\Intel\iCLS Client\;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;%SystemRoot%\system32;%SystemRoot%;%SystemRoot%\System32\Wbem;%SYSTEMROOT%\System32\WindowsPowerShell\v1.0\;C:\Program Files\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files\Java\jdk1.6.0_45\bin;C:\Program Files\nodejs\;C:\Program Files\Python27;C:\Anaconda;C:\Anaconda\Scripts;C:\HashiCorp\Vagrant\bin;C:\Program Files (x86)\scala\bin;C:\spark\bin;

HADOOP_HOME : C:\winutils; (set this after reading a blog post)
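
For reference, the variables can be checked from a fresh cmd window with plain diagnostic commands (nothing Spark-specific):

echo %JAVA_HOME%
echo %SPARK_HOME%
echo %HADOOP_HOME%
where java
where spark-shell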

Please let me know what mistake I have made. Thanks in advance.

Could someone please let me know what mistake I have made? I have already searched many sites and tried everything, but it is not working. – user1548787
You did not state what error you are getting. What happens when you type 'spark-shell' at a command line? – aquagremlin
Please check this answer: stackoverflow.com/a/52831841/2516356 – Moustafa Mahmoud
I ran into the same issue. In general, go through your PATH and verify that none of the SPARK, SCALA and JAVA paths are set incorrectly; while following tutorials to set up the environment it's easy to add an extra "\" or "\bin" by mistake. – Pio

10 Answers

3
votes

I installed Java and Spark in folders without spaces, and I am using Windows 10.

In my case I had added

JAVA_HOME=C:\Java\jdk1.8.0_191\bin

The launch scripts append \bin to JAVA_HOME themselves, so Spark was searching for executables in "C:\Java\jdk1.8.0_191\bin\bin", which does not exist.

Make sure to set the variable as

JAVA_HOME = C:\Java\jdk1.8.0_191

and in the Path environment variable add

%JAVA_HOME%\bin

(do the same for SPARK_HOME)

It is working for me now !!!
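
A rough sketch of the difference, assuming (as the behaviour above suggests) that the Spark launch scripts build the Java path as %JAVA_HOME%\bin\java.exe:

rem wrong: JAVA_HOME already ends in \bin, so the launcher looks for
rem   C:\Java\jdk1.8.0_191\bin\bin\java.exe   (does not exist)
rem right: JAVA_HOME points at the JDK root, so the launcher finds
rem   C:\Java\jdk1.8.0_191\bin\java.exe
dir "%JAVA_HOME%\bin\java.exe"

The last line is just a quick existence check you can run in cmd after fixing the variable.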

2
votes

Try modifying the first line of the spark-shell2.cmd file from

@echo off

to

rem @echo off

For me it showed that it was trying to load a file from c:\spark\bin\bin on the following line:

"%SPARK_HOME%\bin\spark-submit2.cmd" --class org.apache.spark.repl.Main --name "Spark shell" %*

In my environment %SPARK_HOME% was set to c:\spark\bin, which is where the scripts actually live, so the line above resolved to c:\spark\bin\bin\spark-submit2.cmd, which does not exist.

So I set the %SPARK_HOME% to c:\Spark and added %SPARK_HOME%\bin to my PATH.
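
If you would rather not edit the script, a rough equivalent check from a cmd window (using the paths from this answer) is:

echo %SPARK_HOME%
rem if this prints c:\spark\bin, the launcher line above expands to
rem   c:\spark\bin\bin\spark-submit2.cmd   (which does not exist)
dir "%SPARK_HOME%\bin\spark-submit2.cmd"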

1
votes

Probably it happens because of different Java versions:

JAVA_HOME : C:\Program Files\Java\jdk1.7.0_71;

C:\Program Files\Java\jdk1.6.0_45\bin

Instead of "C:\Program Files\Java\jdk1.6.0_45\bin" use "%JAVA_HOME%\bin"

Spark: Trying to run spark-shell, but get 'cmd' is not recognized as an internal or
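
A quick way to see which Java actually wins when several JDK entries are on PATH (standard cmd commands, nothing Spark-specific):

where java
rem the first path printed is the one cmd actually runs; it should sit under %JAVA_HOME%\bin
java -version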

1
votes

You need to set up JAVA_HOME and Path. The first thing you have to do is go to

Edit the system environment variables -> Environment Variables -> under the User variables for <user>

click New and add:

1. JAVA_HOME = C:\Program Files\Java\jdk1.8.0_191
In Path, add %JAVA_HOME%\bin

2. SPARK_HOME = C:\spark   (spark is the folder where I have installed Spark)
In Path, add %SPARK_HOME%\bin

This will fix your problem
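
If you prefer the command line, a minimal sketch of the same setup using setx (setx writes user variables and only affects new cmd windows; Path itself is safer to edit in the dialog because setx can truncate long values):

setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_191"
setx SPARK_HOME "C:\spark"
rem then add %JAVA_HOME%\bin and %SPARK_HOME%\bin to Path in the Environment Variables dialog
rem and open a new cmd window so the changes are picked up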

1
votes

For me, everything specified above was correct, yet it was still not working.

The reason I found is that all the environment variables were added in the "System variables" section; when I added them in the "User variables" section instead, it started working.

I added JAVA_HOME, SPARK_HOME, HADOOP_HOME and PATH as user variables.
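
One way to check where a variable is actually defined (user vs. system), assuming the default registry locations Windows uses for environment variables:

reg query "HKCU\Environment" /v JAVA_HOME
reg query "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Environment" /v JAVA_HOME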

0
votes

I had a similar issue. I reinstalled Java (a newer version) and corrected JAVA_HOME. This resolved the issue for me.

0
votes

I had the same issue on Windows 10. Make sure that only the JAVA_HOME value is an absolute path; everything else should be expressed relative to %JAVA_HOME% (e.g. the Path entry should be %JAVA_HOME%\bin).

0
votes

One of the reasons is that either JAVA_HOME or SPARK_HOME has a space in its path. In this case:

"SCALA_HOME: C:\Program Files (x86)\scala"

Here "Program Files (x86)" has a space in its path. Try to move the files to a location where there is no space anywhere in the full path. In my case I had a space in the JAVA_HOME path.
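
If moving the installation is not an option, a workaround that is often suggested (assuming 8.3 short names are enabled on the drive) is to use the short name of the folder that contains the space:

dir /x "C:\"
rem "Program Files" usually shows up as PROGRA~1, so for example
rem JAVA_HOME = C:\PROGRA~1\Java\jdk1.7.0_71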

0
votes

I had the same problem, but I solved it by correctly setting up the JAVA_HOME environment variable. Basically, you need Java 8 to run Spark, so install Java 8 and set up the environment variable just like you set up SPARK_HOME, for example variable name: JAVA_HOME, variable value: C:\JAVA. Then add the Java path to Path as %JAVA_HOME%\bin.
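
A quick check that the Java picked up from PATH is actually version 8 (exact output varies by build):

java -version
rem expect something like: java version "1.8.0_191"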

-1
votes

I had the same issue; the solution was to restart the kernel and work in one notebook at a time.