1 vote

I'm trying to get Spark working on Windows 10. When I try to run spark-shell I get this error:

'Spark\spark-2.0.0-bin-hadoop2.7\bin..\jars""\ is not recognized as an internal or external command,operable program or batch file.

Failed to find Spark jars directory. You need to build Spark before running this program.

I am using a pre-built Spark for Hadoop 2.7 and later. I have installed Java 8, Eclipse Neon, Python 2.7, and Scala 2.11, and I have the winutils for Hadoop 2.7.1, and I still get this error.

When I downloaded Spark it came as a .tgz; when extracted, there is another archive inside, so I extracted that as well, and then I got all the bin folders and so on. I need to access spark-shell. Can anyone help?

EDIT: The solution I ended up using:

1) VirtualBox

2) Linux Mint

Could you post the complete error details so we get to know the issue better? – Bhavesh

3 Answers

3 votes

I got the same error while building Spark. You can fix it by moving the extracted folder to C:\ so that the path contains no spaces.

Refer to this: http://techgobi.blogspot.in/2016/08/configure-spark-on-windows-some-error.html
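
A rough sketch from a Command Prompt (the source folder below is only an example; use wherever you actually extracted the download):

    :: Move the extracted Spark folder to the root of C:\ so its path has no spaces
    move "C:\Users\<you>\Downloads\spark-2.0.0-bin-hadoop2.7" C:\
    :: Run the shell from the new location
    cd /d C:\spark-2.0.0-bin-hadoop2.7\bin
    spark-shell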

1 vote

You are probably giving the wrong path to the Spark bin folder.

Just open a Command Prompt and change directory to the bin folder inside the Spark folder.

Type spark-shell to check.
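
For example, assuming Spark was extracted to C:\spark-2.0.0-bin-hadoop2.7 (substitute wherever you actually extracted it):

    :: Change into the bin directory of the extracted Spark folder (path is an example)
    cd /d C:\spark-2.0.0-bin-hadoop2.7\bin
    :: Launch the shell; a scala> prompt means the jars directory was found
    spark-shell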

Refer: Spark on win 10

0 votes

"On Windows, I found that if it is installed in a directory that has a space in the path (C:\Program Files\Spark) the installation will fail. Move it to the root or another directory with no spaces." OR If you have installed Spark under “C:\Program Files (x86)..” replace 'Program Files (x86)' with Progra~2 in the PATH env variable and SPARK_HOME user variable.