I am new to Spark and Python. I have installed Python 3.5.1 and spark-1.6.0-bin-hadoop2.4 on Windows.
I get the error below when I execute sc = SparkContext("local", "Simple App") from the Python shell.
Can you please help?
from pyspark import SparkConf, SparkContext
sc = SparkContext("local", "Simple App")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
    sc = SparkContext("local", "Simple App")
  File "C:\spark-1.6.0-bin-hadoop2.4\python\pyspark\context.py", line 112, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "C:\spark-1.6.0-bin-hadoop2.4\python\pyspark\context.py", line 245, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "C:\spark-1.6.0-bin-hadoop2.4\python\pyspark\java_gateway.py", line 79, in launch_gateway
    proc = Popen(command, stdin=PIPE, env=env)
  File "C:\Python35-32\lib\subprocess.py", line 950, in __init__
    restore_signals, start_new_session)
  File "C:\Python35-32\lib\subprocess.py", line 1220, in _execute_child
    startupinfo)
FileNotFoundError: [WinError 2] The system cannot find the file specified
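As far as I can tell, the Popen call in java_gateway.py is trying to launch the spark-submit script from SPARK_HOME, so I assume the error means that script cannot be found. Here is a quick check I ran before creating the context (the path below is just where I extracted Spark on my machine, so it may need adjusting):

import os

# Folder where I extracted Spark (assumption: launch_gateway needs SPARK_HOME to point here)
spark_home = os.environ.get("SPARK_HOME", r"C:\spark-1.6.0-bin-hadoop2.4")

# On Windows, launch_gateway appears to run bin\spark-submit.cmd under SPARK_HOME
spark_submit = os.path.join(spark_home, "bin", "spark-submit.cmd")

print("SPARK_HOME set to:", os.environ.get("SPARK_HOME"))
print("spark-submit.cmd exists:", os.path.exists(spark_submit))

# Set SPARK_HOME for this session in case it was missing, then retry SparkContext
os.environ["SPARK_HOME"] = spark_home

Both prints come back as expected for me, so I am not sure what else the Popen call is failing to find.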