I am new to PySpark. I installed it on my Windows machine by downloading Apache Spark from the Spark download URL.
I set HADOOP_HOME and SPARK_HOME as environment variables:
my SPARK_HOME=C:\spark\spark-2.4.4-bin-hadoop2.7
my HADOOP_HOME=C:\spark\spark-2.4.4-bin-hadoop2.7
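To rule out a typo in the variables, I wrote a small check (check_spark_env is just a helper I made up for debugging, not part of Spark; it only verifies that the variables are set and point at real directories):

```python
import os

def check_spark_env(env):
    """Return a list of problems with the Spark-related variables in env."""
    problems = []
    for var in ("SPARK_HOME", "HADOOP_HOME"):
        path = env.get(var)
        if path is None:
            problems.append(f"{var} is not set")
        elif not os.path.isdir(path):
            problems.append(f"{var} points to a missing directory: {path}")
    return problems

if __name__ == "__main__":
    # Check the variables of the current shell session.
    for problem in check_spark_env(os.environ):
        print(problem)
```

Running this in the same command prompt reported no problems, so both variables seem to be set and point at existing directories.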
But when I run pyspark at the command prompt, I get:
The system cannot find the path specified.
Even if I cd into the bin directory and run pyspark directly, it throws the same error. Not sure what I missed here. Please help.