0 votes

I have downloaded: spark-2.2.0-bin-hadoop2.7

In my ~/.bash_profile I have the following:

export PATH="/Users/spandan.chakraborty/anaconda/bin:$PATH"
export SPARK_PATH=~/spark-2.2.0-bin-hadoop2.7
export PYSPARK_DRIVER_PYTHON='jupyter'
export PYSPARK_DRIVER_PYTHON_OPTS='notebook'
alias snote='$SPARK_PATH/bin/pyspark --master local[2]'
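
The alias only becomes available after the profile is reloaded (or in a fresh terminal):

source ~/.bash_profile
snote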

But whenever I try to launch a Jupyter notebook with the alias snote, I get the following error:

Error in pyspark startup:

IPYTHON and IPYTHON_OPTS are removed in Spark 2.0+. Remove these from the environment and set PYSPARK_DRIVER_PYTHON and PYSPARK_DRIVER_PYTHON_OPTS instead.

I have referred to the similar issue posted in Starting Ipython with Spark 2, but to no avail.


2 Answers

1 vote

This worked for me (I am running on my Mac):

PYSPARK_DRIVER_PYTHON=ipython PYSPARK_DRIVER_PYTHON_OPTS="notebook" /Users/abhishekdutta/Downloads/Spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark
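
If ipython warns that "ipython notebook" is deprecated, the same invocation with the jupyter entry point should behave identically (an untested variant, assuming jupyter is on your PATH):

PYSPARK_DRIVER_PYTHON=jupyter PYSPARK_DRIVER_PYTHON_OPTS="notebook" /Users/abhishekdutta/Downloads/Spark/spark-2.2.0-bin-hadoop2.7/bin/pyspark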

0 votes

The following has solved the issue:

export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS='notebook'
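
Since the error message explicitly says to remove IPYTHON and IPYTHON_OPTS from the environment, it may also be worth confirming they are not still set in the current shell before re-running the alias (a precaution suggested by the error text, not a step I needed myself):

unset IPYTHON IPYTHON_OPTS   # clear the legacy Spark 1.x variables, if present
source ~/.bash_profile       # reload the new PYSPARK_DRIVER_PYTHON settings
snote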