I just upgraded to Spark 2.0 from 1.4 and downloaded the ec2 directory from github.com/amplab/spark-ec2/tree/branch-2.0
To spin up some clusters I go to my ec2 directory and run these commands:
./spark-ec2 -k <keypair> -i <key-file> -s <num-slaves> launch <cluster-name>
./spark-ec2 -k <keypair> -i <key-file> login <cluster-name>
The cluster comes up and I'm logged into the master, but I don't know how to launch a PySpark notebook. With Spark 1.4 I would run the command
IPYTHON_OPTS="notebook --ip=0.0.0.0" /root/spark/bin/pyspark --executor-memory 4G --driver-memory 4G &
and my notebook would be up and running fine, but with Spark 2.0 this no longer works and I can't find the bin/pyspark script. Can anyone help with this?
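For reference, this is what I have been trying based on the Spark 2.0 release notes, which say IPYTHON and IPYTHON_OPTS were removed in favor of the PYSPARK_DRIVER_PYTHON and PYSPARK_DRIVER_PYTHON_OPTS environment variables (I'm assuming jupyter is installed on the master and that spark still lives under /root/spark, since that's where the 1.4 AMI put it):

```shell
# Spark 2.0 replacement for IPYTHON_OPTS: point the pyspark driver
# at the jupyter executable and pass the notebook options through.
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS="notebook --ip=0.0.0.0"
/root/spark/bin/pyspark --executor-memory 4G --driver-memory 4G &
```

I'm not certain this is the intended approach, and in any case I can't verify it because the pyspark script itself doesn't seem to be where it used to be.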