0 votes

I'm trying to install Spark and its associated programs on a Mac, but I'm receiving an error message when testing the installation:

/Users/somedirectory/apachespark/spark-2.3.0-bin-hadoop2.7/bin/pyspark /Users/somedirectory/apachespark/spark-2.3.0-bin-hadoop2.7/bin/spark-class: line 71: /Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home//bin/java: No such file or directory

From my .bash_profile entries:

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home/

export SPARK_HOME=/Users/directory/apachespark/spark-2.3.0-bin-hadoop2.7

export SBT_HOME=/Users/directory/apachespark/sbt

export SCALA_HOME=/Users/directory/apachespark/scala-2.11.12

export PATH=$JAVA_HOME/bin:$SBT_HOME/bin:$SBT_HOME/lib:$SCALA_HOME/bin:$SCALA_HOME/lib:$PATH

export PATH=$JAVA_HOME/bin:$SPARK_HOME:$SPARK_HOME/bin:$SPARK_HOME/sbin:$PATH

export PYSPARK_PYTHON=python3

PATH="/Library/Frameworks/Python.framework/Versions/3.6/bin:${PATH}" export PATH

Any suggestions for correcting this? Thanks.

Does /Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home//bin/java exist? According to the error message it does not. Should JAVA_HOME maybe be /Library/Java/JavaVirtualMachines/jdk1.8.0_162/jdk/Contents/Home//bin/java instead? – puhlen
You can avoid all this by just doing brew install apache-spark. – Constantine
I tried the brew method and after restarting everything works. Thanks. – Sean
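
For reference, a quick way to try both suggestions from the comments above; this is only a sketch, and the Homebrew route assumes Homebrew is already installed:

# Check whether the java binary named in the error message actually exists
ls -l /Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home/bin/java

# List the JDKs macOS knows about, to see what JAVA_HOME should point to
/usr/libexec/java_home -V

# Alternative from the comments: let Homebrew install and manage Spark
brew install apache-spark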

1 Answer

1 vote

As shown in the reported error message:

/Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home//bin/java: No such file or directory

the path to the Java executable, $JAVA_HOME/bin/java, picks up an extra / because of the trailing / in your JAVA_HOME:

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home/
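
As a quick check (just a sketch using the value from your .bash_profile), echoing the concatenated path shows the doubled slash:

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_162.jdk/Contents/Home/

# spark-class builds the runner as $JAVA_HOME/bin/java, producing the // seen in the error
echo "$JAVA_HOME/bin/java"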

Removing the trailing / in JAVA_HOME should fix the problem. Better yet, setting JAVA_HOME as shown below automatically points to the active JDK version on macOS:

export JAVA_HOME=$(/usr/libexec/java_home)
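
To confirm the change took effect, a quick check after reloading the profile (assuming the exports live in ~/.bash_profile) might look like this:

# Reload the profile in the current shell, then verify the resolved JDK path
source ~/.bash_profile

echo "$JAVA_HOME"

"$JAVA_HOME/bin/java" -version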