
I've installed spark-2.3.0-bin-hadoop2.7 on Ubuntu, and I think there is a problem with the Java path. When I run "spark-submit --version", "spark-shell", or "pyspark", I get the following error:

/usr/local/spark-2.3.0-bin-hadoop2.7/bin/spark-class: line 71: /usr/lib/jvm/java-8-openjdk-amd-64/jre/bin/java: No such file or directory

It seems "/bin/java" is the problematic part, but I'm not sure where to change the configuration. The spark-class file contains the following lines:

if [ -n "${JAVA_HOME}" ]; then
  RUNNER="${JAVA_HOME}/bin/java"

Trying to read /etc/environment gives:

bash: /etc/environment: Permission denied

What I currently have in ~/.bashrc (opened with gedit) is:

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd-64/jre

export PATH=$PATH:JAVA_HOME/bin

This is the current java setup that I have:

root@ubuntu:~# update-alternatives --config java
There is only one alternative in link group java (providing /usr/bin/java):
/usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java
Nothing to configure.

bashrc has the following:

export PATH=$PATH:/usr/share/scala-2.11.8/bin

export SPARK_HOME=/usr/local/spark-2.3.0-bin-hadoop2.7

export PATH=$PATH:$SPARK_HOME/bin

Please suggest:

  1. which files I need to change, and
  2. how I need to change them.
Did you set JAVA_HOME in your $SPARK_HOME/conf/spark-env.sh file? - Steven Black

1 Answer


Java Home

Your JAVA_HOME should point to the root of the JDK installation, not its jre subdirectory. Note also that the directory name is amd64, not amd-64, as your own update-alternatives output shows. So

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd-64/jre

should be

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
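As a quick sanity check, you can derive the correct JAVA_HOME from whichever java binary is actually on your PATH (a sketch; the path in the comments is the one from your update-alternatives output, adjust if yours differs):

```shell
# Resolve the real java binary behind any symlinks.
java_bin=$(readlink -f "$(command -v java)")
echo "$java_bin"    # e.g. /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java

# Strip the trailing /jre/bin/java to get the JDK root for JAVA_HOME.
java_home=${java_bin%/jre/bin/java}
echo "$java_home"   # e.g. /usr/lib/jvm/java-8-openjdk-amd64
```

If the second echo prints a path that does not exist on your machine, that mismatch is exactly the "No such file or directory" error spark-class reports.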

Here is the Oracle doc on JAVA_HOME (which should apply to Open JDK as well) https://docs.oracle.com/cd/E19182-01/820-7851/inst_cli_jdk_javahome_t/

Spark Environmental Variables

JAVA_HOME should also be set in $SPARK_HOME/conf/spark-env.sh; see the environment-variables section of the Spark configuration docs: https://spark.apache.org/docs/latest/configuration.html#environment-variables
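One way to do that (a sketch, assuming the default layout of the Spark distribution at the install path from your question; spark-env.sh does not exist until you create it from the shipped template):

```shell
# spark-env.sh is not created by default; copy the template that ships with Spark.
cd /usr/local/spark-2.3.0-bin-hadoop2.7/conf
cp spark-env.sh.template spark-env.sh

# Append the corrected JAVA_HOME (note: amd64, not amd-64).
echo 'export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64' >> spark-env.sh
```

After that, re-run spark-shell; spark-env.sh is sourced by the launch scripts, so the setting takes effect without editing spark-class itself.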

😊