9
votes

I've installed Spark 2.1.1 on Ubuntu and no matter what I do, it doesn't seem to agree with the java path. When I run "spark-submit --version" or "spark-shell" I get the following error:

/usr/local/spark/bin/spark-class: line 71: /usr/lib/jvm/java-8-openjdk-amd64/jre/bin//bin/java: No such file or directory

Now obviously the "/bin//bin/java" is problematic, but I'm not sure where to change the configuration. The spark-class file has the following lines:

if [ -n "${JAVA_HOME}" ]; then
  RUNNER="${JAVA_HOME}/bin/java"

I was originally using a build of Spark meant for Hadoop 2.4, and when I changed that line to "RUNNER="${JAVA_HOME}"" it would give me either the error "[path] is a directory" or "[path] is not a directory". This was after also trying multiple path permutations in /etc/environment.
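The spark-class snippet above is the key to the error: it appends "/bin/java" to whatever JAVA_HOME holds, so a JAVA_HOME that already ends in ".../bin/" yields the doubled path from the error message. A minimal sketch of the concatenation, using the paths from the question:

```shell
# spark-class builds the runner as "${JAVA_HOME}/bin/java".
# If JAVA_HOME already ends in .../bin/, the result is the broken
# ".../bin//bin/java" path seen in the error.
JAVA_HOME="/usr/lib/jvm/java-8-openjdk-amd64/jre/bin/"
echo "${JAVA_HOME}/bin/java"
# -> /usr/lib/jvm/java-8-openjdk-amd64/jre/bin//bin/java
```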

What I now have in /etc/environment is:

JAVA_HOME="/usr/lib/jvm/java-8-openjdk-amd64/jre/bin/"

This is the current java setup that I have:

root@ubuntu:~# update-alternatives --config java
There is only one alternative in link group java (providing /usr/bin/java): /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java

bashrc has the following:

export SPARK_HOME="/usr/local/spark"
export PATH="$PATH:$SPARK_HOME/bin"

Can anyone advise: 1) What files I need to change and 2) how I need to change them? Thanks in advance.

spark-class file is in the link, just in case:

http://vaughn-s.net/hadoop/spark-class


2 Answers

17
votes

In the /etc/environment file replace

JAVA_HOME="/usr/lib/jvm/java-8-openjdk-amd64/jre/bin/"

with

JAVA_HOME="/usr/lib/jvm/java-8-openjdk-amd64/jre/"

then execute

source /etc/environment 

Also, RUNNER="${JAVA_HOME}/bin/java" should be kept as it is, since spark-class appends "/bin/java" itself.
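A quick way to check the fix without launching Spark is to rebuild the path exactly the way spark-class does and confirm it no longer doubles "bin". A sketch, using the corrected path from this answer:

```shell
# With JAVA_HOME pointing at the JRE root (not its bin/ directory),
# the runner path spark-class constructs resolves cleanly.
JAVA_HOME="/usr/lib/jvm/java-8-openjdk-amd64/jre"
RUNNER="${JAVA_HOME}/bin/java"
echo "$RUNNER"
# -> /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java
```

If that file exists on your machine, "$RUNNER" -version should print the installed Java version and spark-shell should start.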

1
vote

Windows Environment:

Open Advanced system settings -> Environment Variables to set the JAVA_HOME path. The most common mistake is pointing it at the java folder:

JAVA_HOME: Directory-Name:\java

rather than at the JDK folder:

JAVA_HOME: Directory-Name:\jdk

This is how it worked for me.