
I am a newbie to Spark/Scala ... I have set up Spark, Scala, and sbt on a fully distributed cluster. When I test by issuing the command pyspark, I get the following error:

    /home/hadoop/spark/bin/spark-class: line 75: /usr/lib/jvm/java-7-openjdk-amd64/jre/bin/java: No such file or directory

My .bashrc contains:

    export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

hadoop-env.sh contains:

    export JAVA_HOME=/usr/lib/jvm/java7-openjdk-amd64/jre/

conf/spark-env.sh contains:

    JAVA_HOME=usr/lib/jvm/java7-openjdk-amd64/jre
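
To check whether the configured locations actually exist on disk, one can run the following (the paths are copied from the settings above; readlink shows where java really lives):

    # Verify each configured location actually contains a java binary
    ls -l /usr/lib/jvm/java-7-openjdk-amd64/bin/java
    ls -l /usr/lib/jvm/java7-openjdk-amd64/jre/bin/java

    # Show where the system's java actually resolves to
    readlink -f "$(command -v java)"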
spark-class contains:

    if [ -n "${JAVA_HOME}" ]; then
      RUNNER="${JAVA_HOME}/bin/java"
    else
      if [ "$(command -v java)" ]; then
        RUNNER="java"
      fi
    fi
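
Since spark-class uses ${JAVA_HOME}/bin/java whenever JAVA_HOME is set, a stale or mistyped JAVA_HOME produces exactly the error above. A minimal sanity check, assuming JAVA_HOME is exported in the shell that launches pyspark:

    echo "$JAVA_HOME"
    ls -l "$JAVA_HOME/bin/java"      # should list the binary, not fail
    "$JAVA_HOME/bin/java" -version   # should print the JVM version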

Can someone tell me what I need to change to get the right path to Java?

Did you get Spark working on a single node without clustering? – J'e
No, this is a first-time setup on a fully distributed cluster. – Derez

1 Answer


I found the issue: I made all the JAVA_HOME paths match, from .bashrc to the files under etc to spark.conf.
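
In other words, every file should point at the same, existing Java directory. A sketch of what a matched set of settings might look like, assuming the JDK lives at /usr/lib/jvm/java-7-openjdk-amd64 as in the question (adjust to wherever readlink -f "$(command -v java)" points on your machines):

    # ~/.bashrc
    export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

    # hadoop-env.sh
    export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

    # conf/spark-env.sh
    export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64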