2 votes

The information from the post Cannot start spark-shell does not help.

The following environment is set:

java -version
java version "9.0.1"
Java(TM) SE Runtime Environment (build 9.0.1+11)
Java HotSpot(TM) 64-Bit Server VM (build 9.0.1+11, mixed mode)

Oracle Java 8 and OpenJDK 8 produce the same error.

Scala code runner version 2.12.4 and version 2.10 give the error as well.

The Spark binary is spark-2.2.1-bin-hadoop2.6 from apache.org. The Hadoop version is 2.6.

JAVA_HOME=/usr/lib/jvm/java-9-oracle
env | grep spark
SPARK_HOME=/usr/local/spark
env | grep scala
SCALA_HOME=/usr/local/scala
env | grep hadoop
HADOOP_HOME=/usr/local/hadoop

PATH=/usr/lib/jvm/java-9-oracle/bin:
/usr/lib/jvm/java-9-oracle/db/bin:
/usr/local/scala/bin:/usr/local/spark/bin:
/usr/local/scala/bin

SPARK_DIST_CLASSPATH="$HADOOP_HOME/etc/hadoop/*:
$HADOOP_HOME/share/hadoop/common/lib/*:
$HADOOP_HOME/share/hadoop/common/*:$HADOOP_HOME/share/hadoop/hdfs/*:
$HADOOP_HOME/share/hadoop/hdfs/lib/*:$HADOOP_HOME/share/hadoop/hdfs/*:
$HADOOP_HOME/share/hadoop/yarn/lib/*:$HADOOP_HOME/share/hadoop/yarn/*:
$HADOOP_HOME/share/hadoop/mapreduce/lib/*:$HADOOP_HOME/share/hadoop/mapreduce/*:
$HADOOP_HOME/share/hadoop/tools/lib/*"    

Run spark-shell:

spark-shell

Output:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/spark/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Setting default log level to "WARN".

Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.

Exception in thread "main" java.lang.NullPointerException
at scala.reflect.internal.SymbolTable.exitingPhase(SymbolTable.scala:256)
at scala.tools.nsc.interpreter.IMain$Request.x$20$lzycompute(IMain.scala:896)
at scala.tools.nsc.interpreter.IMain$Request.x$20(IMain.scala:895)
at scala.tools.nsc.interpreter.IMain$Request.headerPreamble$lzycompute(IMain.scala:895)

What is the reason?

Please fix the code blocks and make it clear which command you're executing and what the output looks like. - Gildas

2 Answers

3 votes

Spark 2.2.x cannot run with Java 9 yet. Change your configuration to use Java 8 instead.

Set:

export JAVA_HOME=/usr/lib/jvm/java-8-oracle
export PATH="$JAVA_HOME/bin:$PATH"

Make sure Java 8 is the default version:

sudo update-alternatives --config java
sudo update-alternatives --config javac
sudo update-alternatives --config javah
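
After switching, verify which runtime is now on the PATH; it should report a 1.8.x version, not 9.0.1:

java -version   # should print java version "1.8.0_...", not "9.0.1"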

If all else fails, uninstall Java 9:

sudo apt-get purge oracle-java9-installer
sudo add-apt-repository --remove ppa:webupd8team/java

I hope that helps.

0 votes

I have both Java 9 and Java 8 installed. Simply setting JAVA_HOME works for me.

I'm using (Spark 2.2.1, Scala 2.11.8):

env JAVA_HOME=/usr/lib/jvm/java-8-openjdk/jre/ ./spark-shell

or, in bash:

JAVA_HOME=/usr/lib/jvm/java-8-openjdk/jre/ ./spark-shell
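
If you do not want to type the prefix every time, a small shell alias works too. A minimal sketch, assuming the same OpenJDK 8 path and SPARK_HOME=/usr/local/spark from the question (the alias name is just an example):

# in ~/.bashrc: launch spark-shell with Java 8 without changing the global default
alias spark-shell8='JAVA_HOME=/usr/lib/jvm/java-8-openjdk/jre/ /usr/local/spark/bin/spark-shell'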