The information from the post Cannot start spark-shell did not help.
The following environment is set:
java -version
java version "9.0.1"
Java(TM) SE Runtime Environment (build 9.0.1+11)
Java HotSpot(TM) 64-Bit Server VM (build 9.0.1+11, mixed mode)
Oracle Java 8 and OpenJDK 8 produce the error too.
Scala code runner version 2.12.4 fails as well, and so does version 2.10.
The Spark binary is spark-2.2.1-bin-hadoop2.6 from apache.org; the Hadoop version is 2.6.
JAVA_HOME=/usr/lib/jvm/java-9-oracle
env | grep spark
SPARK_HOME=/usr/local/spark
env | grep scala
SCALA_HOME=/usr/local/scala
env | grep hadoop
HADOOP_HOME=/usr/local/hadoop
PATH=/usr/lib/jvm/java-9-oracle/bin:
/usr/lib/jvm/java-9-oracle/db/bin:
/usr/local/scala/bin:/usr/local/spark/bin:
/usr/local/scala/bin
SPARK_DIST_CLASSPATH="$HADOOP_HOME/etc/hadoop/*:
$HADOOP_HOME/share/hadoop/common/lib/*:
$HADOOP_HOME/share/hadoop/common/*:$HADOOP_HOME/share/hadoop/hdfs/*:
$HADOOP_HOME/share/hadoop/hdfs/lib/*:$HADOOP_HOME/share/hadoop/hdfs/*:
$HADOOP_HOME/share/hadoop/yarn/lib/*:$HADOOP_HOME/share/hadoop/yarn/*:
$HADOOP_HOME/share/hadoop/mapreduce/lib/*:$HADOOP_HOME/share/hadoop/mapreduce/*:
$HADOOP_HOME/share/hadoop/tools/lib/*"
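One thing worth ruling out before digging into the classpath: Spark 2.2.x only supports Java 8, not Java 9, and the JVM on PATH above is Java 9. A minimal sketch of a version check follows; the hard-coded `version` string stands in for the real output of `java -version` (which could be captured with e.g. `awk -F'"' '/version/ {print $2}'`).

```shell
# Sketch: check whether a given JVM version string is one Spark 2.2.x
# supports. Java 9 uses the new scheme ("9.0.1"); Java 8 and earlier
# use the old one ("1.8.0_151").
version="9.0.1"   # substitute the string reported by: java -version

# Extract the major version number.
major="${version%%.*}"
if [ "$major" = "1" ]; then
    # Old scheme (1.x.y): the real major version is the second field.
    major="${version#1.}"
    major="${major%%.*}"
fi

if [ "$major" -gt 8 ]; then
    echo "Java $version is not supported by Spark 2.2.x; use Java 8"
else
    echo "Java $version should work with Spark 2.2.x"
fi
# prints: Java 9.0.1 is not supported by Spark 2.2.x; use Java 8
```

With JAVA_HOME pointed at /usr/lib/jvm/java-9-oracle, spark-shell inherits Java 9 even if Java 8 is installed alongside it, so the check has to be run against whatever `java` resolves to on PATH.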
Run Spark:
spark-shell
Output:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/spark/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Setting default log level to "WARN".
Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.
Exception in thread "main" java.lang.NullPointerException
at scala.reflect.internal.SymbolTable.exitingPhase(SymbolTable.scala:256)
at scala.tools.nsc.interpreter.IMain$Request.x$20$lzycompute(IMain.scala:896)
at scala.tools.nsc.interpreter.IMain$Request.x$20(IMain.scala:895)
at scala.tools.nsc.interpreter.IMain$Request.headerPreamble$lzycompute(IMain.scala:895)
What is the reason?