I have installed Java (java-11-openjdk-amd64, the auto alternative at /usr/lib/jvm/java-11-openjdk-amd64/bin/java), Scala 2.11.12, and Spark 2.2.0 with Hadoop 2.7 on my desktop, running in a Linux Mint 19.2 VM on Windows 10. When I open spark-shell I get this error:
Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
I also defined the variables in the .bashrc file in my home directory as follows:
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export SCALA_HOME=/usr/local/src/scala/scala-2.11.12
export SPARK_HOME=/usr/lib/spark/spark-2.2.0-bin-hadoop2.7
export PATH=$SCALA_HOME/bin:$JAVA_HOME/bin:$SPARK_HOME/bin:$PATH
How can I solve this? Do I have to switch to Java 8 to run Spark? I tried switching to java-8-openjdk-amd64 with update-alternatives --config java, but I cannot change the selected Java because that gives me another error: permission denied.
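For reference, this is roughly what I am attempting. From what I have read, the permission-denied error means the command needs root, so it would presumably have to be prefixed with sudo (the sudo line is shown as a comment because I have not gotten it to run yet; the Java 8 path is the standard Mint/Ubuntu location, which I am assuming):

```shell
# What I ran (fails with "permission denied" for a normal user):
#   update-alternatives --config java
# What I understand should work instead (needs root):
#   sudo update-alternatives --config java
# Assumed standard location of the Java 8 binary on Mint/Ubuntu:
JAVA8=/usr/lib/jvm/java-8-openjdk-amd64/bin/java
echo "$JAVA8"
```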
Alternatively, how can I move my Java 8 installation to another folder from the command line, since I cannot do it manually? I am new to Linux and Spark.
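In case switching the system default turns out not to be possible, a per-session workaround I am considering (again assuming Java 8 is installed at the standard Mint/Ubuntu path) is to point JAVA_HOME at Java 8 only in the shell that launches spark-shell, which needs no root access and no moving of folders:

```shell
# Point this shell session (and any spark-shell started from it) at Java 8.
# Assumption: java-8-openjdk-amd64 is installed at the standard path below.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"
echo "$JAVA_HOME"   # verify before running spark-shell
```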