
I have installed java-11-openjdk-amd64 (auto-selected, at /usr/lib/jvm/java-11-openjdk-amd64/bin/java), Scala 2.11.12, and Spark 2.2.0 with Hadoop 2.7 on a Linux Mint 19.2 VM running on Windows 10. When I open spark-shell I get this error:

Failed to initialize compiler: object java.lang.Object in compiler mirror not found.

I also defined the variables in the .bashrc file in my home directory as follows:

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export SCALA_HOME=/usr/local/src/scala/scala-2.11.12
export SPARK_HOME=/usr/lib/spark/spark-2.2.0-bin-hadoop2.7
export PATH=$SCALA_HOME/bin:$JAVA_HOME/bin:$SPARK_HOME/bin:$PATH
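In case it matters, after editing the file I reload it in the current shell and check what actually ends up on the PATH (the `|| true` guards are just so the check keeps going if something is missing):

```shell
# Reload .bashrc so the exports take effect in the current shell,
# then check which java the shell will actually run
. ~/.bashrc 2>/dev/null || true
echo "JAVA_HOME=$JAVA_HOME"
command -v java || echo "java not found on PATH"
```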

How can I solve this? Do I have to switch to Java 8 to run Spark? I tried to switch to java-8-openjdk-amd64 using update-alternatives --config java, but I cannot change the selection because it fails with another error: permission denied.

Alternatively, how can I move my Java 8 installation to another folder from the command line, since I cannot do it manually? I am new to Linux and Spark.

Yes, you'd better use Java 8. Could you paste more of the permission-denied error log? - DennisLi
This is the permission-denied error:

mint@mint:~$ update-alternatives --config java
There are 2 choices for the alternative java (providing /usr/bin/java).
update-alternatives: using /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java to provide /usr/bin/java (java) in manual mode
update-alternatives: error: error creating symbolic link '/etc/alternatives/java.dpkg-tmp': Permission denied
- k_bm

1 Answer


You should use Java 8, since Spark depends heavily on Java 8 features that were either made private, deprecated, or removed in Java 9 and above.
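The permission-denied error from the comments happens because update-alternatives writes symlinks under /etc/alternatives, which is owned by root. A sketch of the fix, assuming the stock Mint/Ubuntu package paths from your question:

```shell
# update-alternatives needs root to rewrite /etc/alternatives, so run:
#
#   sudo update-alternatives --config java
#
# and select the java-8-openjdk-amd64 entry. Then point JAVA_HOME at
# Java 8 as well (in ~/.bashrc), so Spark picks it up:
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"
```

After that, open a new terminal (or `source ~/.bashrc`) and `java -version` should report 1.8 before you start spark-shell again.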

To copy a directory from the command line, see: https://www.webservertalk.com/copy-directory-folder-linux-cmd
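In short, "moving" a directory is either `cp -r` followed by `rm -r`, or a single `mv`. Demonstrated here on a scratch directory under /tmp; for the real JDK under /usr/lib/jvm you would prefix each command with sudo:

```shell
# Start clean, then create a stand-in for a JDK directory
rm -rf /tmp/demo-jdk /tmp/demo-jdk-copy /tmp/demo-jdk-moved
mkdir -p /tmp/demo-jdk/bin
echo 'stub' > /tmp/demo-jdk/bin/java

cp -r /tmp/demo-jdk /tmp/demo-jdk-copy   # recursive copy, original kept
mv /tmp/demo-jdk /tmp/demo-jdk-moved     # move/rename in one step

ls /tmp/demo-jdk-copy/bin /tmp/demo-jdk-moved/bin
```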