
EDIT: I think it's because I have Java 10 installed instead of Java 8, I'll remedy that soon and see if it fixes the problems. Also, I still get this error with Java 8 if I use Git Bash, but not if I use CMD.
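A quick way to check which JDK spark-submit is actually picking up (assuming the launcher scripts read JAVA_HOME, which the stock ones do) is to run the following in CMD:

    > echo %JAVA_HOME%
    > java -version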

I've been trying to run a Spark program locally outside of IntelliJ (where it works fine). When I run it in my terminal using spark-submit, though, all that happens is that a more verbose version of the command is printed.

(I'm running this from the directory where the Main class is located, although I've also tried spelling out its full path. Even if I run it without specifying a class, nothing happens.)

e.g.

    > spark-submit --class Main --master local[4] path-to-jar.jar

output:

    "C:\Program Files\Java\jdk-10.0.1\bin\java" -cp "C:\...\spark-2.3.1-bin-hadoop2.7/conf\;C:\...\spark-2.3.1-bin-hadoop2.7\jars\*" -Xmx1g org.apache.spark.deploy.SparkSubmit --class Main " --master" local[4] " path-to-jar.jar"

where the ellipses are just parts of paths.

I can't find any reason for this behavior; I'm following the Spark docs. I'm not very familiar with Spark, though. Any ideas?

Something else that might be related to this problem: when I run just "spark-shell" in CMD, it also just prints the path to spark-shell. If I run "spark-shell.cmd", I get errors that say

    Failed to initialize compiler: object java.lang.Object in compiler mirror not found.

among other things.

I've always submitted the fully qualified package name to the main class, i.e. --class com.yourdomain.yourpackage.Main. Perhaps that would work? - Travis Hegner
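For example, if Main lived in a package, the invocation would look something like this (the package name here is only a placeholder):

    > spark-submit --class com.yourdomain.yourpackage.Main --master local[4] path-to-jar.jar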

1 Answer


You need to check a few things:

1. Check that the environment variables are set properly: HADOOP_HOME, JAVA_HOME, and SPARK_HOME, and that winutils.exe is in place under %HADOOP_HOME%\bin (see the example below).

2. If you are running Java 10, uninstall it and install Java 8 (e.g. version "1.8.0_144").
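For instance, on Windows these can be set from CMD roughly as follows; the install paths are only placeholders for wherever your JDK, Spark, and winutils.exe actually live:

    > setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_144"
    > setx SPARK_HOME "C:\spark-2.3.1-bin-hadoop2.7"
    > setx HADOOP_HOME "C:\hadoop"
    > REM put winutils.exe in %HADOOP_HOME%\bin, add %JAVA_HOME%\bin and
    > REM %SPARK_HOME%\bin to PATH, then open a new terminal so the
    > REM variables are visible to spark-submit and spark-shell

Note that setx only affects new terminals, so open a fresh CMD window before trying spark-submit again.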