EDIT: I suspect this is because I have Java 10 installed instead of Java 8; I'll install Java 8 soon and see whether that fixes the problem. Also, even with Java 8 I still get this error if I use Git Bash, but not if I use CMD.
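For reference, here is how I'm switching to JDK 8 in CMD before running spark-submit. This is just a sketch; the install path below is an example from my machine and will differ on yours:

> rem Point JAVA_HOME at the JDK 8 install (path is machine-specific)
> set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_171
> rem Put that JDK first on the PATH so spark-submit picks it up
> set PATH=%JAVA_HOME%\bin;%PATH%
> rem Confirm which JDK is now active
> java -version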
I've been trying to run a Spark program locally, outside of IntelliJ (where it works fine). When I run it from my terminal using spark-submit, though, nothing actually executes; all that happens is that a more verbose version of the command is printed.
(I'm running from the directory where the Main class is located, although I've also tried spelling out its full path; even if I run it without specifying a class at all, nothing happens.)
e.g.
> spark-submit --class Main --master local[4] path-to-jar.jar
output:
"C:\Program Files\Java\jdk-10.0.1\bin\java" -cp "C:\...\spark-2.3.1-bin-hadoop2.7/conf\;C:\...\spark-2.3.1-bin-hadoop2.7\jars\*" -Xmx1g org.apache.spark.deploy.SparkSubmit --class Main " --master" local[4] " path-to-jar.jar"
where the ellipses are just elided parts of the paths. Note that the printed command itself looks oddly quoted: " --master" and " path-to-jar.jar" appear with leading spaces inside the quotation marks, which may or may not be related.
I can't find any explanation for this behavior; I'm following the Spark docs. I'm not very familiar with Spark, though. Any ideas?
Also, something that might be related to this problem: when I run just "spark-shell" in cmd, it likewise just prints the path to spark-shell. If I run "spark-shell.cmd" instead, I get errors that say
Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
among other things.
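In case it's relevant, this is how I'm checking which JDK the Spark scripts actually pick up (I've omitted my output):

> rem Show every java.exe on the PATH, in order
> where java
> rem Show the version of the first one found
> java -version
> rem Show what JAVA_HOME is currently set to
> echo %JAVA_HOME%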
Try --class com.yourdomain.yourpackage.Main (i.e., the fully qualified class name). Perhaps that would work? - Travis Hegner
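That is, an invocation along these lines, where com.yourdomain.yourpackage is only the commenter's placeholder for the program's actual package:

> spark-submit --class com.yourdomain.yourpackage.Main --master local[4] path-to-jar.jar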