I'm running Spark on a single node (local mode).
My application (a Java web app) is using less memory than is available. I found this thread helpful.
From the link:
For local mode you only have one executor, and this executor is your driver, so you need to set the driver's memory instead. That said, in local mode, by the time you run spark-submit, a JVM has already been launched with the default memory settings, so setting "spark.driver.memory" in your conf won't actually do anything for you. Instead, you need to run spark-submit as follows:
bin/spark-submit --driver-memory 2g --class your.class.here app.jar
It suggests passing the --driver-memory flag to bin/spark-submit for a jar file.
But I'm running a Maven web application. Can I run it with spark-submit?
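One option I've seen (not sure if it fits my setup) is to avoid the spark-submit shell script entirely and launch the job from Java with Spark's SparkLauncher API, which lets you set the driver memory before the driver JVM starts. A minimal sketch, assuming the job is packaged as a runnable jar (the paths and class name below are placeholders, not my actual project):

```java
import org.apache.spark.launcher.SparkLauncher;

public class SubmitJob {
    public static void main(String[] args) throws Exception {
        // SparkLauncher forks a new JVM for the driver, so driver memory
        // set here takes effect (unlike setting it after the JVM is up).
        Process spark = new SparkLauncher()
                .setAppResource("app.jar")          // placeholder: path to the job jar
                .setMainClass("your.class.here")    // placeholder: the job's main class
                .setMaster("local[*]")
                .setConf(SparkLauncher.DRIVER_MEMORY, "10g")
                .launch();
        spark.waitFor();
    }
}
```

This would mean separating the Spark job from the web app itself and having the web app trigger it, rather than running Spark inside the web container's JVM.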
I set these in spark-env.sh and ran source spark-env.sh, but there is still no change:
SPARK_EXECUTOR_MEMORY=10g
SPARK_WORKER_MEMORY=10g
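From what I understand, in local mode those two variables apply to standalone executors/workers, not to the driver, and the driver here is the web container's JVM, which is already running when Spark starts. So the heap would have to be raised on that JVM instead. A hedged sketch, assuming the app runs under Tomcat (the variable name differs for other containers):

```shell
# Assumption: the web app runs inside Tomcat, so the Spark driver shares
# Tomcat's JVM. Raise that JVM's max heap before starting the container.
export CATALINA_OPTS="$CATALINA_OPTS -Xmx10g"
```

Is this the right approach for an embedded setup, or is spark-submit still usable somehow?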