I'm trying to increase memory allocation for my executors and drivers in Spark, but I have the strange feeling that Spark is ignoring my configurations.
I'm using the following command:
spark-submit spark_consumer.py --driver-memory=10G --executor-memory=5G --conf spark.executor.extraJavaOptions='-XX:+UseParallelGC -XX:+PrintGCDetails -XX:+PrintGCTimeStamps'
My initialization code is:

from pyspark import SparkContext
from pyspark.sql import SQLContext

class SparkRawConsumer:
    def __init__(self, filename):
        # Reuse the context created by spark-submit rather than building a new one
        self.sparkContext = SparkContext.getOrCreate()
        self.sparkContext.setLogLevel("ERROR")
        self.sqlContext = SQLContext(self.sparkContext)
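To double-check what the context actually receives, something like this should print the effective memory settings (a rough sketch, assuming nothing beyond a standard PySpark install; the print statements are only for debugging):

from pyspark import SparkContext

sc = SparkContext.getOrCreate()
conf = sc.getConf()
# If a property never reached the driver, the fallback value is printed instead
print(conf.get("spark.driver.memory", "not set"))
print(conf.get("spark.executor.memory", "not set"))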
Theoretically, my driver program should have a total of 10GB of memory available. However, I see this in my Spark UI (where the available memory is less than 400MB):
Why is Spark ignoring the configurations I am passing in?