How can I increase the memory available to Apache Spark executor nodes?
I have a 2 GB file that is suitable for loading into Apache Spark. At the moment I am running Spark on a single machine, so the driver and executor are on the same machine. The machine has 8 GB of memory.
When I try to count the lines of the file after marking the file to be cached in memory, I get this error:
2014-10-25 22:25:12 WARN CacheManager:71 - Not enough space to cache partition rdd_1_1 in memory! Free memory is 278099801 bytes.
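For reference, this is roughly what I run in the shell (the file path here is a placeholder):

    val lines = sc.textFile("/path/to/my-2gb-file.txt") // placeholder path
    lines.cache()   // mark the RDD to be cached in memory
    lines.count()   // this action triggers the warning above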
I looked at the documentation here and set spark.executor.memory to 4g in $SPARK_HOME/conf/spark-defaults.conf.
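Concretely, the conf file now contains this line (the rest of the file is unchanged):

    spark.executor.memory    4g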
The UI shows that this variable is set in the Spark Environment; you can find a screenshot here.
However, when I go to the Executors tab, the memory limit for my single executor is still set to 265.4 MB, and I still get the same error.
I tried various things mentioned here, but I still get the error and don't have a clear idea of where I should change the setting.
I am running my code interactively from the spark-shell.
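In case it is relevant, I launch the shell plainly, without passing any memory-related options on the command line:

    $SPARK_HOME/bin/spark-shell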