
I used cache() and persist() on DataFrames throughout my application. Do I need to invoke unpersist() on each DataFrame that I cached to free all the memory and disk they occupy when my program ends? Or will Spark automatically clean that up?
Thanks

1 Answer


Once the SparkContext is stopped, the cached memory and disk space are released automatically. You only need to invoke unpersist() if you want to free that storage earlier, while the application is still running.
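A minimal PySpark sketch of both paths, assuming a local Spark installation; the DataFrame name `df` and app name are illustrative:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .master("local[*]") \
    .appName("unpersist-demo") \
    .getOrCreate()

df = spark.range(1_000_000)
df.cache()   # marks the DataFrame for caching (lazy)
df.count()   # an action materializes the cache

# Option 1: free the cached blocks early, while the app keeps running.
# blocking=True waits until the blocks are actually deleted.
df.unpersist(blocking=True)

# Option 2: just stop the context at the end; all cached storage
# is released along with it, so unpersist() is not strictly required.
spark.stop()
```

Calling unpersist() explicitly mainly matters in long-running applications where a cached DataFrame is no longer needed and you want that memory back for other jobs.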