I used cache() and persist() on DataFrames throughout my application. Do I need to invoke unpersist() on each DataFrame I have cached to free up all the memory and disk they occupy when my program ends, or will Spark automatically clean up that storage?
Thanks