I am facing the error below while running my Spark Scala code with the spark-submit command.
ERROR cluster.YarnClusterScheduler: Lost executor 14 on XXXX: Container killed by YARN for exceeding memory limits. 55.6 GB of 55 GB physical memory used.
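For reference, this is roughly how I submit the job. The class name, jar name, and memory values below are placeholders, not my exact settings; the 50g + 5g split is only meant to illustrate the 55 GB container limit from the log. (On older Spark versions the overhead setting is spark.yarn.executor.memoryOverhead rather than spark.executor.memoryOverhead.)

    # class, jar, and memory values are illustrative placeholders
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --class com.example.MyJob \
      --executor-memory 50g \
      --conf spark.executor.memoryOverhead=5g \
      my-job.jar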
The line of code where the error is thrown is:
df.write.mode("overwrite").parquet("file")
I am writing a Parquet file. It was working until yesterday; it started throwing this error only from the last run, with the same input file.
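For context, here is a minimal sketch of the job. The app name, input path, and input format are placeholders I have added for illustration; the write line is the one the error points at.

    import org.apache.spark.sql.SparkSession

    object WriteJob {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("WriteJob") // placeholder app name
          .getOrCreate()

        // Placeholder input path and format; the real job reads
        // the same input file that worked on previous runs
        val df = spark.read.parquet("hdfs:///input/path")

        // This is the line the stack trace points at
        df.write.mode("overwrite").parquet("file")

        spark.stop()
      }
    }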
Thanks, Naveen