
I'm running a Spark job with spark-submit. The job finishes and I do get output, but the log file shows:

WARN SparkEnv:87 - Exception while deleting Spark temp dir: ......
java.io.IOException: Failed to delete: ...

I've created a temp directory and pointed spark.local.dir at this new path via the spark-submit command.
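
For context, the invocation looks roughly like this (the class name, jar, and path are placeholders, not my real values):

    spark-submit \
      --conf spark.local.dir=/path/to/my/tmp \
      --class com.example.MyApp \
      my-app.jar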

  1. Would this exception affect my output?
  2. I'll be using the same temp directory while running other Spark jobs. Will it affect them?
  3. Is there any way I can avoid this?

1 Answer


This is expected on Windows; you can safely ignore it. To suppress the warning, add the following to your log4j.properties file:

log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
log4j.logger.org.apache.spark.SparkEnv=ERROR
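
If you don't have a log4j.properties yet, here is a minimal sketch modeled on Spark's conf/log4j.properties.template (the console appender settings are an assumption; adjust them to taste):

    # Log everything at INFO to the console
    log4j.rootCategory=INFO, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

    # Silence the temp-dir cleanup noise
    log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
    log4j.logger.org.apache.spark.SparkEnv=ERROR

Spark picks this file up automatically if it sits in $SPARK_HOME/conf; otherwise you can point the driver at it with spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/path/to/log4j.properties.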

  1. No, it won't affect your output.
  2. No, it won't affect other jobs that use the same temp directory.
  3. Either suppress the message via log4j.properties as shown above, or troubleshoot why the temp directory can't be deleted; on Windows the usual cause is a process (often the JVM itself) still holding a handle on files in that directory.