The application is part of a complex ecosystem where we track the status of all jobs through the YARN REST API.
For a specific business scenario we need to mark a Spark job as failed, but I have landed in a gotcha situation: no matter what I raise in the Spark job (an Error, an Exception, or System.exit(123)), the job is marked as FINISHED in YARN with finalStatus SUCCEEDED.
I am using spark-submit to launch the Spark job from a jar.
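For context, the launch command looks roughly like this (the jar name, class name, and deploy mode are placeholders for my actual setup):

```shell
# Placeholder jar/class names. Deploy mode is shown explicitly because
# client and cluster mode run the driver in different places, which I
# suspect is relevant to how YARN records the final status.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class Execute \
  my-app.jar
```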
import org.apache.spark.{SparkConf, SparkContext}

object Execute {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setAppName("Execute")
    val sc = new SparkContext(sparkConf)
    if (businessExceptionNeedsToBeRaised) {
      // What to do here so YARN reports the application as FAILED?
    }
  }
}
Things I have tried inside the Spark job:
- throw new Error("Whatever")
- throw new Exception("Whatever")
- System.exit(123)
- sys.exit(123)
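To make the failing path concrete, here is a minimal sketch of the pattern I am attempting, stripped of the Spark dependency (`runJob` and the business condition are placeholders): the business failure is thrown, caught at the top of main, and mapped to a nonzero driver exit code.

```scala
import scala.util.{Failure, Success, Try}

object ExecuteSketch {
  // Placeholder for the real business check; always fails here for illustration.
  def runJob(): Unit =
    throw new IllegalStateException("business rule violated")

  def main(args: Array[String]): Unit =
    Try(runJob()) match {
      case Success(_) => ()
      case Failure(e) =>
        e.printStackTrace()
        sys.exit(123) // nonzero JVM exit from the driver
    }
}
```

Even with the driver JVM exiting nonzero like this, YARN still shows the application as FINISHED/SUCCEEDED.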
Hopefully someone can tell me how to mark a Spark job as failed in the YARN UI.