I am using Spark to query Hive and then run transformations on the results. My Scala application creates multiple Spark applications sequentially: a new Spark application is started only after stopping the SparkSession and SparkContext of the previous one.
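For illustration, this is roughly the pattern I follow (a minimal sketch; the queries and app name are placeholders for my actual workload):

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical queries standing in for the real Hive workload
val queries = Seq("SELECT * FROM db.table_a", "SELECT * FROM db.table_b")

queries.foreach { query =>
  // Each iteration builds a fresh Spark application with Hive support
  val spark = SparkSession.builder()
    .appName("hive-etl")
    .enableHiveSupport()
    .getOrCreate()

  spark.sql(query).show()

  // Stop the session and the underlying SparkContext before the next app
  spark.stop()
  spark.sparkContext.stop()
}
```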
However, after stopping sc and spark, the connections to the Hive metastore (MySQL) are somehow not closed properly. Each Spark application opens around 5 MySQL connections, and the connections from earlier applications remain active. Eventually, MySQL starts rejecting new connections once 150 connections are open. How can I force Spark to close its Hive metastore connections to MySQL (after spark.stop() and sc.stop())?
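This is how I count the open connections after each run (a sketch using plain JDBC against the MySQL server backing the metastore; the host, database, user, and password are placeholders, and it assumes the MySQL JDBC driver is on the classpath):

```scala
import java.sql.DriverManager

// Connect directly to the MySQL server that backs the Hive metastore
val conn = DriverManager.getConnection(
  "jdbc:mysql://metastore-host:3306/metastore", "user", "password")
try {
  val rs = conn.createStatement()
    .executeQuery("SHOW STATUS LIKE 'Threads_connected'")
  while (rs.next()) {
    // Prints the current number of open connections to MySQL;
    // this keeps growing by ~5 with every Spark application
    println(s"${rs.getString(1)} = ${rs.getString(2)}")
  }
} finally {
  conn.close()
}
```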
Note: I am on Spark 2.1.1, and I use Spark's Thrift server instead of HiveServer2, so I don't think a standalone Hive metastore service is involved.