Hi. I am using Spark to query Hive and then run transformations. My
Scala app creates multiple Spark applications sequentially; a new
SparkContext (and SparkSession) is created only after the previous
SparkSession and SparkContext have been stopped.
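
For reference, here is a minimal sketch of the lifecycle I am describing
(the app name, database, and query are just placeholders):

  import org.apache.spark.sql.SparkSession

  // Placeholder query/table names; only the session lifecycle matters here.
  def runJob(query: String): Unit = {
    val spark = SparkSession.builder()
      .appName("hive-query-job")
      .enableHiveSupport()       // connects to the MySQL-backed Hive metastore
      .getOrCreate()
    try {
      spark.sql(query).count()   // query Hive, then apply transformations
    } finally {
      spark.stop()               // also stops the underlying SparkContext
    }
  }

  // The next session is created only after the previous one has been stopped.
  runJob("SELECT * FROM some_db.some_table")
  runJob("SELECT * FROM some_db.other_table")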

However, even after stopping sc and spark, the connections to the Hive
metastore (MySQL) are not closed properly. Each Spark application opens
around 5 new MySQL connections, while the connections from earlier
applications are still active. Eventually MySQL starts rejecting new
connections once 150 are open. How can I force Spark to close the Hive
metastore connections to MySQL after spark.stop() and sc.stop()?

sc = SparkContext
spark = SparkSession
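
One workaround I am considering (just an assumption on my part, not
something I have confirmed) is capping the DataNucleus connection pool
used by the embedded metastore client, so each application opens fewer
MySQL connections. The property name comes from Hive/DataNucleus; whether
passing it via the spark.hadoop. prefix actually reaches the metastore
client may depend on the deployment, and hive-site.xml might be the more
conventional place to set it:

  import org.apache.spark.sql.SparkSession

  // Assumption: the embedded metastore client pools its JDBC connections
  // to MySQL via DataNucleus, so capping the pool limits connections per app.
  val spark = SparkSession.builder()
    .appName("hive-query-job")
    .config("spark.hadoop.datanucleus.connectionPool.maxPoolSize", "2")
    .enableHiveSupport()
    .getOrCreate()

Even if that works, it would only reduce the number of connections left
behind per application, not make spark.stop() release them, so I would
still like to know the proper way to close them.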


Regards,
Rohit S Damkondwar
