[ https://issues.apache.org/jira/browse/SPARK-34674?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17303313#comment-17303313 ]
Sergey commented on SPARK-34674:
--------------------------------

Sorry, I didn't search sufficiently for existing issues. There is already https://issues.apache.org/jira/browse/SPARK-27812, and it is marked as Fixed. But it looks like I still hit this bug in Spark 3.1.1.

I just started a Spark app on Amazon EKS (Kubernetes version 1.17) via _spark-on-k8s-operator: v1beta2-1.2.0-3.0.0_ ([https://github.com/GoogleCloudPlatform/spark-on-k8s-operator]). The Spark Docker image is built from the official spark-3.1.1 hadoop3.2 release. A minimal sketch of the explicit sparkContext.stop() workaround I am using is included after the quoted issue description below.

> Spark app on k8s doesn't terminate without call to sparkContext.stop() method
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-34674
>                 URL: https://issues.apache.org/jira/browse/SPARK-34674
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 3.1.1
>            Reporter: Sergey
>            Priority: Major
>
> Hello!
> I have run into a problem: if I don't call the method sparkContext.stop() explicitly, the Spark driver process doesn't terminate even after its main method has completed. This behaviour is different from Spark on YARN, where manually stopping the sparkContext is not required.
> It looks like the problem is caused by non-daemon threads, which prevent the driver JVM process from terminating. At least I see two non-daemon threads if I don't call sparkContext.stop():
> {code:java}
> Thread[OkHttp kubernetes.default.svc,5,main]
> Thread[OkHttp kubernetes.default.svc Writer,5,main]
> {code}
> Could you please tell me whether it is possible to solve this problem?
> The Docker image used is built from the official spark-3.1.1 hadoop3.2 release.
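For reference, this is a minimal sketch of the workaround mentioned above: stopping the SparkSession in a finally block so the driver JVM can exit even though the Kubernetes client's OkHttp threads are non-daemon. The class name and the job body are illustrative only, not taken from my actual app.

{code:java}
import org.apache.spark.sql.SparkSession;

public final class SampleApp {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("sample-app")
        .getOrCreate();
    try {
      // Application logic goes here; this job is just a placeholder.
      spark.range(0, 1_000_000L).selectExpr("sum(id)").show();
    } finally {
      // Without this explicit stop() the driver pod keeps running on k8s,
      // because the remaining OkHttp threads are non-daemon.
      spark.stop();
    }
  }
}
{code}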