[ https://issues.apache.org/jira/browse/SPARK-42219?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17681463#comment-17681463 ]
Apache Spark commented on SPARK-42219:
--------------------------------------

User 'attilapiros' has created a pull request for this issue:

https://github.com/apache/spark/pull/39775

> Make "SPARK-34674: Close SparkContext after the Main method has finished" configurable
> ---------------------------------------------------------------------------------------
>
>                 Key: SPARK-42219
>                 URL: https://issues.apache.org/jira/browse/SPARK-42219
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 3.1.2, 3.2.0, 3.1.3, 3.2.1, 3.3.0, 3.2.2, 3.3.1, 3.2.3
>            Reporter: Attila Zsolt Piros
>            Assignee: Attila Zsolt Piros
>            Priority: Major
>
> We ran into an error after an upgrade from Spark 3.1 to Spark 3.2, caused by SPARK-34674,
> which closes the SparkContext right after the application starts. The affected application
> was a Spark job server built on top of Spring Boot, so all job submissions happened outside
> of the main method.
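To make the failure mode concrete, below is a minimal sketch (not taken from the ticket or the pull request) of a driver in the affected shape: main() returns almost immediately while jobs are submitted later from another thread, so stopping the SparkContext as soon as main() finishes breaks every subsequent submission. The object name, thread body, and timings are illustrative assumptions; the real application uses Spring Boot rather than a hand-rolled thread.

{code:scala}
import org.apache.spark.sql.SparkSession

// Illustrative job-server-style driver: main() only bootstraps the app,
// actual Spark jobs are submitted later from a long-running worker thread.
object JobServerDriver {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("job-server-sketch")
      .getOrCreate()

    // Stand-in for an embedded HTTP server that submits jobs on request.
    val server = new Thread(() => {
      while (true) {
        // Each "request" runs a job against the shared session.
        spark.range(1000).count()
        Thread.sleep(60000)
      }
    })
    server.setDaemon(false)
    server.start()

    // main() returns here. With the SPARK-34674 behaviour the SparkContext is
    // stopped at this point, so the later submissions from the server thread fail.
  }
}
{code}

Making the SPARK-34674 shutdown behaviour configurable would let such long-running, job-server-style drivers keep the SparkContext alive after main() returns.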