kotlovs opened a new pull request #32283:
URL: https://github.com/apache/spark/pull/32283


   ### What changes were proposed in this pull request?
Close the SparkContext after the user's main method has finished, so that a SparkApplication on K8s can terminate.
   This is a fixed version of the [closed PR](https://github.com/apache/spark/pull/32081).
   
   ### Why are the changes needed?
If sparkContext.stop() is not called explicitly, the Spark driver process does not terminate even after its main method has completed. This behaviour differs from Spark on YARN, where stopping the SparkContext manually is not required. The problem appears to be the use of non-daemon threads, which prevent the driver JVM process from terminating.
   So this PR inserts code that stops the SparkContext automatically.
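   The non-daemon-thread behaviour described above can be reproduced with a minimal JVM sketch (plain Java, no Spark involved; the worker thread and its interruption merely stand in for the background threads a SparkContext keeps alive and for the `sparkContext.stop()` call):

   ```java
   public class NonDaemonDemo {
       public static void main(String[] args) {
           // A non-daemon worker thread, like those a SparkContext starts internally.
           Thread worker = new Thread(() -> {
               try {
                   Thread.sleep(Long.MAX_VALUE); // simulate long-running background work
               } catch (InterruptedException e) {
                   // Interrupted: fall through and let the thread finish.
               }
           });
           // worker.setDaemon(true); // with a daemon thread the JVM would exit on its own
           worker.start();

           System.out.println("main finished");

           // Without the line below, the JVM keeps running after main returns,
           // because a live non-daemon thread still exists. Interrupting it here
           // plays the role of stopping the SparkContext in the driver.
           worker.interrupt();
       }
   }
   ```

   With the `worker.interrupt()` line commented out, the process prints "main finished" but never exits, which is exactly the symptom observed for the driver pod on K8s.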
   
   
   ### Does this PR introduce _any_ user-facing change?
   No
   
   ### How was this patch tested?
   Manually on the production AWS EKS environment in my company.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


