GitHub user markhamstra commented on the issue:

    https://github.com/apache/spark/pull/17113
  
    > Spark does immediately abort the stage but it doesn't kill the running tasks
    
    Whether running tasks are interrupted on stage abort depends on a config
    boolean (the job's `interruptOnCancel` flag, exposed as the local property
    `spark.job.interruptOnCancel`), and ideally we'd like to get to the point
    where we can confidently set that flag so that running tasks are interrupted
    whenever the associated job or stage dies.
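
    For reference, the interruption is opt-in per job group rather than global.
    A minimal sketch of the mechanism (the group ID, app name, and sleeping job
    are illustrative, not taken from this PR):

    ```scala
    import org.apache.spark.{SparkConf, SparkContext}

    object InterruptOnCancelDemo {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setMaster("local[2]").setAppName("interrupt-demo"))

        // Submit a long-running job from a job group whose tasks may be
        // Thread.interrupt()ed on cancellation. interruptOnCancel = true sets
        // the local property "spark.job.interruptOnCancel" for jobs submitted
        // from this thread; with the default (false), cancellation only flags
        // the tasks and already-running ones keep going.
        val worker = new Thread {
          override def run(): Unit = {
            sc.setJobGroup("demo-group", "long sleeping job", interruptOnCancel = true)
            try {
              sc.parallelize(1 to 4, 4).foreach(_ => Thread.sleep(60000))
            } catch {
              case e: Exception => println(s"job failed: ${e.getMessage}")
            }
          }
        }
        worker.start()

        Thread.sleep(5000)              // let the tasks start running
        sc.cancelJobGroup("demo-group") // interrupts the sleeping task threads
        worker.join()
        sc.stop()
      }
    }
    ```

    With the flag set, the sleeping tasks die on an InterruptedException almost
    immediately; that is the behavior we'd like to be able to enable with
    confidence by default.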

