[ https://issues.apache.org/jira/browse/SPARK-10911?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14993706#comment-14993706 ]

Sean Owen commented on SPARK-10911:
-----------------------------------

I think you're right that the context was a little different. The only issue is 
that System.exit is the most abrupt shutdown possible: someone might 
legitimately have non-daemon threads that expect to shut down cleanly, and this 
would kill them instantly. Whether that is a valid use of Spark is debatable; 
the question is whether that risk is worth it to guarantee you can't have hung 
executors due to bad user code. I don't have a clear view on that, but you 
could argue it's a user problem.
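
To make the trade-off concrete, here is a minimal sketch (plain JDK threading, 
not Spark code, and the names are just for illustration) of what an abrupt 
System.exit does to a non-daemon thread that is in the middle of its own 
cleanup:

{code}
object AbruptExitDemo {
  def main(args: Array[String]): Unit = {
    // A non-daemon thread (the default) that expects to finish its work
    // and then release its resources.
    val worker = new Thread(new Runnable {
      def run(): Unit =
        try {
          Thread.sleep(60000)            // long-running user work
          println("work finished")
        } finally {
          println("releasing resources") // never runs if System.exit fires first
        }
    })
    worker.start()

    // Shutdown hooks still run on System.exit, but other running threads are
    // simply terminated: the worker's finally block is skipped.
    System.exit(0)
  }
}
{code}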

> Executors should System.exit on clean shutdown
> ----------------------------------------------
>
>                 Key: SPARK-10911
>                 URL: https://issues.apache.org/jira/browse/SPARK-10911
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.5.1
>            Reporter: Thomas Graves
>
> Executors should call System.exit on clean shutdown to make sure all user 
> threads exit and the JVM shuts down.
> We ran into a case where an Executor was left around for days trying to 
> shut down because the user code was using a non-daemon thread pool and one of 
> those threads wasn't exiting. We should force the JVM to exit with 
> System.exit.
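
For context, a minimal sketch (not from the ticket, assuming nothing ever calls 
shutdown() on the pool) of how a non-daemon thread pool in user code keeps the 
JVM alive after all work has returned, which is the lingering-executor symptom 
described above:

{code}
import java.util.concurrent.Executors

object LingeringExecutorDemo {
  def main(args: Array[String]): Unit = {
    // Executors.newFixedThreadPool uses the default thread factory,
    // which creates non-daemon threads.
    val pool = Executors.newFixedThreadPool(1)
    pool.submit(new Runnable {
      def run(): Unit = while (true) Thread.sleep(1000) // user thread that never exits
    })
    // main returns here, but the JVM does not exit: the non-daemon pool
    // thread keeps it alive indefinitely. A System.exit(0) at the end of
    // the clean shutdown path would terminate the JVM regardless.
  }
}
{code}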



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
