[ https://issues.apache.org/jira/browse/SPARK-20843?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16026663#comment-16026663 ]

Michael Allman commented on SPARK-20843:
----------------------------------------

bq. I will say that I don't think its really safe to rely on clean shutdowns 
for correctness...

I agree, but we don't want a "force kill" to become the norm for app 
shutdown. An unclean shutdown should be an exceptional situation: cause for an 
ops alert, an investigation, and validation that no data loss or corruption 
occurred (i.e., that our failure mechanisms held).

Also, in cases where an app integrates with other systems and cannot 
guarantee transactional or idempotent semantics in the event of a failure, an 
unclean shutdown *will* require cross-system validation and any necessary data 
synchronization or recovery.

Cheers.

> Cannot gracefully kill drivers which take longer than 10 seconds to die
> -----------------------------------------------------------------------
>
>                 Key: SPARK-20843
>                 URL: https://issues.apache.org/jira/browse/SPARK-20843
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.1
>            Reporter: Michael Allman
>              Labels: regression
>
> Commit 
> https://github.com/apache/spark/commit/1c9a386c6b6812a3931f3fb0004249894a01f657
>  changed the behavior of driver process termination. Whereas before 
> `Process.destroyForcibly` was never called, now it is called (on Java VMs 
> supporting that API) if the driver process does not die within 10 seconds.
> This prevents apps which need more than 10 seconds to shut down from doing 
> so gracefully. For example, streaming apps with a large batch duration (say, 
> 30+ seconds) can take minutes to shut down.
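The destroy-then-force pattern described in the issue can be sketched as follows. This is a minimal illustration only, not Spark's actual DriverRunner code; the `terminate` helper, its grace period, and the SIGTERM-ignoring child process are all hypothetical stand-ins for a driver that needs longer than the grace period to exit cleanly:

```java
import java.util.concurrent.TimeUnit;

// Hypothetical sketch of the pattern: ask the process to exit, wait a
// fixed grace period, then force-kill. A process needing longer than the
// grace period to shut down cleanly never gets the chance.
public class GracefulKill {
    static void terminate(Process p, long graceSeconds) throws InterruptedException {
        p.destroy();                              // polite request (SIGTERM on Unix)
        if (!p.waitFor(graceSeconds, TimeUnit.SECONDS)) {
            p.destroyForcibly();                  // hard kill (SIGKILL); no cleanup runs
            p.waitFor();                          // reap the killed process
        }
    }

    public static void main(String[] args) throws Exception {
        // Simulate a "slow to shut down" driver by ignoring SIGTERM; with a
        // short grace period it ends up force-killed rather than exiting cleanly.
        Process p = new ProcessBuilder("sh", "-c", "trap '' TERM; sleep 60").start();
        terminate(p, 2);
        System.out.println("alive after terminate: " + p.isAlive());
    }
}
```

With a grace period shorter than the app's actual shutdown time, `destroyForcibly` always wins, which is exactly the complaint above for streaming apps with long batch durations.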



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
