[ https://issues.apache.org/jira/browse/SPARK-12049?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-12049.
------------------------------------
       Resolution: Fixed
    Fix Version/s: 1.6.0
                   1.5.3

> User JVM shutdown hook can cause deadlock at shutdown
> -----------------------------------------------------
>
>                 Key: SPARK-12049
>                 URL: https://issues.apache.org/jira/browse/SPARK-12049
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.5.2, 1.6.0
>            Reporter: Sean Owen
>            Assignee: Sean Owen
>             Fix For: 1.5.3, 1.6.0
>
>
> Here's a simplification of a deadlock that can occur at shutdown if the user 
> app has also installed its own shutdown hook to clean up (see the sketch 
> below):
> - The Spark Shutdown Hook thread runs
> - {{SparkShutdownHookManager.runAll()}} is invoked, locking 
> {{SparkShutdownHookManager}} since the method is {{synchronized}}
> - A user shutdown hook thread runs concurrently
> - The user hook calls, for example, {{StreamingContext.stop()}}, which is 
> {{synchronized}} and so locks the {{StreamingContext}}
> - The user hook blocks when {{StreamingContext.stop()}} tries to {{remove()}} 
> the Spark Streaming shutdown task, since {{SparkShutdownHookManager}} is 
> already locked, per above
> - The Spark Shutdown Hook thread tries to execute the Spark Streaming 
> shutdown task, but blocks on the lock held by {{StreamingContext.stop()}}
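> A minimal, self-contained sketch of the lock cycle; {{HookManager}} and 
> {{Context}} are hypothetical stand-ins for {{SparkShutdownHookManager}} and 
> {{StreamingContext}}, reduced to just the locking that matters:
> {code}
> import scala.collection.mutable
>
> object HookManager {
>   private val hooks = mutable.Buffer[() => Unit]()
>   def add(hook: () => Unit): Unit = synchronized { hooks += hook }
>   def remove(hook: () => Unit): Unit = synchronized { hooks -= hook }
>   // Like runAll(): holds the HookManager lock while every hook executes.
>   def runAll(): Unit = synchronized { hooks.foreach(_.apply()) }
> }
>
> class Context {
>   private val task: () => Unit = () => stop()
>   HookManager.add(task)
>   // Like StreamingContext.stop(): takes the Context lock, then needs the
>   // HookManager lock to deregister its shutdown task.
>   def stop(): Unit = synchronized { HookManager.remove(task) }
> }
>
> object Demo extends App {
>   val ctx = new Context
>   val sparkHook = new Thread(() => HookManager.runAll()) // HookManager lock, then Context lock
>   val userHook  = new Thread(() => ctx.stop())           // Context lock, then HookManager lock
>   sparkHook.start(); userHook.start()
>   sparkHook.join(); userHook.join() // can hang: each thread holds the lock the other needs
> }
> {code}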
> I think this is actually not that critical, since it requires a fairly 
> specific setup, and in many cases it can be worked around by registering the 
> user hook with Hadoop's shutdown hook mechanism, as Spark itself does, so 
> that the hooks run serially rather than on separate JVM hook threads.
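> For illustration, a user app could register its cleanup through Hadoop's 
> manager instead of {{Runtime.addShutdownHook}}; the priority value below is 
> arbitrary and the cleanup body is a placeholder:
> {code}
> import org.apache.hadoop.util.ShutdownHookManager
>
> // Hooks registered here run serially, in descending priority order, on the
> // same mechanism Spark uses, instead of on an independent JVM hook thread.
> ShutdownHookManager.get().addShutdownHook(new Runnable {
>   override def run(): Unit = {
>     // e.g. stop the user's StreamingContext and do other cleanup here
>   }
> }, 50) // illustrative priority; pick one relative to Spark's own hook
> {code}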
> I also think it's solvable in the code by not locking 
> {{SparkShutdownHookManager}} in the three methods that are {{synchronized}}, 
> since these really only protect the {{hooks}} collection. In particular, 
> {{runAll()}} shouldn't hold the lock while executing hooks.
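> A sketch of that idea (not the actual patch): synchronize only on the 
> {{hooks}} collection itself, and snapshot it before running anything:
> {code}
> import scala.collection.mutable
>
> object FixedHookManager {
>   private val hooks = mutable.Buffer[() => Unit]()
>
>   def add(hook: () => Unit): Unit = hooks.synchronized { hooks += hook }
>   def remove(hook: () => Unit): Unit = hooks.synchronized { hooks -= hook }
>
>   def runAll(): Unit = {
>     // Copy under the lock, then release it before any hook executes...
>     val snapshot = hooks.synchronized { hooks.toList }
>     // ...so a hook that calls back into remove(), as StreamingContext.stop()
>     // does, only contends briefly for the collection lock and can no longer
>     // deadlock against runAll().
>     snapshot.foreach(_.apply())
>   }
> }
> {code}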


