Hi,

When ApplicationMaster runs, it registers a shutdown hook [1] whose
intent is described by a comment [2] in the code:

> // This shutdown hook should run *after* the SparkContext is shut down.

And so it gets a priority lower than the SparkContext hook's [3], i.e.

val priority = ShutdownHookManager.SPARK_CONTEXT_SHUTDOWN_PRIORITY - 1
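
For context, the ApplicationMaster then registers the hook with that
priority, roughly like this (paraphrasing [1], with the body of the hook
elided):

  ShutdownHookManager.addShutdownHook(priority) { () =>
    // ... the ApplicationMaster's cleanup logic runs here ...
  }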

But the scaladoc of ShutdownHookManager.addShutdownHook says [4]:

> Adds a shutdown hook with the given priority. Hooks with lower priority 
> values run first.

My understanding is that one of these two comments is no longer true (if
it ever was): if hooks with lower priority values indeed run first, then
the ApplicationMaster hook, registered with
SPARK_CONTEXT_SHUTDOWN_PRIORITY - 1, would run *before* the SparkContext
hook, not after it. A quick sketch below makes the discrepancy concrete.
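
Here is a tiny standalone sketch (not Spark's actual ShutdownHookManager;
I am hard-coding SPARK_CONTEXT_SHUTDOWN_PRIORITY = 50, the value in the
current sources) that prints the order each reading of the scaladoc would
produce:

  object HookOrderSketch extends App {
    val SPARK_CONTEXT_SHUTDOWN_PRIORITY = 50
    val hooks = Seq(
      (SPARK_CONTEXT_SHUTDOWN_PRIORITY, "SparkContext hook"),
      (SPARK_CONTEXT_SHUTDOWN_PRIORITY - 1, "ApplicationMaster hook"))

    // Reading 1: lower priority values run first, as the scaladoc [4] says.
    println(hooks.sortBy(_._1).map(_._2).mkString(" -> "))
    // Prints: ApplicationMaster hook -> SparkContext hook, i.e. the AM hook
    // runs *before* the SparkContext one, contradicting the comment [2].

    // Reading 2: higher priority values run first.
    println(hooks.sortBy(-_._1).map(_._2).mkString(" -> "))
    // Prints: SparkContext hook -> ApplicationMaster hook, matching the
    // comment [2].
  }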

Please help me understand that part of the code. Thanks.

[1] https://github.com/apache/spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala#L206
[2] https://github.com/apache/spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala#L204
[3] https://github.com/apache/spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala#L205
[4] https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala#L146-L147

Regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
