Spark's ShutdownHookManager just adds hooks through the mechanism that
Hadoop already exposes. You can do the same in your own code; you
shouldn't use Spark's internal API.
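
For example, here is a minimal sketch in Scala of registering your own
hook through Hadoop's org.apache.hadoop.util.ShutdownHookManager, the
same mechanism Spark builds on (this assumes hadoop-common is on your
classpath; the hook body and the priority value below are placeholders,
not anything Spark-specific):

    import org.apache.hadoop.util.ShutdownHookManager

    // Register a hook with Hadoop's shutdown manager. Hooks with a
    // higher priority run earlier during shutdown.
    ShutdownHookManager.get().addShutdownHook(new Runnable {
      override def run(): Unit = {
        println("running cleanup before JVM exit") // placeholder cleanup
      }
    }, 50) // 50 is an arbitrary example priority

    // If you don't want a Hadoop dependency at all, the plain JVM API
    // works too, though without priority ordering:
    Runtime.getRuntime.addShutdownHook(new Thread(() =>
      println("plain JVM shutdown hook running")))

Either way you get the same effect without depending on anything in the
org.apache.spark package.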

On Mon, May 13, 2019 at 6:11 PM Nasrulla Khan Haris
<nasrulla.k...@microsoft.com.invalid> wrote:
>
> Hi All,
>
> I am trying to add a shutdown hook, but it looks like the ShutdownHookManager
> object requires the package to be org.apache.spark. Is there any other API
> that can help me do this?
>
> https://github.com/apache/spark/blob/v2.4.0/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala
>
> I can see that HiveServer2 in
> https://github.com/apache/spark/commit/515708d5f33d5acdb4206c626192d1838f8e691f
> uses ShutdownHookManager from a different package, but it expects me to have
> the package name “org.apache.spark”.
>
> Thanks,
>
> Nasrulla
>
