I don't think you should rely on a shutdown hook. Ideally you stop it
in the main exit path of your program (for example, in a try/finally),
so that it gets stopped even when an exception is thrown.
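A minimal sketch of that pattern. `FakeStreamingContext` is a stand-in so the
snippet is self-contained; a real program would hold an
`org.apache.spark.streaming.StreamingContext` instead:

```scala
// Stand-in for a real StreamingContext (assumption: real code would use
// org.apache.spark.streaming.StreamingContext).
class FakeStreamingContext {
  var stopped = false
  def stop(stopSparkContext: Boolean, stopGracefully: Boolean): Unit =
    stopped = true
}

object ExitPathDemo {
  val ssc = new FakeStreamingContext

  def run(fail: Boolean): Unit =
    try {
      if (fail) throw new RuntimeException("job failed")
      // normal streaming work would go here
    } finally {
      // The stop call sits on the main exit path, so it runs whether
      // we return normally or unwind with an exception.
      ssc.stop(stopSparkContext = true, stopGracefully = true)
    }
}
```

Even when `run(fail = true)` throws, the `finally` block still stops the
context before the exception propagates out of `run`.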

On Tue, May 19, 2015 at 7:59 AM, Dibyendu Bhattacharya
<dibyendu.bhattach...@gmail.com> wrote:
> You mean to say within Runtime.getRuntime().addShutdownHook I call
> ssc.stop(stopSparkContext = true, stopGracefully = true)?
>
> This won't work anymore in 1.4.
>
> The SparkContext got stopped before the Receiver had processed all
> received blocks, and I see the exception below in the logs. But if I add
> Utils.addShutdownHook with the priority I mentioned, then graceful
> shutdown does work, because shutdown hooks run in priority order.
>
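To illustrate why the priority matters, here is a simplified, self-contained
model of a priority-ordered shutdown hook manager, in the spirit of Spark's
internal Utils.addShutdownHook. This is not Spark's actual implementation,
and the priority values (51 vs. 50) are illustrative assumptions; the point
is only that the streaming stop must be ordered before the SparkContext stop:

```scala
import scala.collection.mutable

// Simplified sketch (NOT Spark's code): hooks registered with a higher
// priority run before hooks registered with a lower priority.
class PriorityHooks {
  private val hooks = mutable.ArrayBuffer.empty[(Int, () => Unit)]

  def addShutdownHook(priority: Int)(hook: () => Unit): Unit =
    hooks += ((priority, hook))

  // A real manager would register a single JVM shutdown hook that calls
  // this; here we call it directly so the ordering is observable.
  def runAll(): Unit =
    hooks.sortBy(h => -h._1).foreach { case (_, h) => h() }
}

object PriorityDemo {
  def run(): List[String] = {
    val mgr   = new PriorityHooks
    val order = mutable.ArrayBuffer.empty[String]
    // Hypothetical priorities: the graceful streaming stop is given a
    // higher priority than the SparkContext stop, so the receiver can
    // drain its blocks before the context goes away.
    mgr.addShutdownHook(priority = 51)(() => order += "stop streaming gracefully")
    mgr.addShutdownHook(priority = 50)(() => order += "stop SparkContext")
    mgr.runAll()
    order.toList
  }
}
```

Running `PriorityDemo.run()` yields the streaming stop first, then the
SparkContext stop, which is the ordering the graceful shutdown depends on.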
