If you are talking about handling driver crash failures, then all bets are
off anyway! Adding a shutdown hook in the hope of handling driver process
failure handles only some cases (e.g. Ctrl-C), but does not handle cases like
SIGKILL (which does not run JVM shutdown hooks) or a driver machine crash. So
it's not a good idea to rely on that.
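
For reference, the hook-based pattern being discussed is roughly the sketch
below (ssc here is assumed to be the application's StreamingContext). It runs
on normal exit, Ctrl-C and SIGTERM, but never on SIGKILL or when the driver
machine itself goes down:

  // Minimal sketch of a plain JVM shutdown hook -- not a recommendation.
  Runtime.getRuntime().addShutdownHook(new Thread() {
    override def run(): Unit = {
      // Graceful stop, assuming ssc is the application's StreamingContext.
      ssc.stop(stopSparkContext = true, stopGracefully = true)
    }
  })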

Nonetheless I have opened a PR to handle the shutdown of the
StreamingContext in the same way as SparkContext.
https://github.com/apache/spark/pull/6307
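
As Sean suggests further down the thread, the more reliable place to stop the
context is in the driver's main exit path rather than in a hook. A minimal
sketch of that pattern (assuming a standard StreamingContext setup):

  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  val conf = new SparkConf().setAppName("graceful-shutdown-sketch")
  val ssc = new StreamingContext(conf, Seconds(10))
  // ... define input DStreams and output operations here ...
  ssc.start()
  try {
    ssc.awaitTermination()
  } finally {
    // stopGracefully = true lets receivers drain already-received blocks.
    ssc.stop(stopSparkContext = true, stopGracefully = true)
  }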


On Tue, May 19, 2015 at 12:51 AM, Dibyendu Bhattacharya <
dibyendu.bhattach...@gmail.com> wrote:

> Thanks Sean, you are right. If the driver program is running then I can
> handle shutdown in the main exit path. But if the driver machine crashes
> (or you just stop the application, for example by killing the driver
> process), then a shutdown hook is the only option, isn't it? What I am
> trying to say is that just doing ssc.stop in sys.ShutdownHookThread or
> Runtime.getRuntime().addShutdownHook (in Java) won't work anymore. I need
> to use Utils.addShutdownHook with a priority. So I am just checking
> whether Spark Streaming can make graceful shutdown the default shutdown
> mechanism.
>
> Dibyendu
>
> On Tue, May 19, 2015 at 1:03 PM, Sean Owen <so...@cloudera.com> wrote:
>
>> I don't think you should rely on a shutdown hook. Ideally you try to
>> stop it in the main exit path of your program, even in case of an
>> exception.
>>
>> On Tue, May 19, 2015 at 7:59 AM, Dibyendu Bhattacharya
>> <dibyendu.bhattach...@gmail.com> wrote:
>> > You mean to say that within Runtime.getRuntime().addShutdownHook I call
>> > ssc.stop(stopSparkContext = true, stopGracefully = true) ?
>> >
>> > This won't work anymore in 1.4.
>> >
>> > The SparkContext got stopped before the Receiver processed all received
>> > blocks, and I see the exception below in the logs. But if I add the
>> > Utils.addShutdownHook with the priority as I mentioned, then graceful
>> > shutdown does work, because in that case the shutdown hooks run in
>> > priority order.
>> >
>>
>
>
