Yes, I mean killing the Spark job from the UI. Also, I use
context.awaitTermination().
On Wed, Nov 25, 2015 at 6:23 PM, Tathagata Das wrote:
> What do you mean by killing the streaming job using UI? Do you mean that
> you are clicking the "kill" link in the Jobs page in the Spark UI?
What do you mean by killing the streaming job using UI? Do you mean that
you are clicking the "kill" link in the Jobs page in the Spark UI?
Also, in the application, is the main thread waiting on
streamingContext.awaitTermination()? That is designed to catch exceptions
from the running jobs and rethrow them.
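To make such a failure visible to the supervisor, the driver's main thread typically blocks on awaitTermination() and converts any streaming exception into a non-zero exit code. A minimal sketch of that pattern (the object name, batch interval, and error handling are illustrative assumptions, not taken from this thread):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object SupervisedStreamingApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SupervisedStreamingApp")
    val ssc = new StreamingContext(conf, Seconds(10))

    // ... define input streams and output operations here ...

    ssc.start()
    try {
      // Blocks the main thread; exceptions from the running streaming
      // jobs are rethrown here.
      ssc.awaitTermination()
    } catch {
      case e: Throwable =>
        e.printStackTrace()
        // A non-zero exit code is what signals the standalone Master
        // to restart a driver submitted with --supervise.
        sys.exit(1)
    }
  }
}
```

If the exception is swallowed (or the driver exits cleanly), the process ends with exit code zero and, per the test below, no restart occurs.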
Tested with Spark 1.5.2. Works perfectly when the exit code is non-zero,
and does not restart when the exit code equals zero.
From: Prem Sure
Date: Wednesday, 25 November 2015 19:57
To: SRK
Cc:
Subject: Re: Automatic driver
Hi,
I am submitting my Spark job with the supervise option as shown below. When I
kill the driver and the app from the UI, the driver does not restart
automatically. This is in cluster mode. Any suggestion on how to make
automatic driver restart work would be of great help.
--supervise
Thanks,
I think automatic driver restart will happen if the driver fails with a
non-zero exit code.
--deploy-mode cluster
--supervise
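For reference, a complete submit command along those lines might look as follows (the master URL, class name, and jar path are placeholders, not taken from this thread):

```shell
# Submit the driver in cluster mode; with --supervise the standalone
# Master restarts the driver if it exits with a non-zero code.
spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  --supervise \
  --class com.example.SupervisedStreamingApp \
  /path/to/app-assembly.jar
```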
On Wed, Nov 25, 2015 at 1:46 PM, SRK wrote:
> Hi,
>
> I am submitting my Spark job with supervise option as shown below. When I
> kill the driver and the app from UI, the driver does not restart
> automatically.
I am killing my streaming job using the UI. What exit code does the UI
produce if the job is killed from there?
On Wed, Nov 25, 2015 at 11:01 AM, Kay-Uwe Moosheimer
wrote:
> Tested with Spark 1.5.2 … Works perfectly when the exit code is non-zero.
> And does not restart when the exit code equals zero.