What do you mean by killing the streaming job using the UI? Do you mean
that you are clicking the "kill" link on the Jobs page of the Spark UI?

Also, in the application, is the main thread waiting on
streamingContext.awaitTermination()? That call is designed to catch
exceptions in the running jobs and rethrow them in the main thread, so that
the Java program exits with an exception and a non-zero exit code.
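
For example, a minimal driver might look like this (a sketch; the app name,
batch interval, and the socket source are placeholders for your own setup):

  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  object MyStreamingApp {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf().setAppName("MyStreamingApp")
      val ssc = new StreamingContext(conf, Seconds(10))
      // Placeholder source and output operation; replace with your own.
      val lines = ssc.socketTextStream("localhost", 9999)
      lines.print()
      ssc.start()
      // Blocks the main thread; an exception from a running job is
      // rethrown here, so the driver JVM exits with a non-zero exit code.
      ssc.awaitTermination()
    }
  }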




On Wed, Nov 25, 2015 at 12:57 PM, swetha kasireddy <
swethakasire...@gmail.com> wrote:

> I am killing my Streaming job using the UI. What exit code does the UI
> provide if the job is killed from there?
>
> On Wed, Nov 25, 2015 at 11:01 AM, Kay-Uwe Moosheimer <u...@moosheimer.com>
> wrote:
>
>> Tested with Spark 1.5.2 … Works perfectly when the exit code is non-zero,
>> and does not restart when the exit code equals zero.
>>
>>
>> From: Prem Sure <premsure...@gmail.com>
>> Date: Wednesday, 25 November 2015 19:57
>> To: SRK <swethakasire...@gmail.com>
>> Cc: <user@spark.apache.org>
>> Subject: Re: Automatic driver restart does not seem to be working in
>> Spark Standalone
>>
>> I think automatic driver restart will happen if the driver fails with a
>> non-zero exit code and the job was submitted with:
>>
>>   --deploy-mode cluster
>>   --supervise
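>>
>> For example, a full submission might look like this (a sketch; the master
>> URL, class name, and jar path are placeholders for your own values):
>>
>>   ./bin/spark-submit \
>>     --master spark://master-host:7077 \
>>     --deploy-mode cluster \
>>     --supervise \
>>     --class com.example.MyStreamingApp \
>>     /path/to/my-streaming-app.jar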
>>
>>
>>
>> On Wed, Nov 25, 2015 at 1:46 PM, SRK <swethakasire...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I am submitting my Spark job with the supervise option, as shown below.
>>> When I kill the driver and the app from the UI, the driver does not
>>> restart automatically. This is in cluster mode. Any suggestion on how to
>>> make automatic driver restart work would be of great help.
>>>
>>> --supervise
>>>
>>>
>>> Thanks,
>>> Swetha
>>>
>>>
>>>
>>> --
>>> View this message in context:
>>> http://apache-spark-user-list.1001560.n3.nabble.com/Automatic-driver-restart-does-not-seem-to-be-working-in-Spark-Standalone-tp25478.html
>>>
>>>
>>>
>>
>
