Have a look at the StageInfo
<https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.scheduler.StageInfo>
class; it has a stageFailed method you could make use of. I don't understand
the point of restarting the entire application.
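In case it helps, here is a minimal sketch (not tested against your setup) of a
SparkListener that inspects StageInfo.failureReason, which stageFailed populates,
after each stage completes. The listener class name and the reaction inside it
are just placeholders for whatever handling you want:

import org.apache.spark.scheduler.{SparkListener, SparkListenerStageCompleted}

// Hypothetical listener; register it with sc.addSparkListener(new StageFailureListener)
class StageFailureListener extends SparkListener {
  override def onStageCompleted(stageCompleted: SparkListenerStageCompleted): Unit = {
    val info = stageCompleted.stageInfo
    // failureReason is set (via stageFailed) only when the stage has failed
    info.failureReason.foreach { reason =>
      println(s"Stage ${info.stageId} (${info.name}) failed: $reason")
      // react here: log, alert, or kick off your own recovery logic
    }
  }
}

That lets you detect and react to a failed stage from within the running app,
rather than restarting the whole thing.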

Thanks
Best Regards

On Tue, Jun 30, 2015 at 2:59 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com> wrote:

> My job has multiple stages, and each time a stage fails I have to restart the
> entire app.
> I understand Spark restarts failed tasks.
>
> However, is there a way to restart a Spark app from a failed stage?
>
> --
> Deepak
>
>
