Is it possible to restart a job from the last successful stage instead of
from the beginning?

For example, if a job has stages 0, 1, and 2, and stage 0 takes a long time
and completes successfully but the job then fails on stage 1, it would be
useful to be able to restart from the output of stage 0 instead of from the
beginning.

Note that I am NOT talking about Spark Streaming, just Spark Core (and
DataFrames); I'm not sure whether the situation is different with Streaming.

Thanks.
