It's not clear what you mean here. A Spark program is still a program: the
execution order your driver code defines is preserved. What you are not
guaranteed is anything about the order in which concurrent tasks run within
a stage, and failed tasks can be re-executed, so your operations should be
idempotent. I think the answer is 'no', but I'm not sure what scenario you
have in mind.
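
To make that concrete, here is a minimal sketch (not from the original
question; the object name and accumulator name are just illustrative) of
both effects: an order-sensitive reduce whose result can vary from run to
run, and an accumulator updated inside a transformation that can be
over-counted if a task is retried:

import org.apache.spark.sql.SparkSession

object DeterminismSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("determinism-sketch")
      .master("local[4]")
      .getOrCreate()
    val sc = spark.sparkContext

    val nums = sc.parallelize(1 to 1000, numSlices = 8)

    // Deterministic: addition is associative and commutative, so the
    // result is the same no matter which order tasks finish in.
    val total = nums.reduce(_ + _)

    // Order-sensitive: subtraction is neither associative nor commutative,
    // so this value can change across runs depending on how partial
    // results from concurrent tasks happen to be combined.
    val fragile = nums.reduce(_ - _)

    // Retry-sensitive: an accumulator updated inside a transformation can
    // be incremented again if a task is re-executed after a failure or a
    // speculative launch. Spark only guarantees exactly-once accumulator
    // updates inside actions, not transformations.
    val counter = sc.longAccumulator("seen")
    val mapped = nums.map { x => counter.add(1); x }
    mapped.count()

    println(s"total=$total fragile=$fragile seen=${counter.value}")
    spark.stop()
  }
}

If your program sticks to associative, commutative combine functions and
side-effect-free transformations, the cluster's scheduling and retries
cannot change its output.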

On Mon, Jan 24, 2022 at 7:10 AM sam smith <qustacksm2123...@gmail.com>
wrote:

> Hello guys,
>
> I hope my question does not sound weird, but could a Spark execution on a
> Hadoop cluster give different output than the program specifies? By that I
> mean: could the execution order be changed by Hadoop, or an instruction be
> executed twice?
>
> Thanks for your enlightenment
>