Hi,

While running a large Spark job, the Spark web UI shows a small fraction of
failed tasks:

26630/536568 (15 failed)

Since all the tasks run the same code, the failures can't be an application
error. The Spark logs don't show any errors either.

- Does Spark retry these tasks? (A small config sketch follows below.)
- Could these be caused by something like hardware failure? Is this rate of
failed tasks normal, or should it be exactly zero?
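
For context, the per-task retry limit appears to be controlled by
spark.task.maxFailures (default 4). Below is a minimal sketch, in Scala, of
how that limit could be raised when building the session, in case that's the
relevant knob; the value 8 is purely illustrative, not a recommendation:

import org.apache.spark.sql.SparkSession

object RetryLimitSketch {
  def main(args: Array[String]): Unit = {
    // spark.task.maxFailures is the number of attempts allowed per task
    // before the stage (and job) is aborted; the default is 4.
    val spark = SparkSession.builder()
      .appName("retry-limit-sketch")
      .config("spark.task.maxFailures", "8") // illustrative value only
      .getOrCreate()

    // Confirm the effective value from the driver side.
    println(spark.sparkContext.getConf.get("spark.task.maxFailures", "4"))

    spark.stop()
  }
}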
