I'm trying to get a clear idea of how exceptions thrown inside tasks are
handled in Spark. Is there somewhere I can read about this? I'm on Spark 0.7.

For some reason I was under the impression that such exceptions are
swallowed: the value that produced them is ignored and the exception is
logged. However, right now we're seeing the task retried over and over
in an infinite loop, because there's one value that always throws an
exception.
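
Here's a minimal sketch of the kind of job where we see this (the input
and names are made up; the real job is much larger):

    import spark.SparkContext
    import spark.SparkContext._

    object RetryRepro {
      def main(args: Array[String]) {
        val sc = new SparkContext("local", "RetryRepro")
        val data = sc.parallelize(Seq("1", "2", "oops", "4"))
        // "oops" can't be parsed, so the task processing its partition
        // throws NumberFormatException on every attempt.
        val parsed = data.map(_.toInt)
        println(parsed.collect().toSeq)
      }
    }

The only workaround I can think of is to catch the exception inside the
closure and drop the bad records myself, something like:

    // Sketch: skip unparseable records instead of failing the task
    val parsed = data.flatMap { s =>
      try { Some(s.toInt) }
      catch { case e: NumberFormatException => None }
    }

Is that the intended approach, or is there a setting that caps the
number of times a task is retried?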

John
