Since I started using Spark 1.2, I've run into an annoying issue with
failing apps that get executed twice. I'm not talking about tasks inside a
job, which should be retried multiple times before the whole app fails.
I'm talking about the whole app: it seems to close the previous Spark
context, start a new one, and rerun the app from the beginning.

This is annoying because the rerun also overwrites the log files, which
makes it hard to troubleshoot the failing app. Does anyone know how to
turn this "feature" off?

Thanks,
Anders
