I think these failed tasks must have been retried automatically if you can't see any errors in your results. Otherwise the entire application would throw a SparkException and abort.
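For context, Spark retries a failed task up to `spark.task.maxFailures` times (default 4) before it gives up and aborts the stage with a SparkException. A minimal sketch of raising that limit at submit time (`your-app.jar` is a placeholder for the actual application jar):

```
# Allow each task up to 8 attempts (default is 4) before the
# stage is aborted with a SparkException. Can also be set in
# spark-defaults.conf.
spark-submit --conf spark.task.maxFailures=8 your-app.jar
```

Note this only papers over transient failures (e.g. flaky executors); a deterministic error in the task will still fail after all retries.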
Unfortunately I don't know how to do this; my application always aborts.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Occasional-failed-tasks-tp527p7259.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.