Thomas Graves created SPARK-10781:
-------------------------------------

             Summary: Allow a certain number of failed tasks and allow job to succeed
                 Key: SPARK-10781
                 URL: https://issues.apache.org/jira/browse/SPARK-10781
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
    Affects Versions: 1.5.0
            Reporter: Thomas Graves


MapReduce has the configs mapreduce.map.failures.maxpercent and 
mapreduce.reduce.failures.maxpercent, which allow a certain percentage of 
tasks to fail while the job still succeeds.

This could also be a useful feature in Spark, for jobs that don't need every 
task to be successful.
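A minimal sketch of what such a check could look like, modeled on the MapReduce 
configs above. The class and method names (and any config key implied) are 
hypothetical illustrations, not part of Spark's actual scheduler:

```java
// Hypothetical sketch: deciding whether a job/stage may still be marked
// successful despite some unrecoverable task failures, given a tolerated
// failure percentage (modeled on mapreduce.map.failures.maxpercent).
// FailureTolerance and canSucceed are illustrative names, not Spark APIs.
public final class FailureTolerance {
    public static boolean canSucceed(int totalTasks, int failedTasks,
                                     double maxFailurePercent) {
        if (totalTasks <= 0) {
            throw new IllegalArgumentException("totalTasks must be positive");
        }
        // Percentage of tasks that failed, compared against the tolerance.
        double failedPercent = 100.0 * failedTasks / totalTasks;
        return failedPercent <= maxFailurePercent;
    }
}
```

With a tolerance of 5%, a 100-task stage with 5 failures would still succeed, 
while 6 failures would fail the job as it does today.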




