[ https://issues.apache.org/jira/browse/SPARK-13343?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Thomas Graves resolved SPARK-13343.
-----------------------------------
    Resolution: Fixed
      Assignee: Hieu Tri Huynh
 Fix Version/s: 2.4.0

> speculative tasks that didn't commit shouldn't be marked as success
> -------------------------------------------------------------------
>
>                 Key: SPARK-13343
>                 URL: https://issues.apache.org/jira/browse/SPARK-13343
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.6.0
>            Reporter: Thomas Graves
>            Assignee: Hieu Tri Huynh
>            Priority: Major
>             Fix For: 2.4.0
>
>         Attachments: Screen Shot 2018-07-08 at 3.49.52 PM.png, image.png, image.png
>
>
> Currently, speculative tasks that did not commit can show up as SUCCESS (depending on the timing of the commit). This is confusing, because such a task did not really succeed in the sense that it did not write anything.
> These tasks should instead be marked as KILLED, or with something that makes it more obvious to the user what actually happened. If a task happens to hit the timing where it gets a commit-denied exception, it shows up as FAILED and counts against your task failures. It should not count against task failures, since that failure does not really matter.
> MapReduce handles these situations, so perhaps we can look there for a model.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
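The behavior change requested in this issue can be sketched as a small classification rule: a speculative attempt that loses the commit race (commit denied) is reported as KILLED and does not count toward the task-failure limit. This is a minimal illustration only — `TaskState`, `classify_task_end`, and the Python `CommitDeniedException` class below are hypothetical names for this sketch, not Spark's actual internal API:

```python
from enum import Enum
from typing import Optional, Tuple


class TaskState(Enum):
    SUCCESS = "SUCCESS"
    FAILED = "FAILED"
    KILLED = "KILLED"


class CommitDeniedException(Exception):
    """Stands in for the exception a task gets when the output commit
    coordinator denies its commit (i.e. another attempt already won)."""


def classify_task_end(committed: bool,
                      error: Optional[Exception]) -> Tuple[TaskState, bool]:
    """Return (final state, counts_against_failure_limit) for a finished task.

    Per the proposal in this issue: a speculative duplicate whose commit
    was denied is reported as KILLED, not FAILED, and does not count
    against the task-failure limit.
    """
    if error is None and committed:
        return TaskState.SUCCESS, False
    if isinstance(error, CommitDeniedException):
        # Proposed behavior: the task wrote nothing, so treat it as
        # killed rather than as a genuine failure.
        return TaskState.KILLED, False
    return TaskState.FAILED, True


# The winning attempt commits and succeeds; the losing speculative
# attempt is denied the commit and is reported as KILLED.
print(classify_task_end(True, None))                      # SUCCESS, no failure counted
print(classify_task_end(False, CommitDeniedException()))  # KILLED, no failure counted
print(classify_task_end(False, RuntimeError("oops")))     # FAILED, failure counted
```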