[ https://issues.apache.org/jira/browse/SPARK-13343?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16525898#comment-16525898 ]
Apache Spark commented on SPARK-13343:
--------------------------------------

User 'hthuynh2' has created a pull request for this issue:
https://github.com/apache/spark/pull/21653

> speculative tasks that didn't commit shouldn't be marked as success
> -------------------------------------------------------------------
>
>                 Key: SPARK-13343
>                 URL: https://issues.apache.org/jira/browse/SPARK-13343
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.6.0
>            Reporter: Thomas Graves
>            Priority: Major
>
> Currently, speculative tasks that didn't commit can show up as successes or
> failures (depending on the timing of the commit). This is confusing, because
> such a task didn't really succeed in the sense that it didn't write anything.
> I think these tasks should be marked as KILLED, or something that makes it
> more obvious to the user exactly what happened. If a task happens to hit the
> timing where it gets a commit-denied exception, it shows up as failed and
> counts against your task failures. It shouldn't count against task failures,
> since that failure really doesn't matter.
> MapReduce handles these situations, so perhaps we can look there for a model.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
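The idea in the report above can be sketched out in code. This is a hypothetical, simplified model (not Spark's actual scheduler API): each task-end reason carries a flag saying whether it should count toward the task-failure limit, and a commit-denied speculative task sets that flag to false. The enum names and the `failureIncrement` helper are illustrative assumptions, not Spark identifiers.

```java
// Hypothetical sketch, not Spark's actual code: classify task-end reasons
// so a speculative task that was denied the commit neither reads as a
// success nor counts toward the job's task-failure limit.
public class SpeculationDemo {

    enum TaskEndReason {
        SUCCESS(false),            // committed its output
        COMMIT_DENIED(false),      // speculative duplicate that lost the commit race
        EXCEPTION_FAILURE(true);   // genuine failure: counts toward the retry limit

        final boolean countsTowardFailures;
        TaskEndReason(boolean counts) { this.countsTowardFailures = counts; }
    }

    // How much a finished task attempt adds to the stage's failure count.
    static int failureIncrement(TaskEndReason reason) {
        return reason.countsTowardFailures ? 1 : 0;
    }

    public static void main(String[] args) {
        // A commit-denied speculative attempt does not increment failures;
        // a real exception does.
        System.out.println(failureIncrement(TaskEndReason.COMMIT_DENIED));     // 0
        System.out.println(failureIncrement(TaskEndReason.EXCEPTION_FAILURE)); // 1
    }
}
```

Under this model, the attempt that lost the commit race would be surfaced with its own distinct state (the report suggests KILLED) rather than being folded into success or failure.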