[ https://issues.apache.org/jira/browse/SPARK-13343?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-13343:
------------------------------------

    Assignee: Apache Spark

> speculative tasks that didn't commit shouldn't be marked as success
> -------------------------------------------------------------------
>
>                 Key: SPARK-13343
>                 URL: https://issues.apache.org/jira/browse/SPARK-13343
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.6.0
>            Reporter: Thomas Graves
>            Assignee: Apache Spark
>
> Currently, speculative tasks that didn't commit can show up as successes or
> failures (depending on the timing of the commit). This is confusing because
> such a task didn't really succeed, in the sense that it didn't write anything.
> I think these tasks should be marked as KILLED, or something that makes it
> more obvious to the user exactly what happened. If a task happens to hit the
> timing window where it gets a commit denied exception, it shows up as failed
> and counts against your task failures. It shouldn't count against task
> failures, since that failure really doesn't matter.
> MapReduce handles these situations, so perhaps we can look there for a model.
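As a rough sketch of the proposed behavior, the idea is to translate a
commit-denied speculative attempt into a task end reason that reads as
KILLED and does not count toward the task-failure limit. This is not the
actual Spark patch: TaskEndReason, TaskKilled, ExceptionFailure, and
CommitDeniedException below are simplified stand-ins loosely modeled on
Spark's real classes, and classifyTaskEnd / countsTowardsTaskFailures are
hypothetical helpers for illustration only.

    // Simplified stand-ins for Spark's task end reasons.
    sealed trait TaskEndReason
    case object Success extends TaskEndReason
    final case class TaskKilled(reason: String) extends TaskEndReason
    final case class ExceptionFailure(message: String) extends TaskEndReason

    // Thrown when the commit coordinator denies a commit because another
    // attempt (e.g. the original of a speculative pair) already won.
    class CommitDeniedException(msg: String) extends Exception(msg)

    // A denied commit means another attempt wrote the output, so report
    // this attempt as killed rather than failed.
    def classifyTaskEnd(result: Either[Throwable, Unit]): TaskEndReason =
      result match {
        case Right(_)                       => Success
        case Left(_: CommitDeniedException) =>
          TaskKilled("speculative attempt lost the commit race")
        case Left(t)                        => ExceptionFailure(t.getMessage)
      }

    // Killed speculative duplicates are harmless and should not trip the
    // per-task failure limit (spark.task.maxFailures).
    def countsTowardsTaskFailures(reason: TaskEndReason): Boolean =
      reason match {
        case _: ExceptionFailure => true
        case _                   => false
      }

Treating the losing attempt as killed rather than failed mirrors the
MapReduce model mentioned above, where the redundant speculative attempt
is killed once its sibling commits.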