[ https://issues.apache.org/jira/browse/SPARK-27630?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17046245#comment-17046245 ]
Xiao Li commented on SPARK-27630:
---------------------------------

This change breaks the API:
{code:java}
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.scheduler.SparkListenerSpeculativeTaskSubmitted.apply"),
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.scheduler.SparkListenerSpeculativeTaskSubmitted.copy"),
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.scheduler.SparkListenerSpeculativeTaskSubmitted.this"),
ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.scheduler.SparkListenerSpeculativeTaskSubmitted$"),
{code}

> Stage retry causes totalRunningTasks calculation to be negative
> ----------------------------------------------------------------
>
>                 Key: SPARK-27630
>                 URL: https://issues.apache.org/jira/browse/SPARK-27630
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: dzcxzl
>            Assignee: dzcxzl
>            Priority: Minor
>              Labels: release-notes
>             Fix For: 3.0.0
>
>
> Track tasks separately for each stage attempt (instead of tracking by stage),
> and do NOT reset the numRunningTasks to 0 on StageCompleted.
> In the case of a stage retry, the {{taskEnd}} event from the zombie stage
> sometimes makes the number of {{totalRunningTasks}} negative, which will
> cause the job to get stuck.
> A similar problem also exists with {{stageIdToTaskIndices}} &
> {{stageIdToSpeculativeTaskIndices}}.
> If it is a failed {{taskEnd}} event of the zombie stage, it will cause
> {{stageIdToTaskIndices}} or {{stageIdToSpeculativeTaskIndices}} to remove the
> task index of the active stage, and the number of {{totalPendingTasks}} will
> increase unexpectedly.
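The MiMa exclusions quoted above are consistent with the event case class having gained an extra constructor parameter (presumably a stage-attempt field, which is an assumption here, not confirmed by this thread). The sketch below is not the actual Spark source; it only illustrates why adding a field to a case class, even with a default value, trips {{DirectMissingMethodProblem}} and {{MissingTypesProblem}}:

{code:java}
// Minimal sketch with assumed shapes, showing the binary-compatibility break.
object BeforeChange {
  case class SparkListenerSpeculativeTaskSubmitted(stageId: Int)
  // Synthesized members: apply(Int), copy(Int), <init>(Int);
  // the companion object extends AbstractFunction1[Int, ...].
}

object AfterChange {
  case class SparkListenerSpeculativeTaskSubmitted(stageId: Int, stageAttemptId: Int = 0)
  // Synthesized members become: apply(Int, Int), copy(Int, Int), <init>(Int, Int);
  // the companion object now extends AbstractFunction2[Int, Int, ...].
  // Code compiled against the old shape no longer links at runtime, hence the
  // apply/copy/this and companion-object exclusions listed in the comment above.
}
{code}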
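The description says tasks are tracked separately per stage attempt and that running-task counters are no longer reset on {{StageCompleted}}. The following is an illustrative, simplified sketch of that idea only (the class and method names are invented for the example, not the actual {{ExecutorAllocationManager}} code): keying the bookkeeping by (stageId, stageAttemptId) means a {{taskEnd}} from a zombie attempt can never drive the active attempt's count negative.

{code:java}
import scala.collection.mutable

// Sketch: per-attempt running-task bookkeeping (names are hypothetical).
class RunningTaskTracker {
  private case class StageAttempt(stageId: Int, stageAttemptId: Int)

  private val runningTasksPerAttempt = mutable.Map[StageAttempt, Int]()

  def onTaskStart(stageId: Int, stageAttemptId: Int): Unit = {
    val key = StageAttempt(stageId, stageAttemptId)
    runningTasksPerAttempt(key) = runningTasksPerAttempt.getOrElse(key, 0) + 1
  }

  def onTaskEnd(stageId: Int, stageAttemptId: Int): Unit = {
    val key = StageAttempt(stageId, stageAttemptId)
    // Only the attempt the task actually belonged to is decremented; a zombie
    // attempt's taskEnd never touches the active attempt's counter.
    runningTasksPerAttempt.get(key).foreach { n =>
      if (n <= 1) runningTasksPerAttempt.remove(key)
      else runningTasksPerAttempt(key) = n - 1
    }
  }

  // Nothing is reset to 0 on stage completion; each attempt's counter drains
  // naturally as its remaining taskEnd events arrive.
  def totalRunningTasks: Int = runningTasksPerAttempt.values.sum
}
{code}

The same keying would also keep a zombie attempt's failed {{taskEnd}} from removing task indices that belong to the active attempt, which is the {{stageIdToTaskIndices}} / {{stageIdToSpeculativeTaskIndices}} issue mentioned above.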