This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.4 by this push:
     new e1e94ed  [SPARK-32898][2.4][CORE] Fix wrong executorRunTime when task killed before real start
e1e94ed is described below

commit e1e94ed4ef45ef81814f1b920bac0afa52ae06a2
Author: yi.wu <yi...@databricks.com>
AuthorDate: Mon Sep 21 23:20:18 2020 -0700

    [SPARK-32898][2.4][CORE] Fix wrong executorRunTime when task killed before real start
    
    ### What changes were proposed in this pull request?
    
    Calculate executorRunTime only when taskStartTime > 0; otherwise, set executorRunTime to 0.
    
    ### Why are the changes needed?
    
    Bug fix.
    
    It's possible for a task to be killed (e.g., by another successful attempt) before it reaches "taskStartTime = System.currentTimeMillis()". In that case, taskStartTime is still 0 because it has never actually been initialized, so computing System.currentTimeMillis() - taskStartTime yields a wrong executorRunTime.
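    
    For illustration only, here is a minimal, self-contained Scala sketch of the guard described above; the object `RunTimeGuardSketch`, the `Metrics` class, and `reportRunTime` are hypothetical stand-ins, not Spark APIs. It shows that when a task is killed before taskStartTime is set, the reported run time is 0 rather than "now minus epoch".
    
    ```scala
    object RunTimeGuardSketch {
      // Hypothetical stand-in for the relevant part of TaskMetrics.
      final class Metrics { var executorRunTime: Long = 0L }
    
      // Report run time only if the task actually started (taskStartTime > 0).
      def reportRunTime(metrics: Metrics, taskStartTime: Long): Unit = {
        metrics.executorRunTime =
          if (taskStartTime > 0) System.currentTimeMillis() - taskStartTime else 0L
      }
    
      def main(args: Array[String]): Unit = {
        val metrics = new Metrics
    
        // Task killed before it really started: taskStartTime is still 0,
        // so the reported run time must be 0, not System.currentTimeMillis().
        reportRunTime(metrics, taskStartTime = 0L)
        println(s"killed before start -> executorRunTime = ${metrics.executorRunTime}")
    
        // Normal case: task started ~42 ms ago (simulated).
        reportRunTime(metrics, taskStartTime = System.currentTimeMillis() - 42L)
        println(s"normal run          -> executorRunTime ~ ${metrics.executorRunTime} ms")
      }
    }
    ```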
    
    ### Does this PR introduce _any_ user-facing change?
    
    Yes, users will see the correct executorRunTime.
    
    ### How was this patch tested?
    
    Pass existing tests.
    
    Closes #29832 from Ngone51/backport-spark-3289.
    
    Authored-by: yi.wu <yi...@databricks.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 core/src/main/scala/org/apache/spark/executor/Executor.scala | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/core/src/main/scala/org/apache/spark/executor/Executor.scala b/core/src/main/scala/org/apache/spark/executor/Executor.scala
index f7ff0b8..fe57b1c 100644
--- a/core/src/main/scala/org/apache/spark/executor/Executor.scala
+++ b/core/src/main/scala/org/apache/spark/executor/Executor.scala
@@ -337,7 +337,10 @@ private[spark] class Executor(
     private def collectAccumulatorsAndResetStatusOnFailure(taskStartTime: Long) = {
       // Report executor runtime and JVM gc time
       Option(task).foreach(t => {
-        t.metrics.setExecutorRunTime(System.currentTimeMillis() - taskStartTime)
+        t.metrics.setExecutorRunTime(
+          // SPARK-32898: it's possible that a task is killed when taskStartTime has the initial
+          // value(=0) still. In this case, the executorRunTime should be considered as 0.
+          if (taskStartTime > 0) System.currentTimeMillis() - taskStartTime else 0)
         t.metrics.setJvmGCTime(computeTotalGcTime() - startGCTime)
       })
 

