Github user squito commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17422#discussion_r165772194
  
    --- Diff: core/src/main/scala/org/apache/spark/executor/Executor.scala ---
    @@ -429,15 +429,42 @@ private[spark] class Executor(
     
             case t: TaskKilledException =>
              logInfo(s"Executor killed $taskName (TID $taskId), reason: ${t.reason}")
    +
    +          // Collect latest accumulator values to report back to the driver
    +          val accums: Seq[AccumulatorV2[_, _]] =
    +            if (task != null) {
    +              task.metrics.setExecutorRunTime(System.currentTimeMillis() - taskStart)
    +              task.metrics.setJvmGCTime(computeTotalGcTime() - startGCTime)
    +              task.collectAccumulatorUpdates(taskFailed = true)
    +            } else {
    +              Seq.empty
    +            }
    +          val accUpdates = accums.map(acc => acc.toInfo(Some(acc.value), None))
    --- End diff --
    
    This block should be refactored into a shared helper rather than repeated three times.
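
    One way the repeated block could be pulled into a helper is sketched below. This is only an illustration of the reviewer's suggestion, not code from the PR: the helper name `collectAccumulatorUpdates` and its return type are assumptions, and it relies on `Executor`'s existing members (`computeTotalGcTime`, `AccumulatorV2`, `AccumulableInfo`), so it is meant to live inside that class rather than compile standalone.

    ```scala
    // Hypothetical helper (name and signature are illustrative, not from the PR).
    // Centralizes the accumulator-update collection that the diff otherwise
    // repeats in each failure branch of the task-run loop.
    private def collectAccumulatorUpdates(
        task: Task[Any],
        taskStart: Long,
        startGCTime: Long): Seq[AccumulableInfo] = {
      val accums: Seq[AccumulatorV2[_, _]] =
        if (task != null) {
          // Record final timing metrics before collecting accumulators.
          task.metrics.setExecutorRunTime(System.currentTimeMillis() - taskStart)
          task.metrics.setJvmGCTime(computeTotalGcTime() - startGCTime)
          task.collectAccumulatorUpdates(taskFailed = true)
        } else {
          Seq.empty
        }
      accums.map(acc => acc.toInfo(Some(acc.value), None))
    }
    ```

    Each failure case (`TaskKilledException`, fetch failure, generic `Throwable`) could then call this one helper instead of duplicating the null check and metric updates.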


---
