Github user Ngone51 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21221#discussion_r187823298
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala ---
    @@ -169,6 +179,27 @@ private[spark] class EventLoggingListener(
     
       // Events that trigger a flush
       override def onStageCompleted(event: SparkListenerStageCompleted): Unit = {
    +    // log the peak executor metrics for the stage, for each executor
    +    val accumUpdates = new ArrayBuffer[(Long, Int, Int, Seq[AccumulableInfo])]()
    +    val executorMap = liveStageExecutorMetrics.remove(
    +      (event.stageInfo.stageId, event.stageInfo.attemptNumber()))
    +    executorMap.foreach {
    +      executorEntry => {
    +        for ((executorId, peakExecutorMetrics) <- executorEntry) {
    --- End diff --
    
    I revisited the code, and I think you're right. My mistake, sorry.
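    For context, the pattern in the diff above is `HashMap.remove` returning an `Option`: `foreach` runs its body only when the (stage, attempt) entry exists, and the inner `for` then walks each executor's peak metrics. Here is a minimal, self-contained sketch of that pattern; the names and values below are illustrative only, not the PR's actual types.

        import scala.collection.mutable

        object PeakMetricsSketch {
          def main(args: Array[String]): Unit = {
            // Hypothetical stand-in for liveStageExecutorMetrics:
            // (stageId, stageAttempt) -> (executorId -> peak metric value)
            val liveStageMetrics = mutable.HashMap(
              (1, 0) -> Map("exec-1" -> 100L, "exec-2" -> 250L))

            // remove() returns an Option; foreach runs the body only if the
            // (stage, attempt) key was present, and the inner for-comprehension
            // then visits every (executorId, peak) entry in that map.
            val executorMap = liveStageMetrics.remove((1, 0))
            executorMap.foreach { executorEntry =>
              for ((executorId, peak) <- executorEntry) {
                println(s"stage 1.0: executor $executorId peak = $peak")
              }
            }
          }
        }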

