Github user edwinalu commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21221#discussion_r195541136
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala ---
    @@ -234,8 +272,18 @@ private[spark] class EventLoggingListener(
         }
       }
     
    -  // No-op because logging every update would be overkill
    -  override def onExecutorMetricsUpdate(event: SparkListenerExecutorMetricsUpdate): Unit = { }
    +  override def onExecutorMetricsUpdate(event: SparkListenerExecutorMetricsUpdate): Unit = {
    +    if (shouldLogExecutorMetricsUpdates) {
    +      // For the active stages, record any new peak values for the memory metrics for the executor
    +      event.executorUpdates.foreach { executorUpdates =>
    +        liveStageExecutorMetrics.values.foreach { peakExecutorMetrics =>
    +          val peakMetrics = peakExecutorMetrics.getOrElseUpdate(
    +            event.execId, new PeakExecutorMetrics())
    +          peakMetrics.compareAndUpdate(executorUpdates)
    --- End diff --
    
    What would be the right timestamp? Peaks for different metrics could have different timestamps.
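    
    To illustrate the point, here is a minimal sketch (not the actual `PeakExecutorMetrics` in this PR; the class name `PerMetricPeaks` and its methods are hypothetical) in which each metric carries its own peak timestamp, since the peaks for, say, JVM heap and off-heap memory can occur at different updates:
    
    ```scala
    import scala.collection.mutable
    
    // Hypothetical sketch: each metric records the value and the time of its own peak,
    // so there is no single timestamp that describes all peaks at once.
    case class TimestampedPeak(value: Long, timestamp: Long)
    
    class PerMetricPeaks {
      private val peaks = mutable.Map.empty[String, TimestampedPeak]
    
      // Record a batch of metric values observed at `timestamp`;
      // a metric's peak (and its timestamp) only changes when the new value is larger.
      def update(values: Map[String, Long], timestamp: Long): Unit = {
        values.foreach { case (name, v) =>
          peaks.get(name) match {
            case Some(p) if p.value >= v => // existing peak stands
            case _ => peaks(name) = TimestampedPeak(v, timestamp)
          }
        }
      }
    
      def peak(name: String): Option[TimestampedPeak] = peaks.get(name)
    }
    ```
    
    With two updates, `jvmHeap` can peak at the first timestamp while `offHeap` peaks at the second, which is why a single per-update timestamp would be ambiguous for the logged peaks.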


---
