wypoon commented on a change in pull request #29020:
URL: https://github.com/apache/spark/pull/29020#discussion_r460350116



##########
File path: core/src/main/scala/org/apache/spark/status/AppStatusListener.scala
##########
@@ -868,13 +868,17 @@ private[spark] class AppStatusListener(
     // check if there is a new peak value for any of the executor level memory metrics
     // for the live UI. SparkListenerExecutorMetricsUpdate events are only processed
     // for the live UI.
-    event.executorUpdates.foreach { case (_, peakUpdates) =>
+    event.executorUpdates.foreach { case (key, peakUpdates) =>
       liveExecutors.get(event.execId).foreach { exec =>
         if (exec.peakExecutorMetrics.compareAndUpdatePeakValues(peakUpdates)) {
           maybeUpdate(exec, now)

Review comment:
       I am not so familiar with `AppStatusListener`, so please educate me.
From what I see, `maybeUpdate` is called for most events; I assume that is
because we only want to update a live application's state if it hasn't been
updated within the configured period. The `SparkListenerExecutorMetricsUpdate`
event here is processed only for a live application. The
`SparkListenerStageExecutorMetrics` event, according to Edwina's comment, is
only processed when reading event logs, so `update` is called in that case.
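
       To make sure I'm reading the distinction correctly, here is a minimal,
self-contained sketch of what I understand the two calls to do: `update` writes
unconditionally, while `maybeUpdate` throttles writes against a configured
live-update period. The `LiveEntity` stand-in, `liveUpdatePeriodNs`, and
`lastWriteTime` names below are illustrative assumptions for the sketch, not
the actual `AppStatusListener` code:

```scala
// Illustrative sketch only -- not the actual AppStatusListener code.
object UpdateVsMaybeUpdateSketch {

  // Stand-in for a live entity that remembers when it was last written out.
  final class LiveEntity(val name: String) {
    var lastWriteTime: Long = -1L
    def write(now: Long): Unit = {
      lastWriteTime = now
      println(s"wrote $name at $now ns")
    }
  }

  // Assumed config: minimum interval (ns) between writes for the live UI.
  val liveUpdatePeriodNs: Long = 100L * 1000 * 1000 // 100 ms

  // Unconditional write, e.g. when replaying event logs.
  def update(entity: LiveEntity, now: Long): Unit = entity.write(now)

  // Throttled write for the live UI: skip if the entity was written recently.
  def maybeUpdate(entity: LiveEntity, now: Long): Unit = {
    if (now - entity.lastWriteTime > liveUpdatePeriodNs) {
      update(entity, now)
    }
  }

  def main(args: Array[String]): Unit = {
    val exec = new LiveEntity("executor-1")
    val t0 = System.nanoTime()
    maybeUpdate(exec, t0)                      // writes: never written before
    maybeUpdate(exec, t0 + 10L * 1000 * 1000)  // skipped: within the period
    maybeUpdate(exec, t0 + 200L * 1000 * 1000) // writes: period has elapsed
    update(exec, t0 + 201L * 1000 * 1000)      // always writes, regardless of period
  }
}
```

       If that reading is correct, the choice between the two calls here is
essentially a throttling decision for the live UI versus an always-write
decision for event-log replay, which is why I'd like to confirm which one is
intended.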






