Github user squito commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21221#discussion_r207037118
  
    --- Diff: core/src/main/scala/org/apache/spark/status/AppStatusListener.scala ---
    @@ -669,6 +686,34 @@ private[spark] class AppStatusListener(
             }
           }
         }
    +
    +    // check if there is a new peak value for any of the executor level memory metrics
    +    // for the live UI. SparkListenerExecutorMetricsUpdate events are only processed
    +    // for the live UI.
    +    event.executorUpdates.foreach { updates: ExecutorMetrics =>
    +      liveExecutors.get(event.execId).foreach { exec: LiveExecutor =>
    +        if (exec.peakExecutorMetrics.compareAndUpdatePeakValues(updates)) {
    +          maybeUpdate(exec, now)
    +        }
    +      }
    +    }
    +  }
    +
    +  override def onStageExecutorMetrics(executorMetrics: SparkListenerStageExecutorMetrics): Unit = {
    +    val now = System.nanoTime()
    +
    +    // check if there is a new peak value for any of the executor level memory metrics,
    +    // while reading from the log. SparkListenerStageExecutorMetrics are only processed
    +    // when reading logs.
    +    liveExecutors.get(executorMetrics.execId)
    +      .orElse(deadExecutors.get(executorMetrics.execId)) match {
    +      case Some(exec) =>
    --- End diff --
    
    yeah, but you're talking about both a `foreach` *and* an `if` together.
    
    A long time back we discussed using `option.fold` for this, as it is all in one function, but we rejected it as being pretty confusing for most developers.
    
    ```scala
    scala> def foo(x: Option[String]) = x.fold("nada")("some " + _)
    foo: (x: Option[String])String
    
    scala> foo(None)
    res0: String = nada
    
    scala> foo(Some("blah"))
    res1: String = some blah
    ```
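    
    For comparison, here is a sketch of the same logic written three equivalent ways (the `describe*` helper names are made up for illustration) — `fold`, a pattern match, and `map` + `getOrElse`; the pattern match is usually the most readable for developers unfamiliar with `fold`:
    
    ```scala
    // fold: both branches packed into one call; terse but easy to misread,
    // since the None case comes first and the Some case second.
    def describeFold(x: Option[String]): String =
      x.fold("nada")("some " + _)
    
    // pattern match: both cases spelled out explicitly.
    def describeMatch(x: Option[String]): String = x match {
      case Some(s) => "some " + s
      case None    => "nada"
    }
    
    // map + getOrElse: transform the Some case, then supply the default.
    def describeMap(x: Option[String]): String =
      x.map("some " + _).getOrElse("nada")
    ```
    
    All three produce identical results; the objection to `fold` is purely about readability, not correctness.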


---
