Github user squito commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22612#discussion_r237977304
  
    --- Diff: core/src/main/scala/org/apache/spark/executor/ExecutorMetrics.scala ---
    @@ -49,14 +47,14 @@ class ExecutorMetrics private[spark] extends Serializable {
       }
     
       /**
    -   * Constructor: create the ExecutorMetrics with the values specified.
    +   * Constructor: create the ExecutorMetrics using a given map.
        *
        * @param executorMetrics map of executor metric name to value
        */
       private[spark] def this(executorMetrics: Map[String, Long]) {
         this()
    -    (0 until ExecutorMetricType.values.length).foreach { idx =>
    -      metrics(idx) = executorMetrics.getOrElse(ExecutorMetricType.values(idx).name, 0L)
    +    ExecutorMetricType.metricToOffset.map { m =>
    +      metrics(m._2) = executorMetrics.getOrElse(m._1, 0L)
    --- End diff --
    
    You can use pattern matching here. Also, you're not returning anything from that loop, so `foreach` is more appropriate than `map`.
    
    ```scala
    .foreach { case (name, idx) =>
      metrics(idx) = executorMetrics.getOrElse(name, 0L)
    }
    ```
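    
    For context, the whole constructor would then read roughly like this (just a sketch, assuming `ExecutorMetricType.metricToOffset` maps each metric name to its offset in the `metrics` array, as the diff suggests):
    
    ```scala
    private[spark] def this(executorMetrics: Map[String, Long]) {
      this()
      // Populate the metrics array from the name -> offset mapping,
      // defaulting to 0 for any metric not present in the input map.
      ExecutorMetricType.metricToOffset.foreach { case (name, idx) =>
        metrics(idx) = executorMetrics.getOrElse(name, 0L)
      }
    }
    ```
    
    Using `foreach` also makes the intent clearer: `map` would build and immediately discard a collection of `Unit` results, whereas `foreach` signals that the loop runs purely for its side effect on `metrics`.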

