Github user squito commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22612#discussion_r237978529
  
    --- Diff: core/src/main/scala/org/apache/spark/executor/ProcfsMetricsGetter.scala ---
    @@ -0,0 +1,228 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.executor
    +
    +import java.io._
    +import java.nio.charset.Charset
    +import java.nio.file.{Files, Paths}
    +import java.util.Locale
    +
    +import scala.collection.mutable
    +import scala.collection.mutable.ArrayBuffer
    +import scala.util.Try
    +
    +import org.apache.spark.{SparkEnv, SparkException}
    +import org.apache.spark.internal.{config, Logging}
    +import org.apache.spark.util.Utils
    +
    +
    +private[spark] case class ProcfsMetrics(
    +    jvmVmemTotal: Long,
    +    jvmRSSTotal: Long,
    +    pythonVmemTotal: Long,
    +    pythonRSSTotal: Long,
    +    otherVmemTotal: Long,
    +    otherRSSTotal: Long)
    +
    +// Some of the ideas here are taken from the ProcfsBasedProcessTree class in hadoop
    +// project.
    +private[spark] class ProcfsMetricsGetter(
    +    val procfsDir: String = "/proc/",
    +    val pSizeForTest: Long = 0) extends Logging {
    --- End diff ---
    
    I think it's pretty weird having this constructor argument only used for tests. I'd either (a) always use this argument (and compute the page size in the default constructor), or (b) not make it a constructor argument at all, and just hardcode the value in `computePageSize` when testing (you only ever set it to one value in tests, so we don't need it to be parameterizable beyond that right now).
    
    (b) should be pretty easy.
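    
    For concreteness, here's a rough, standalone sketch of what (b) could look like. The `spark.testing` property check and the `getconf PAGESIZE` fallback are just illustrative assumptions, not necessarily what this PR should end up doing:
    
    ```scala
    import scala.sys.process._
    import scala.util.Try
    
    // Simplified stand-in for the real class: no page-size constructor argument.
    class PageSizeSketch(val procfsDir: String = "/proc/") {
    
      // Stand-in for something like `Utils.isTesting` in the real code.
      private def isTesting: Boolean = sys.props.contains("spark.testing")
    
      val pageSize: Long = computePageSize()
    
      private def computePageSize(): Long = {
        if (isTesting) {
          // Return a fixed value in tests (4096L here is just illustrative).
          4096L
        } else {
          // Ask the OS for the page size; fall back to a common default on failure.
          Try(Seq("getconf", "PAGESIZE").!!.trim.toLong).getOrElse(4096L)
        }
      }
    }
    ```
    
    In the real class you could use something like `Utils.isTesting` for the check, and keep whatever page-size computation you already have in the non-test branch.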


---
