GitHub user rxin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/12472#discussion_r60175666
  
    --- Diff: core/src/main/scala/org/apache/spark/executor/TaskMetrics.scala ---
    @@ -268,23 +243,11 @@ private[spark] class ListenerTaskMetrics(
     }
     
     private[spark] object TaskMetrics extends Logging {
    +  import InternalAccumulator._
     
       def empty: TaskMetrics = new TaskMetrics
     
    -  /**
    -   * Get an accumulator from the given map by name, assuming it exists.
    -   */
    -  def getAccum[T](accumMap: Map[String, Accumulator[_]], name: String): Accumulator[T] = {
    -    require(accumMap.contains(name), s"metric '$name' is missing")
    -    val accum = accumMap(name)
    -    try {
    -      // Note: we can't do pattern matching here because types are erased by compile time
    -      accum.asInstanceOf[Accumulator[T]]
    -    } catch {
    -      case e: ClassCastException =>
    -        throw new SparkException(s"accumulator $name was of unexpected type", e)
    -    }
    -  }
    +  def createAccum[T](name: String): Accumulator[T] = create(name).asInstanceOf[Accumulator[T]]
    --- End diff ---
    
    I'd move the creation of accumulators in here, rather than delegating to InternalAccumulator.
    
    Also maybe just have createLongAccumulator and createCollectionAccumulator; then it becomes obvious at the call site what's going on, and we also don't need conditional branches in create.
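
    For illustration, here is a minimal, self-contained sketch of what those typed helpers could look like. The Accumulator stub, the object names, and the metric names below are simplified assumptions for the example, not Spark's actual classes or API:

        // Simplified stand-in for Spark's Accumulator (not the real class).
        class Accumulator[T](initial: T, val name: String) {
          private var current: T = initial
          def value: T = current
        }

        object TaskMetricsSketch {
          // One factory per accumulator type: the call site states the type
          // explicitly, and no name-based branching or runtime cast is needed.
          def createLongAccumulator(name: String): Accumulator[Long] =
            new Accumulator(0L, name)

          def createCollectionAccumulator(name: String): Accumulator[Seq[AnyRef]] =
            new Accumulator(Seq.empty[AnyRef], name)
        }

        object Example {
          // Example call sites: each metric's type is obvious at a glance,
          // with no asInstanceOf and no branching inside a generic create().
          val bytesRead: Accumulator[Long] =
            TaskMetricsSketch.createLongAccumulator("bytesRead")
          val updatedBlocks: Accumulator[Seq[AnyRef]] =
            TaskMetricsSketch.createCollectionAccumulator("updatedBlockStatuses")
        }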
