Ngone51 commented on code in PR #47197:
URL: https://github.com/apache/spark/pull/47197#discussion_r1663648767


##########
core/src/main/scala/org/apache/spark/executor/TaskMetrics.scala:
##########
@@ -264,12 +264,37 @@ class TaskMetrics private[spark] () extends Serializable {
   /**
    * External accumulators registered with this task.
    */
-  @transient private[spark] lazy val _externalAccums = new CopyOnWriteArrayList[AccumulatorV2[_, _]]
+  @transient private[spark] lazy val _externalAccums = new ArrayBuffer[AccumulatorV2[_, _]]
 
-  private[spark] def externalAccums = _externalAccums.asScala
+  private[spark] def externalAccums = withReadLock {
+    _externalAccums
+  }

Review Comment:
   b) sounds good to me.
   
   Just a bit of concern about "not use externalAccums directly": does it
   mean that all usages of `accumulators()` should be written within
   `TaskMetrics` instead?
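
For context, a minimal, self-contained sketch of the lock-guarded accessor
pattern the diff introduces. The lock field, the class/method names, and the
snapshot copy in `externalAccums` below are illustrative assumptions, not the
actual patch (the diff above returns the buffer itself under `withReadLock`);
copying out a snapshot is one way to keep callers from touching the buffer
unguarded, which is what "not use externalAccums directly" seems to be about:

```scala
import java.util.concurrent.locks.ReentrantReadWriteLock
import scala.collection.mutable.ArrayBuffer
import org.apache.spark.util.AccumulatorV2

class GuardedAccums {
  // Lock guarding every access to the underlying buffer.
  private val accumLock = new ReentrantReadWriteLock()

  // ArrayBuffer is not thread-safe, so it is only touched under the lock.
  private val _externalAccums = new ArrayBuffer[AccumulatorV2[_, _]]

  private def withReadLock[T](body: => T): T = {
    accumLock.readLock().lock()
    try body finally accumLock.readLock().unlock()
  }

  private def withWriteLock[T](body: => T): T = {
    accumLock.writeLock().lock()
    try body finally accumLock.writeLock().unlock()
  }

  // Reads hand out an immutable snapshot, so callers never hold the raw
  // buffer outside the lock.
  def externalAccums: Seq[AccumulatorV2[_, _]] = withReadLock {
    _externalAccums.toSeq
  }

  // Mutations go through the write lock.
  def registerAccumulator(a: AccumulatorV2[_, _]): Unit = withWriteLock {
    _externalAccums += a
  }
}
```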



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

