Ngone51 commented on code in PR #47197:
URL: https://github.com/apache/spark/pull/47197#discussion_r1670105206


##########
core/src/main/scala/org/apache/spark/executor/TaskMetrics.scala:
##########
@@ -328,19 +354,16 @@ private[spark] object TaskMetrics extends Logging {
    */
   def fromAccumulators(accums: Seq[AccumulatorV2[_, _]]): TaskMetrics = {
     val tm = new TaskMetrics
-    val externalAccums = new java.util.ArrayList[AccumulatorV2[Any, Any]]()
     for (acc <- accums) {
       val name = acc.name
-      val tmpAcc = acc.asInstanceOf[AccumulatorV2[Any, Any]]
       if (name.isDefined && tm.nameToAccums.contains(name.get)) {
         val tmAcc = tm.nameToAccums(name.get).asInstanceOf[AccumulatorV2[Any, Any]]
         tmAcc.metadata = acc.metadata
-        tmAcc.merge(tmpAcc)
+        tmAcc.merge(acc.asInstanceOf[AccumulatorV2[Any, Any]])
       } else {
-        externalAccums.add(tmpAcc)
+        tm._externalAccums += acc

Review Comment:
   I didn't use the write lock by design: `tm` is a local variable, so in theory there is no thread-safety issue for the write to `tm._externalAccums`. Using the write lock to be defensive is also fine; it is cheap here, since there is no lock contention with other threads.
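   For illustration only, a minimal sketch of the trade-off (the class and method names below are hypothetical stand-ins, not Spark's `TaskMetrics` API): an instance that is still confined to the constructing thread can be appended to without synchronization, while the defensive variant takes an uncontended, and therefore cheap, write lock.

```scala
import java.util.concurrent.locks.ReentrantReadWriteLock
import scala.collection.mutable.ArrayBuffer

// Hypothetical stand-in for TaskMetrics: a buffer guarded by a read-write lock.
class MetricsLike[A] {
  private val rwLock = new ReentrantReadWriteLock()
  private val externalAccums = ArrayBuffer.empty[A]

  // Defensive write: always take the write lock. While the instance is still
  // thread-confined the lock is uncontended, so acquiring it is cheap.
  def addLocked(acc: A): Unit = {
    rwLock.writeLock().lock()
    try externalAccums += acc
    finally rwLock.writeLock().unlock()
  }

  // Unsynchronized write: safe only while the instance is confined to the
  // constructing thread, e.g. a local `tm` built inside fromAccumulators.
  def addUnlocked(acc: A): Unit = externalAccums += acc
}
```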
   
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

