Tony-huangweiyi commented on a change in pull request #25949: [SPARK-29273] set the right peakExecutionMemory value when the task end
URL: https://github.com/apache/spark/pull/25949#discussion_r329337420
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/util/JsonProtocol.scala
 ##########
 @@ -647,6 +647,13 @@ private[spark] object JsonProtocol {
     val taskInfo = taskInfoFromJson(json \ "Task Info")
    val executorMetrics = executorMetricsFromJson(json \ "Task Executor Metrics")
     val taskMetrics = taskMetricsFromJson(json \ "Task Metrics")
+    val peakExecutionMemory = taskInfo.accumulables.find(accInfo => {
 
 Review comment:
   hi @vanzin, I updated this PR.
   
   The peakExecutionMemory value was zero when replaying the event log, so I added the metric's value to the SparkListenerTaskEnd JSON info, parse it back when replaying, and set the value the same way the other metrics do. This is a little more natural than the first commit for this PR.
   
   The unit test is the same as JsonProtocolSuite.testTaskMetrics.
   
   Could you please kindly review?
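   To illustrate the idea described above, here is a hedged, self-contained sketch of recovering peakExecutionMemory from a task's accumulables during replay instead of leaving it at zero. The simplified case classes and the metric name below are assumptions for illustration only, not Spark's real types (Spark's actual `AccumulableInfo` and `TaskInfo` live in `org.apache.spark.scheduler`):

```scala
// Minimal stand-ins for Spark's accumulable bookkeeping (assumed shapes,
// simplified from org.apache.spark.scheduler for illustration).
case class AccumulableInfo(name: Option[String], value: Option[Any])
case class TaskInfo(accumulables: Seq[AccumulableInfo])

object PeakMemorySketch {
  // Internal metric name for peak execution memory; assumed here to match
  // the "internal.metrics." naming convention used by Spark.
  val PeakName = "internal.metrics.peakExecutionMemory"

  // Find the peak-execution-memory accumulable on the replayed task, fall
  // back to 0 when the event log predates the metric.
  def peakExecutionMemory(taskInfo: TaskInfo): Long =
    taskInfo.accumulables
      .find(_.name.contains(PeakName)) // locate the accumulable by name
      .flatMap(_.value)                // take its recorded value, if any
      .map(_.toString.toLong)          // accumulable values are untyped
      .getOrElse(0L)                   // absent in old logs: default to zero
}
```

   The `getOrElse(0L)` fallback matters for backward compatibility: event logs written before this change will not contain the accumulable, and replay should degrade to the old behavior rather than fail.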

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org