thejdeep commented on pull request #34607:
URL: https://github.com/apache/spark/pull/34607#issuecomment-976532846


   > > I changed maybeUpdate to update when we encounter a speculative task. The problem with the previous approach was that, if we have a speculative task, then all future task end events would make unnecessary speculation summary writes to the DB. What do you think?
   > 
   > How about changing `speculationStageSummary` in `LiveStage` to `var speculationStageSummary: Option[LiveSpeculationStageSummary]`? Then we can check whether there is any update to the speculation summary: if it's `Some`, we can update it with `maybeUpdate`.
   
   @sarutak I would rather not do that, since it would require changing a class `val` to a `var`. I am assuming this is the change you suggested:
   ```
   AppStatusListener.scala

         if (event.taskInfo.speculative) {
           val speculationStageSummary = stage.speculationStageSummary.getOrElse(
             new LiveSpeculationStageSummary(event.stageId, event.stageAttemptId))
           stage.speculationStageSummary = Some(speculationStageSummary)
           speculationStageSummary.numActiveTasks -= 1
           speculationStageSummary.numCompletedTasks += completedDelta
           speculationStageSummary.numFailedTasks += failedDelta
           speculationStageSummary.numKilledTasks += killedDelta
         }

         stage.speculationStageSummary.foreach(maybeUpdate(_, now))
   ```
   and 
   ```
   LiveEntity.scala
   
   var speculationStageSummary: Option[LiveSpeculationStageSummary] = None
   ```
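   For context, here is a self-contained sketch of how that Option-gated pattern would behave (illustrative code only, not Spark's actual classes; the class and method names in it are made up): a stage that never sees a speculative task never allocates a summary and skips the write entirely, while a stage that has seen one keeps flushing its summary on later task end events.
   ```
   // Standalone sketch (hypothetical names, not Spark's real classes) of the
   // Option-gated speculation summary pattern discussed above.
   object SpeculationSummarySketch {
     final class SpeculationSummary(val stageId: Int) {
       var numActiveTasks: Int = 0
       var numCompletedTasks: Int = 0
     }

     final class Stage(val stageId: Int) {
       // Mirrors the proposed LiveEntity.scala change: an Option instead of an eager val.
       var speculationStageSummary: Option[SpeculationSummary] = None
     }

     // Stand-in for maybeUpdate: prints instead of writing to the status store.
     def maybeUpdate(summary: SpeculationSummary, now: Long): Unit =
       println(s"stage ${summary.stageId}: write speculation summary at $now")

     def onTaskEnd(stage: Stage, speculative: Boolean, now: Long): Unit = {
       if (speculative) {
         // Create the summary lazily, only for stages with speculative tasks.
         val summary = stage.speculationStageSummary
           .getOrElse(new SpeculationSummary(stage.stageId))
         stage.speculationStageSummary = Some(summary)
         summary.numActiveTasks -= 1
         summary.numCompletedTasks += 1
       }
       // Stages that never saw a speculative task skip the write entirely.
       stage.speculationStageSummary.foreach(maybeUpdate(_, now))
     }

     def main(args: Array[String]): Unit = {
       val stage = new Stage(1)
       onTaskEnd(stage, speculative = false, now = 1L) // no summary yet, no write
       onTaskEnd(stage, speculative = true, now = 2L)  // summary created and written
       onTaskEnd(stage, speculative = false, now = 3L) // summary exists, written again
     }
   }
   ```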
   What do you think?

