GitHub user carsonwang commented on the issue:

    https://github.com/apache/spark/pull/22907
  
    What if there is a FetchFailure and Spark reruns some tasks from the 
previously succeeded shuffle map stage? That will be a new ShuffleMapStage 
attempt, and we will still double count the accumulators, right?
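
    A minimal sketch of the scenario in question (not code from the PR; the 
accumulator name, the job shape, and the local-mode setup are all 
illustrative). An accumulator updated inside a shuffle map stage is applied 
once per completed task, so if a FetchFailure forces the scheduler to 
resubmit that stage and rerun tasks that already succeeded, their updates 
land a second time:

```scala
import org.apache.spark.sql.SparkSession

object AccumulatorDoubleCountSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("accumulator-double-count-sketch")
      .master("local[2]")
      .getOrCreate()
    val sc = spark.sparkContext

    // Updated once per record processed in the shuffle map stage.
    val recordCount = sc.longAccumulator("recordCount")

    val counted = sc.parallelize(1 to 100, numSlices = 4)
      .map { x => recordCount.add(1); (x % 10, x) } // shuffle map stage
      .reduceByKey(_ + _)                           // result stage

    counted.collect()
    // On a clean run, recordCount.value == 100. If a FetchFailure makes
    // the scheduler resubmit the map stage and rerun some of its already
    // succeeded tasks, each rerun task applies its updates again and the
    // total is inflated.
    println(s"recordCount = ${recordCount.value}")
    spark.stop()
  }
}
```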

