Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4022#discussion_r23072648
  
    --- Diff: docs/programming-guide.md ---
    @@ -1316,7 +1316,35 @@ For accumulator updates performed inside <b>actions only</b>, Spark guarantees t
     will only be applied once, i.e. restarted tasks will not update the value. In transformations, users should be aware
     that each task's update may be applied more than once if tasks or job stages are re-executed.
     
    +In addition, accumulators do not maintain lineage for the operations that use them. Consequently, accumulator updates are not guaranteed to be executed when made within a lazy transformation like `map()`. Unless something has triggered the evaluation of the lazy transformation that updates the value of the accumulator, subsequent operations will not themselves trigger that evaluation, and the value of the accumulator will remain unchanged. The below code fragment demonstrates this issue:
    --- End diff --
    
    I found this worded a bit confusingly: what would it mean for an accumulator to "maintain lineage"? I think this is from @JoshRosen's PR description, but IMO it might be better to remove that particular phrasing. What about a slight re-wording:
    
    ```
    Accumulators do not change the lazy evaluation model of Spark. Their value is only updated once the RDD in which they are being modified is computed as part of an action. The below code fragment demonstrates this property:
    ```
    
    I also didn't call it an "issue" because it's just a property of how they work; I don't think it's necessarily a bug.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---