HeartSaVioR edited a comment on pull request #30139:
URL: https://github.com/apache/spark/pull/30139#issuecomment-718410305


   I went back to the related configuration 
`spark.shuffle.maxChunksBeingTransferred` and realized its default value is 
`Long.MAX_VALUE`, which effectively disables the limit by default.
   
   That leads me to wonder: does the value (`numChunksBeingTransferred`) need 
to be accurate at any specific point in time? Do we really need to make both 
updates happen in the same critical section? Can we tolerate the skew and 
allow the value to be eventually consistent?
   (If the answer is yes, the initial version of the patch becomes OK.)
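   
   To make the question concrete, here's a minimal sketch (my own 
illustration, not the actual Spark code; `streams` and 
`chunksBeingTransferred` stand in for the two pieces of state in question) 
contrasting the two options:
   
   ```java
   import java.util.concurrent.ConcurrentHashMap;
   import java.util.concurrent.atomic.AtomicLong;
   
   public class ChunkTracker {
     private final ConcurrentHashMap<Long, Long> streams = new ConcurrentHashMap<>();
     private final AtomicLong chunksBeingTransferred = new AtomicLong(0);
   
     // Option A: one critical section. Readers that take the same lock
     // always see the stream map and the counter in agreement.
     public synchronized void chunkSentStrict(long streamId) {
       streams.merge(streamId, 1L, Long::sum);
       chunksBeingTransferred.incrementAndGet();
     }
   
     // Option B: sequential lock-free updates. A reader may briefly observe
     // the map updated while the counter lags (or vice versa), but both
     // updates always run, so the values converge (eventual consistency).
     public void chunkSentRelaxed(long streamId) {
       streams.merge(streamId, 1L, Long::sum);
       chunksBeingTransferred.incrementAndGet();
     }
   
     public long chunksBeingTransferred() {
       return chunksBeingTransferred.get();
     }
   }
   ```
   
   With the default limit of `Long.MAX_VALUE`, the check against 
`spark.shuffle.maxChunksBeingTransferred` can never trip, so a transiently 
stale counter has no observable effect; even with a configured limit, the 
skew only shifts the throttling point by the handful of updates in flight.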
   
   Regarding the concern about the two values diverging: I'd guess the 
updates to both would happen anyway unless there's a bug (I agree that's a 
tricky thing to guarantee), since we perform both updates sequentially in any 
case. Please correct me if I'm missing something.
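   
   For completeness: under option B, the one way I can see the two values 
diverging permanently is an exception thrown between the two sequential 
updates. Continuing the `ChunkTracker` sketch above (again, a hypothetical 
guard, not the actual patch), rolling back the first update on failure closes 
that window:
   
   ```java
     // Hypothetical guard for option B: if an exception between the two
     // sequential updates is the only way the values can permanently diverge,
     // undoing the first update on failure closes that window.
     public void chunkSentGuarded(long streamId) {
       chunksBeingTransferred.incrementAndGet();
       try {
         streams.merge(streamId, 1L, Long::sum);
       } catch (RuntimeException e) {
         chunksBeingTransferred.decrementAndGet(); // undo to stay consistent
         throw e;
       }
     }
   ```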


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
