Hello,
I'm looking for a detailed description of the shuffle operation in Spark, something that would explain the criteria for assigning blocks to nodes, how many blocks go to each node, what happens under memory constraints, and so on. If anyone knows of such a document I'd appreciate a link (or a detailed answer).
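
For concreteness, below is the kind of operation I have in mind. This is just an illustrative sketch of my own (the sample data, the app name, and the choice of a HashPartitioner are assumptions for the example, not anything taken from Spark's docs); it shows a reduceByKey that forces a shuffle, and how a HashPartitioner decides which reduce partition a given key lands in:

    import org.apache.spark.{SparkConf, SparkContext, HashPartitioner}

    object ShuffleSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("shuffle-sketch").setMaster("local[2]"))

        // A small pair RDD; the data is made up for illustration.
        val pairs = sc.parallelize(Seq(("a", 1), ("b", 2), ("a", 3), ("c", 4)))

        // reduceByKey triggers a shuffle: each map task writes its output
        // split by reduce partition, and each reducer fetches the pieces
        // that belong to its partition.
        val partitioner = new HashPartitioner(4)
        val counts = pairs.reduceByKey(partitioner, _ + _)

        // With a HashPartitioner, the target partition for a key is just
        // its hashCode modulo the number of partitions (wrapped to be
        // non-negative).
        Seq("a", "b", "c").foreach { k =>
          println(s"key $k -> partition ${partitioner.getPartition(k)}")
        }

        println(counts.collect().mkString(", "))
        sc.stop()
      }
    }

What I can't find documented is how those per-partition pieces are actually placed and fetched across the cluster, and what happens to them when executors are memory-constrained.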
Thanks a lot,

Joe

