Github user cloud-fan commented on the issue:

    https://github.com/apache/spark/pull/21311
  
    > Calculate the new size simply by multiplying by 2.
    > At this time, the requested size may not be enough to store the data;
    > some data is lost and the data read out is dirty.
    
    Can you explain this in more detail? IIUC, if we don't have enough memory
    for `size * 2`, we would just fail with an OOM instead of setting a wrong size.
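
    For reference, a minimal sketch of the doubling-growth pattern being
    discussed (illustrative names only, not the actual Spark `BufferHolder`
    code): the doubled allocation either succeeds in full or the JVM throws
    `OutOfMemoryError`; there is no path that leaves the buffer at a wrong,
    smaller size.

    ```scala
    object GrowByDoubling {
      // Minimal sketch of the "multiply by 2" growth policy under discussion.
      def grow(buf: Array[Byte], neededBytes: Int): Array[Byte] = {
        val required = buf.length.toLong + neededBytes
        // Double until large enough; use Long so the doubling itself cannot
        // overflow Int and silently produce a wrong (smaller) size.
        var newSize = math.max(buf.length.toLong, 64L)
        while (newSize < required) newSize *= 2
        require(newSize <= Int.MaxValue, s"cannot grow buffer to $newSize bytes")
        // The allocation below either returns the full newSize bytes or the
        // JVM throws OutOfMemoryError; the buffer is never left too small.
        val newBuf = new Array[Byte](newSize.toInt)
        System.arraycopy(buf, 0, newBuf, 0, buf.length)
        newBuf
      }
    }
    ```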

