GitHub user sitalkedia opened a pull request:

    https://github.com/apache/spark/pull/20014

    [SPARK-22827][CORE] Avoid throwing OutOfMemoryError in case of exception in spill

    ## What changes were proposed in this pull request?
    Currently, the task memory manager throws an OutOfMemoryError when an IO exception occurs in spill() - 
    https://github.com/apache/spark/blob/master/core/src/main/java/org/apache/spark/memory/TaskMemoryManager.java#L194.
    Similarly, there are many other places in the code where, if a task fails to acquire memory due to an 
    exception, we throw an OutOfMemoryError, which kills the entire executor and hence fails all the tasks 
    running on that executor, instead of failing just the one task.
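    
    A minimal, self-contained sketch of the failure mode described above (not the actual Spark source; 
    the spill() helper here is hypothetical and fails deliberately). An IOException raised while spilling 
    is rethrown as a plain OutOfMemoryError; since that is a JVM Error rather than a task-level exception, 
    the executor's uncaught-exception handling treats it as fatal and the whole executor dies:
    
        import java.io.IOException;
        
        public class SpillOomSketch {
        
            // Hypothetical spill that fails with an IO error, for illustration only.
            static long spill(long required) throws IOException {
                throw new IOException("No space left on device while spilling");
            }
        
            static long acquireExecutionMemory(long required) {
                try {
                    return spill(required);
                } catch (IOException e) {
                    // Current behavior: an IO failure in one task is escalated to a
                    // JVM-wide OutOfMemoryError, taking down every task on the executor.
                    throw new OutOfMemoryError("error while calling spill(): " + e.getMessage());
                }
            }
        
            public static void main(String[] args) {
                acquireExecutionMemory(64L * 1024 * 1024);
            }
        }
    
    One possible direction (again a sketch, not necessarily the exact change in this patch) is to throw a 
    task-level exception instead, so that only the offending task fails and the executor keeps running.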
    
    
    ## How was this patch tested?
    
    Unit tests


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/sitalkedia/spark skedia/upstream_SPARK-22827

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/20014.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #20014
    
----
commit 91c925795e4ccbe6c9ccf68f99ed8994ebd92a4b
Author: Sital Kedia <ske...@fb.com>
Date:   2017-12-15T07:38:29Z

    [SPARK-22827][CORE] Avoid throwing OutOfMemoryError in case of exception in spill

----


---
