[GitHub] spark pull request #23084: [SPARK-26117][CORE][SQL]use SparkOutOfMemoryError...

2018-11-23 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/23084


---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #23084: [SPARK-26117][CORE][SQL]use SparkOutOfMemoryError...

2018-11-19 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/23084#discussion_r234661975
  
--- Diff: core/src/main/java/org/apache/spark/unsafe/map/BytesToBytesMap.java ---
@@ -741,7 +742,7 @@ public boolean append(Object kbase, long koff, int klen, Object vbase, long voff
       if (numKeys >= growthThreshold && longArray.size() < MAX_CAPACITY) {
         try {
           growAndRehash();
-        } catch (OutOfMemoryError oom) {
+        } catch (SparkOutOfMemoryError oom) {
--- End diff --

Do you know what the behavior was before? Will we propagate the exception 
all the way up and kill the executor?


---




[GitHub] spark pull request #23084: [SPARK-26117][CORE][SQL]use SparkOutOfMemoryError...

2018-11-19 Thread heary-cao
GitHub user heary-cao opened a pull request:

https://github.com/apache/spark/pull/23084

[SPARK-26117][CORE][SQL]use SparkOutOfMemoryError instead of 
OutOfMemoryError when catch exception

## What changes were proposed in this pull request?

PR #20014 introduced `SparkOutOfMemoryError` to avoid killing the entire 
executor when an `OutOfMemoryError` is thrown by a single task.
This PR applies the same pattern to code paths that allocate memory via 
`MemoryConsumer.allocatePage`: when catching the exception, catch 
`SparkOutOfMemoryError` instead of `OutOfMemoryError`.
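For context, the distinction the change relies on can be sketched with a 
minimal standalone mock. Spark's real class is 
`org.apache.spark.memory.SparkOutOfMemoryError`, which extends 
`OutOfMemoryError`; the names `MockSparkOutOfMemoryError` and 
`tryGrowAndRehash` below are hypothetical, used only to illustrate why 
narrowing the catch clause matters:

```java
// Hedged sketch, not Spark's actual code: a standalone mock that mirrors
// the SparkOutOfMemoryError-extends-OutOfMemoryError hierarchy.
class MockSparkOutOfMemoryError extends OutOfMemoryError {
    MockSparkOutOfMemoryError(String message) { super(message); }
}

public class OomCatchDemo {

    // Stand-in for growAndRehash(): a task-level allocation failure is
    // reported as the Spark-specific subclass, not a raw JVM OOM.
    static void growAndRehash() {
        throw new MockSparkOutOfMemoryError("Unable to acquire memory for resize");
    }

    // Returns true if the failure was handled locally (resize skipped,
    // executor kept alive), mirroring the narrowed catch in the diff.
    static boolean tryGrowAndRehash() {
        try {
            growAndRehash();
            return false; // grew successfully, nothing to handle
        } catch (MockSparkOutOfMemoryError oom) {
            // Task-level failure: skip the resize and let the caller
            // continue. A genuine JVM OutOfMemoryError would NOT match
            // this clause and would propagate instead, which is the
            // intended behavior for real heap exhaustion.
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println(tryGrowAndRehash()); // prints "true"
    }
}
```

The design point: with the old `catch (OutOfMemoryError oom)`, a genuine 
JVM heap exhaustion would also be swallowed here; narrowing to the Spark 
subclass handles only the failures Spark's memory manager raises 
deliberately.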

## How was this patch tested?
N/A

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/heary-cao/spark SparkOutOfMemoryError

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/23084.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #23084


commit 73459423d0c3a5070a143865bcf6557d5b73f423
Author: caoxuewen 
Date:   2018-11-19T12:53:34Z

use SparkOutOfMemoryError instead of OutOfMemoryError when catch exception




---
