Repository: spark
Updated Branches:
  refs/heads/branch-2.1 cdf315ba1 -> c46928ff9


[SPARK-18523][PYSPARK] Make SparkContext.stop more reliable

## What changes were proposed in this pull request?

This PR fixes the broken state that SparkContext may be left in when the Spark 
driver process crashes or is killed, e.g. by the OOM killer.

## How was this patch tested?

1. Start a SparkContext;
2. Find the Spark driver JVM process and `kill -9` it;
3. Call `sc.stop()`;
4. Create a new SparkContext.

Without this patch, step 3 raises an exception, and step 4 is impossible 
without manually resetting private attributes or restarting the IPython 
notebook / shell. The steps are sketched in code below.
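
As a concrete illustration, a minimal reproduction sketch of the four steps, assuming a local master; the app names and the placeholder pid are illustrative and not part of the patch:

```python
import os
import signal

from pyspark import SparkContext

sc = SparkContext(master="local[2]", appName="spark-18523-repro")  # step 1

# Step 2: locate the driver JVM pid yourself (e.g. with `jps` or `ps`)
# and send SIGKILL, which is what `kill -9` does.
driver_jvm_pid = 12345  # placeholder pid, not a real value
os.kill(driver_jvm_pid, signal.SIGKILL)

# Step 3: before this patch, stop() raised Py4JError here and left the
# singleton state dirty; after it, a RuntimeWarning is emitted instead.
sc.stop()

# Step 4: a fresh context can now be created without restarting the shell.
sc = SparkContext(master="local[2]", appName="spark-18523-fresh")
```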

Author: Alexander Shorin <kxe...@apache.org>

Closes #15961 from kxepal/18523-make-spark-context-stop-more-reliable.

(cherry picked from commit 71352c94ad2a60d1695bd7ac0f4452539270e10c)
Signed-off-by: Reynold Xin <r...@databricks.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/c46928ff
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/c46928ff
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/c46928ff

Branch: refs/heads/branch-2.1
Commit: c46928ff97371421613720a0d8d7f2baaa64bb73
Parents: cdf315b
Author: Alexander Shorin <kxe...@apache.org>
Authored: Mon Nov 28 18:28:24 2016 -0800
Committer: Reynold Xin <r...@databricks.com>
Committed: Mon Nov 28 18:28:29 2016 -0800

----------------------------------------------------------------------
 python/pyspark/context.py | 17 +++++++++++++++--
 1 file changed, 15 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/c46928ff/python/pyspark/context.py
----------------------------------------------------------------------
diff --git a/python/pyspark/context.py b/python/pyspark/context.py
index 2fd3aee..5c4e79c 100644
--- a/python/pyspark/context.py
+++ b/python/pyspark/context.py
@@ -26,6 +26,8 @@ import warnings
 from threading import RLock
 from tempfile import NamedTemporaryFile
 
+from py4j.protocol import Py4JError
+
 from pyspark import accumulators
 from pyspark.accumulators import Accumulator
 from pyspark.broadcast import Broadcast
@@ -373,8 +375,19 @@ class SparkContext(object):
         Shut down the SparkContext.
         """
         if getattr(self, "_jsc", None):
-            self._jsc.stop()
-            self._jsc = None
+            try:
+                self._jsc.stop()
+            except Py4JError:
+                # Case: SPARK-18523
+                warnings.warn(
+                    'Unable to cleanly shutdown Spark JVM process.'
+                    ' It is possible that the process has crashed,'
+                    ' been killed or may also be in a zombie state.',
+                    RuntimeWarning
+                )
+                pass
+            finally:
+                self._jsc = None
         if getattr(self, "_accumulatorServer", None):
             self._accumulatorServer.shutdown()
             self._accumulatorServer = None
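
For completeness, a hedged sketch of how the post-patch behavior could be observed, continuing the reproduction above (`sc` is the context whose JVM was killed; the `catch_warnings` harness is illustrative, not from the original test plan):

```python
import warnings

from pyspark import SparkContext

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    sc.stop()  # Py4JError from the dead JVM is swallowed; a warning is recorded

assert any(issubclass(w.category, RuntimeWarning) for w in caught)

# The _jsc attribute was reset in the finally block, so a new context
# starts normally.
sc = SparkContext(master="local[2]", appName="spark-18523-recovered")
```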

