Repository: spark
Updated Branches:
  refs/heads/branch-2.2 0848df1bb -> 4304d0bf0


[SPARK-21950][SQL][PYTHON][TEST] pyspark.sql.tests.SQLTests2 should stop 
SparkContext.

## What changes were proposed in this pull request?

`pyspark.sql.tests.SQLTests2` doesn't stop the newly created SparkContext in the 
test, which might affect subsequent tests.
This pr makes `pyspark.sql.tests.SQLTests2` stop the `SparkContext` it creates.
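
The fix wraps the test body in try/finally so the context is stopped even when the body raises. A minimal sketch of that pattern, using a hypothetical `FakeContext` stand-in (not part of PySpark) so it runs without a Spark installation:

```python
# FakeContext is a stand-in for SparkContext; only `stop()` matters here.
class FakeContext:
    def __init__(self):
        self.stopped = False

    def stop(self):
        self.stopped = True


def run_test_body(ctx, should_raise=False):
    try:
        if should_raise:
            raise ValueError("test body failed")
    finally:
        # Mirrors the patch: stop the context in `finally` so a failure
        # in the body cannot leak a live context into following tests.
        ctx.stop()


ctx = FakeContext()
run_test_body(ctx)
print(ctx.stopped)  # stopped on the success path

ctx2 = FakeContext()
try:
    run_test_body(ctx2, should_raise=True)
except ValueError:
    pass
print(ctx2.stopped)  # ...and also when the body raises
```

Without the `finally`, a failing assertion would leave a live context behind, and later tests that call `SparkContext(...)` would collide with it.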

## How was this patch tested?

Existing tests.

Author: Takuya UESHIN <ues...@databricks.com>

Closes #19158 from ueshin/issues/SPARK-21950.

(cherry picked from commit 57bc1e9eb452284cbed090dbd5008eb2062f1b36)
Signed-off-by: Takuya UESHIN <ues...@databricks.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/4304d0bf
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/4304d0bf
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/4304d0bf

Branch: refs/heads/branch-2.2
Commit: 4304d0bf05eb51c13ae1b9ee9a2970a945b51cac
Parents: 0848df1
Author: Takuya UESHIN <ues...@databricks.com>
Authored: Fri Sep 8 14:26:07 2017 +0900
Committer: Takuya UESHIN <ues...@databricks.com>
Committed: Fri Sep 8 14:26:23 2017 +0900

----------------------------------------------------------------------
 python/pyspark/sql/tests.py | 8 ++++++--
 1 file changed, 6 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/4304d0bf/python/pyspark/sql/tests.py
----------------------------------------------------------------------
diff --git a/python/pyspark/sql/tests.py b/python/pyspark/sql/tests.py
index 20d9ca2..a100dc0 100644
--- a/python/pyspark/sql/tests.py
+++ b/python/pyspark/sql/tests.py
@@ -2252,8 +2252,12 @@ class SQLTests2(ReusedPySparkTestCase):
         self.sc.stop()
         sc = SparkContext('local[4]', self.sc.appName)
         spark = SparkSession.builder.getOrCreate()
-        df = spark.createDataFrame([(1, 2)], ["c", "c"])
-        df.collect()
+        try:
+            df = spark.createDataFrame([(1, 2)], ["c", "c"])
+            df.collect()
+        finally:
+            spark.stop()
+            sc.stop()
 
 
 class UDFInitializationTests(unittest.TestCase):

