This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 1ea156169ce8 [SPARK-48319][PYTHON][CONNECT][TESTS] Test `assert_true` and `raise_error` with the same error class as Spark Classic
1ea156169ce8 is described below

commit 1ea156169ce87dad31c3241bd1ffeb63bc3ac60d
Author: Ruifeng Zheng <ruife...@apache.org>
AuthorDate: Fri May 17 17:03:24 2024 +0900

    [SPARK-48319][PYTHON][CONNECT][TESTS] Test `assert_true` and `raise_error` with the same error class as Spark Classic
    
    ### What changes were proposed in this pull request?
    Test `assert_true` and `raise_error` with the same error class as Spark Classic
    
    ### Why are the changes needed?
    
    https://github.com/apache/spark/commit/578931678f5a6d6b33ebdae4bf866871e46fbb83 made `assert_true` and `raise_error` in Spark Connect throw `SparkRuntimeException`, so the error is now the same as in Spark Classic.
    
    ### Does this PR introduce _any_ user-facing change?
    no, test only
    
    ### How was this patch tested?
    ci
    
    ### Was this patch authored or co-authored using generative AI tooling?
    no
    
    Closes #46633 from zhengruifeng/test_assert_raise.
    
    Authored-by: Ruifeng Zheng <ruife...@apache.org>
    Signed-off-by: Hyukjin Kwon <gurwls...@apache.org>
---
 python/pyspark/sql/tests/connect/test_parity_functions.py | 8 --------
 1 file changed, 8 deletions(-)

diff --git a/python/pyspark/sql/tests/connect/test_parity_functions.py b/python/pyspark/sql/tests/connect/test_parity_functions.py
index 4fa1cf31b3b6..9dfde11ecbcc 100644
--- a/python/pyspark/sql/tests/connect/test_parity_functions.py
+++ b/python/pyspark/sql/tests/connect/test_parity_functions.py
@@ -21,14 +21,10 @@ from pyspark.sql.tests.test_functions import FunctionsTestsMixin
 from pyspark.testing.connectutils import should_test_connect, ReusedConnectTestCase
 
 if should_test_connect:
-    from pyspark.errors.exceptions.connect import SparkConnectException
     from pyspark.sql.connect.column import Column
 
 
 class FunctionsParityTests(FunctionsTestsMixin, ReusedConnectTestCase):
-    def test_assert_true(self):
-        self.check_assert_true(SparkConnectException)
-
     @unittest.skip("Spark Connect does not support Spark Context but the test depends on that.")
     def test_basic_functions(self):
         super().test_basic_functions()
@@ -41,15 +37,11 @@ class FunctionsParityTests(FunctionsTestsMixin, ReusedConnectTestCase):
     def test_input_file_name_reset_for_rdd(self):
         super().test_input_file_name_reset_for_rdd()
 
-    def test_raise_error(self):
-        self.check_raise_error(SparkConnectException)
-
     def test_sorting_functions_with_column(self):
         self.check_sorting_functions_with_column(Column)
 
 
 if __name__ == "__main__":
-    import unittest
     from pyspark.sql.tests.connect.test_parity_functions import *  # noqa: F401
 
     try:

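For readers unfamiliar with the parity-test layout, the overrides deleted above follow a mixin pattern: the base suite defines a shared check parameterized by the expected error class, and a subclass can re-run it with a different class. Once Connect raises the same `SparkRuntimeException` as Classic, the overrides become redundant and can be dropped. Below is a minimal, self-contained sketch of that pattern using stand-in exception and function definitions (not the real PySpark APIs):

```python
import unittest


class SparkRuntimeException(Exception):
    """Stand-in for pyspark.errors.SparkRuntimeException."""


def assert_true(condition):
    """Stand-in for F.assert_true: raise when the condition is false."""
    if not condition:
        raise SparkRuntimeException("assertion failed")


class FunctionsTestsMixin:
    """Shared checks, parameterized by the expected error class."""

    def check_assert_true(self, error_class):
        with self.assertRaises(error_class):
            assert_true(False)

    def test_assert_true(self):
        # Default expectation, shared by Classic and (now) Connect.
        self.check_assert_true(SparkRuntimeException)


class FunctionsParityTests(FunctionsTestsMixin, unittest.TestCase):
    # After SPARK-48319, Connect raises the same error class as Classic,
    # so no override like
    #     def test_assert_true(self):
    #         self.check_assert_true(SparkConnectException)
    # is needed here anymore.
    pass
```

Before the upstream change, the subclass had to override `test_assert_true` to pass a Connect-specific exception class; the deleted lines in the diff are exactly those overrides.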

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
