This is an automated email from the ASF dual-hosted git repository.
ruifengz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 64785022e701 [MINOR][PYTHON][CONNECT][TESTS] Add `@timeout` to fail suspicious tests
64785022e701 is described below
commit 64785022e7019a5f39f913cd311fd05e63e550b4
Author: Ruifeng Zheng <[email protected]>
AuthorDate: Thu Mar 27 11:53:09 2025 +0800
[MINOR][PYTHON][CONNECT][TESTS] Add `@timeout` to fail suspicious tests
### What changes were proposed in this pull request?
Add `@timeout` to fail suspicious tests
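For context, a test-timeout decorator like the one applied in this patch can be sketched as follows. This is a minimal illustration, not Spark's actual implementation (the real decorator is imported from PySpark's testing utilities); it assumes a Unix platform, since it relies on `SIGALRM`:

```python
import functools
import signal


def timeout(seconds):
    """Fail the wrapped function with TimeoutError if it exceeds `seconds` (Unix only)."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            def handler(signum, frame):
                raise TimeoutError(f"{func.__name__} timed out after {seconds}s")

            # Install an alarm handler, arm the timer, and always restore state.
            old_handler = signal.signal(signal.SIGALRM, handler)
            signal.alarm(seconds)
            try:
                return func(*args, **kwargs)
            finally:
                signal.alarm(0)
                signal.signal(signal.SIGALRM, old_handler)
        return wrapper
    return decorator


@timeout(1)
def quick():
    return "done"
```

With this in place, a hanging test raises `TimeoutError` and fails promptly instead of stalling the whole CI job until the job-level timeout kills it.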
### Why are the changes needed?
These tests are a potential cause of the timeout in
https://github.com/apache/spark/actions/runs/14044733934/job/39322973616
### Does this PR introduce _any_ user-facing change?
No, test-only change.
### How was this patch tested?
Manually checked, and CI.
### Was this patch authored or co-authored using generative AI tooling?
no
Closes #50420 from zhengruifeng/py_test_timeout_x.
Authored-by: Ruifeng Zheng <[email protected]>
Signed-off-by: Ruifeng Zheng <[email protected]>
---
python/pyspark/sql/tests/connect/test_connect_session.py | 4 +++-
1 file changed, 3 insertions(+), 1 deletion(-)
diff --git a/python/pyspark/sql/tests/connect/test_connect_session.py b/python/pyspark/sql/tests/connect/test_connect_session.py
index d9d865803b5c..1857796ac9aa 100644
--- a/python/pyspark/sql/tests/connect/test_connect_session.py
+++ b/python/pyspark/sql/tests/connect/test_connect_session.py
@@ -54,6 +54,7 @@ class SparkConnectSessionTests(ReusedConnectTestCase):
def tearDown(self):
self.spark.stop()
+ @timeout(10)
def test_progress_handler(self):
handler_called = []
@@ -78,7 +79,7 @@ class SparkConnectSessionTests(ReusedConnectTestCase):
def _check_no_active_session_error(self, e: PySparkException):
self.check_error(exception=e, errorClass="NO_ACTIVE_SESSION", messageParameters=dict())
- @timeout(3)
+ @timeout(10)
def test_stop_session(self):
df = self.spark.sql("select 1 as a, 2 as b")
catalog = self.spark.catalog
@@ -226,6 +227,7 @@ class SparkConnectSessionTests(ReusedConnectTestCase):
self.assertIsNotNone(exception)
self.assertEqual(exception.getMessageParameters(), {"objectName": "`a`"})
+ @timeout(10)
def test_custom_channel_builder(self):
# Access self.spark's DefaultChannelBuilder to reuse same endpoint
endpoint = self.spark._client._builder.endpoint
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]