This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new c9df53f5d3ce [SPARK-46241][PYTHON][CONNECT] Fix error handling routine so it wouldn't fall into infinite recursion
c9df53f5d3ce is described below

commit c9df53f5d3ce7b286c0e314e51eb6e9612dd450f
Author: Alice Sayutina <alice.sayut...@databricks.com>
AuthorDate: Wed Dec 6 08:40:28 2023 +0900

    [SPARK-46241][PYTHON][CONNECT] Fix error handling routine so it wouldn't fall into infinite recursion
    
    ### What changes were proposed in this pull request?
    
    Remove `_display_server_stack_trace` and always display the error stack trace if we have one.
    
    ### Why are the changes needed?
    
    There is a codepath that can make the existing error handling fall into infinite recursion. For example, consider the following codepath:
    
    `[Some error happens] -> _handle_error -> _handle_rpc_error -> _display_server_stack_trace -> RuntimeConf.get -> SparkConnectClient.config -> [An error happens] -> _handle_error`.
    
    There can be other similar codepaths.
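    The cycle above can be reproduced with a minimal, hypothetical sketch (the class and method names below are illustrative, not the actual `pyspark.sql.connect` API): the error handler consults a server conf, the conf read is itself an RPC, and when the connection is broken that RPC fails back into the same handler.

```python
class BrokenClient:
    """Illustrative stand-in for a client whose connection is down."""

    def config(self, key):
        try:
            # The connection is broken, so every RPC fails ...
            raise ConnectionError("rpc failed")
        except ConnectionError as e:
            # ... and each failure routes back into the error handler.
            self._handle_error(e)

    def _handle_error(self, error):
        # Pre-fix behavior: decide whether to show the server stack trace
        # by reading a conf -- but that conf read performs another RPC,
        # closing the loop.
        self.config("spark.sql.connect.serverStacktrace.enabled")

try:
    BrokenClient()._handle_error(ConnectionError("initial failure"))
except RecursionError:
    print("fell into infinite recursion")
```

    The fix breaks this loop by removing the conf read from the handler and unconditionally displaying the stack trace.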
    
    ### Does this PR introduce _any_ user-facing change?
    
    Gets rid of occasional infinite recursion in error handling (which can degrade the user experience).
    
    ### How was this patch tested?
    N/A
    
    ### Was this patch authored or co-authored using generative AI tooling?
    No
    
    Closes #44144 from cdkrot/forbid_recursive_error_handling.
    
    Authored-by: Alice Sayutina <alice.sayut...@databricks.com>
    Signed-off-by: Hyukjin Kwon <gurwls...@apache.org>
---
 python/pyspark/sql/connect/client/core.py | 16 +---------------
 1 file changed, 1 insertion(+), 15 deletions(-)

diff --git a/python/pyspark/sql/connect/client/core.py b/python/pyspark/sql/connect/client/core.py
index e36b7d74a787..0b502494f781 100644
--- a/python/pyspark/sql/connect/client/core.py
+++ b/python/pyspark/sql/connect/client/core.py
@@ -1520,20 +1520,6 @@ class SparkConnectClient(object):
         except grpc.RpcError:
             return None
 
-    def _display_server_stack_trace(self) -> bool:
-        from pyspark.sql.connect.conf import RuntimeConf
-
-        conf = RuntimeConf(self)
-        try:
-            if conf.get("spark.sql.connect.serverStacktrace.enabled") == "true":
-                return True
-            return conf.get("spark.sql.pyspark.jvmStacktrace.enabled") == "true"
-        except Exception as e:  # noqa: F841
-            # Falls back to true if an exception occurs during reading the config.
-            # Otherwise, it will recursively try to get the conf when it consistently
-            # fails, ending up with `RecursionError`.
-            return True
-
     def _handle_rpc_error(self, rpc_error: grpc.RpcError) -> NoReturn:
         """
        Error handling helper for dealing with GRPC Errors. On the server side, certain
@@ -1567,7 +1553,7 @@ class SparkConnectClient(object):
                         info,
                         status.message,
                         self._fetch_enriched_error(info),
-                        self._display_server_stack_trace(),
+                        True,
                     ) from None
 
             raise SparkConnectGrpcException(status.message) from None

