[ 
https://issues.apache.org/jira/browse/SPARK-45987?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-45987.
-----------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 43885
[https://github.com/apache/spark/pull/43885]

> Fix `pyspark.sql.tests.connect.test_connect_basic` in Python 3.11
> -----------------------------------------------------------------
>
>                 Key: SPARK-45987
>                 URL: https://issues.apache.org/jira/browse/SPARK-45987
>             Project: Spark
>          Issue Type: Sub-task
>          Components: PySpark
>    Affects Versions: 4.0.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Major
>             Fix For: 4.0.0
>
>
> https://github.com/apache/spark/actions/runs/6914662405/job/18812759788
> {code}
> ======================================================================
> ERROR [3.529s]: test_recursion_handling_for_plan_logging (pyspark.sql.tests.connect.test_connect_basic.SparkConnectBasicTests.test_recursion_handling_for_plan_logging)
> SPARK-45852 - Test that we can handle recursion in plan logging.
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "/__w/spark/spark/python/pyspark/sql/tests/connect/test_connect_basic.py", line 171, in test_recursion_handling_for_plan_logging
>     self.assertIsNotNone(cdf.schema)
>                          ^^^^^^^^^^
>   File "/__w/spark/spark/python/pyspark/sql/connect/dataframe.py", line 1735, in schema
>     return self._session.client.schema(query)
>            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>   File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 924, in schema
>     schema = self._analyze(method="schema", plan=plan).schema
>              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>   File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1110, in _analyze
>     self._handle_error(error)
>   File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1499, in _handle_error
>     self._handle_rpc_error(error)
>   File "/__w/spark/spark/python/pyspark/sql/connect/client/core.py", line 1570, in _handle_rpc_error
>     raise SparkConnectGrpcException(str(rpc_error)) from None
> pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
>       status = StatusCode.INTERNAL
>       details = "Exception serializing request!"
>       debug_error_string = "None"
> >
> ----------------------------------------------------------------------
> Ran 141 tests in 86.259s
> FAILED (errors=1)
> {code}
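For context, the failing test (added under SPARK-45852) exercises cycle handling when a Connect plan tree is rendered for logging: without cycle detection, traversing a self-referential plan recurses until it hits Python's recursion limit. The following is a minimal, hypothetical sketch of cycle-safe tree rendering, using generic `Plan`/`describe` names; it is not Spark's actual implementation.

```python
class Plan:
    """A toy plan node; `children` may (accidentally) form a cycle."""

    def __init__(self, name):
        self.name = name
        self.children = []


def describe(plan, _seen=None):
    """Render a plan tree, replacing already-visited nodes with a marker.

    Tracking visited node ids keeps the traversal finite even when the
    tree contains a cycle (or a shared subtree revisited via a DAG).
    """
    if _seen is None:
        _seen = set()
    if id(plan) in _seen:
        return "<recursive>"
    _seen.add(id(plan))
    inner = ", ".join(describe(child, _seen) for child in plan.children)
    return f"{plan.name}({inner})" if inner else plan.name
```

With this guard, a two-node cycle `a -> b -> a` renders as a finite string instead of raising `RecursionError`.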



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
