This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-4.1
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-4.1 by this push:
     new 0fb91ff2d211 [SPARK-54269][PYTHON] Upgrade `cloudpickle` to 3.1.2 for Python 3.14
0fb91ff2d211 is described below

commit 0fb91ff2d2118b37bdbe511fed3c5320489c2f90
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Sun Nov 9 22:41:26 2025 -0800

    [SPARK-54269][PYTHON] Upgrade `cloudpickle` to 3.1.2 for Python 3.14
    
    ### What changes were proposed in this pull request?
    
    This PR aims to upgrade `cloudpickle` to 3.1.2.
    
    ### Why are the changes needed?
    
    To support Python 3.14 properly.
    - https://github.com/cloudpipe/cloudpickle/releases/tag/v3.1.2
    - https://github.com/cloudpipe/cloudpickle/blob/master/CHANGES.md#312
    
        > Fix pickling of abstract base classes containing type annotations for Python 3.14. (https://github.com/cloudpipe/cloudpickle/pull/578)
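    The excluded attribute can be sketched outside cloudpickle. Below is a minimal, illustrative helper (the name `_filter_class_state` is hypothetical; the real logic lives in cloudpickle's `_class_getstate`) showing the guarded pop that the patch below adds:

    ```python
    import sys

    def _filter_class_state(clsdict):
        """Drop entries that cannot (or should not) be pickled from a snapshot
        of a class's __dict__. Hypothetical sketch of the patched behavior."""
        clsdict = dict(clsdict)  # work on a copy of the mappingproxy
        clsdict.pop("__dict__", None)  # unpicklable property object
        if sys.version_info >= (3, 14):
            # PEP 649/749: __annotate_func__ closes over the class dict, so it
            # must be excluded; Python recreates it lazily the next time
            # __annotations__ is accessed.
            clsdict.pop("__annotate_func__", None)
        return clsdict

    class Point:
        x: int
        y: int

    state = _filter_class_state(vars(Point))
    ```

    On any Python version the filtered snapshot no longer carries the `__dict__` descriptor, and on 3.14 the deferred-annotation closure is gone as well.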
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, Python 3.14 support is not announced yet.
    
    ### How was this patch tested?
    
    Pass the CIs.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #52964 from dongjoon-hyun/SPARK-54269.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
    (cherry picked from commit 9754e1750614ab05bcf79055f3ac1d883314ebb9)
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 python/pyspark/cloudpickle/__init__.py    |  2 +-
 python/pyspark/cloudpickle/cloudpickle.py | 19 +++++++++++++------
 2 files changed, 14 insertions(+), 7 deletions(-)

diff --git a/python/pyspark/cloudpickle/__init__.py b/python/pyspark/cloudpickle/__init__.py
index bdb1738611b3..052b6e975a77 100644
--- a/python/pyspark/cloudpickle/__init__.py
+++ b/python/pyspark/cloudpickle/__init__.py
@@ -3,7 +3,7 @@ from pyspark.cloudpickle.cloudpickle import *  # noqa
 
 __doc__ = cloudpickle.__doc__
 
-__version__ = "3.1.1"
+__version__ = "3.1.2"
 
 __all__ = [  # noqa
     "__version__",
diff --git a/python/pyspark/cloudpickle/cloudpickle.py b/python/pyspark/cloudpickle/cloudpickle.py
index 4d532e5de9f2..e600b35f2842 100644
--- a/python/pyspark/cloudpickle/cloudpickle.py
+++ b/python/pyspark/cloudpickle/cloudpickle.py
@@ -783,6 +783,12 @@ def _class_getstate(obj):
 
     clsdict.pop("__dict__", None)  # unpicklable property object
 
+    if sys.version_info >= (3, 14):
+        # PEP-649/749: __annotate_func__ contains a closure that references the class
+        # dict. We need to exclude it from pickling. Python will recreate it when
+        # __annotations__ is accessed at unpickling time.
+        clsdict.pop("__annotate_func__", None)
+
     return (clsdict, {})
 
 
@@ -1190,6 +1196,10 @@ def _class_setstate(obj, state):
         for subclass in registry:
             obj.register(subclass)
 
+    # PEP-649/749: During pickling, we excluded the __annotate_func__ attribute but it
+    # will be created by Python. Subsequently, annotations will be recreated when
+    # __annotations__ is accessed.
+
     return obj
 
 
@@ -1301,12 +1311,9 @@ class Pickler(pickle.Pickler):
     def dump(self, obj):
         try:
             return super().dump(obj)
-        except RuntimeError as e:
-            if len(e.args) > 0 and "recursion" in e.args[0]:
-                msg = "Could not pickle object as excessively deep recursion required."
-                raise pickle.PicklingError(msg) from e
-            else:
-                raise
+        except RecursionError as e:
+            msg = "Could not pickle object as excessively deep recursion required."
+            raise pickle.PicklingError(msg) from e
 
     def __init__(self, file, protocol=None, buffer_callback=None):
         if protocol is None:

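
The `Pickler.dump` change above can be exercised standalone. A sketch (the subclass name `SafePickler` is illustrative, not cloudpickle's) of catching `RecursionError` directly, instead of string-matching `RuntimeError` messages, and converting it into a `PicklingError`:

```python
import io
import pickle

class SafePickler(pickle.Pickler):
    # Mirrors the upstream change: RecursionError is caught as its own type
    # rather than by inspecting RuntimeError args for the word "recursion".
    def dump(self, obj):
        try:
            return super().dump(obj)
        except RecursionError as e:
            msg = "Could not pickle object as excessively deep recursion required."
            raise pickle.PicklingError(msg) from e

# Build a list nested far deeper than any recursion limit (iteratively,
# so only pickling it triggers the deep recursion).
deep = inner = []
for _ in range(100_000):
    inner.append([])
    inner = inner[0]

try:
    SafePickler(io.BytesIO()).dump(deep)
    caught = None
except pickle.PicklingError as e:
    caught = e
```

Since `RecursionError` subclasses `RuntimeError`, the old string-matching branch also caught it, but matching on the exception type is both simpler and robust to message wording.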

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
