Github user holdenk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20373#discussion_r163437299
  
    --- Diff: python/pyspark/cloudpickle.py ---
    @@ -344,24 +367,21 @@ def save_function(self, obj, name=None):
                     return self.save_global(obj, name)
     
             # a builtin_function_or_method which comes in as an attribute of some
    -        # object (e.g., object.__new__, itertools.chain.from_iterable) will end
    +        # object (e.g., itertools.chain.from_iterable) will end
             # up with modname "__main__" and so end up here. But these functions
             # have no __code__ attribute in CPython, so the handling for
             # user-defined functions below will fail.
             # So we pickle them here using save_reduce; have to do it differently
             # for different python versions.
             if not hasattr(obj, '__code__'):
                 if PY3:
    -                if sys.version_info < (3, 4):
    -                    raise pickle.PicklingError("Can't pickle %r" % obj)
    -                else:
    -                    rv = obj.__reduce_ex__(self.proto)
    +                rv = obj.__reduce_ex__(self.proto)
    --- End diff ---
    
    Looks reasonable since we only support Python 3.4+ anyway.
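    
    For context, a quick sketch of why `__reduce_ex__` alone is enough for these objects on 3.4+. This is only an illustration using the stdlib pickler (not the cloudpickle path itself), and protocol 2 is picked arbitrarily for the `__reduce_ex__` call:
    
    ```python
    import itertools
    import pickle
    
    fn = itertools.chain.from_iterable
    print(hasattr(fn, '__code__'))   # False: not a user-defined function,
                                     # so there is no __code__ to serialize
    
    # On 3.4+ the object's own __reduce_ex__ yields a usable reduction,
    # something like (getattr, (itertools.chain, 'from_iterable')), which is
    # what save_function forwards to save_reduce.
    print(fn.__reduce_ex__(2))
    
    # On recent CPython 3 the stdlib pickler can round-trip this directly.
    restored = pickle.loads(pickle.dumps(fn))
    print(list(restored([[1, 2], [3]])))   # [1, 2, 3]
    ```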


---
