[ https://issues.apache.org/jira/browse/SPARK-21432?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
holdenk reassigned SPARK-21432:
-------------------------------
Assignee: Hyukjin Kwon

> Reviving broken partial functions in UDF in PySpark
> ---------------------------------------------------
>
>                 Key: SPARK-21432
>                 URL: https://issues.apache.org/jira/browse/SPARK-21432
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.3.0
>            Reporter: Hyukjin Kwon
>            Assignee: Hyukjin Kwon
>             Fix For: 2.3.0
>
> This is related to SPARK-21394.
> We also happened to break partial function support in UDF.
> Spark 2.1:
> {code}
> >>> from pyspark.sql import functions
> >>> from functools import partial
> >>>
> >>> partial_func = partial(lambda x: x, x=1)
> >>> udf = functions.udf(partial_func)
> >>> spark.range(1).select(udf()).show()
> +---------+
> |partial()|
> +---------+
> |        1|
> +---------+
> {code}
> master:
> {code}
> >>> from pyspark.sql import functions
> >>> from functools import partial
> >>>
> >>> partial_func = partial(lambda x: x, x=1)
> >>> udf = functions.udf(partial_func)
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File ".../spark/python/pyspark/sql/functions.py", line 2154, in udf
>     return _udf(f=f, returnType=returnType)
>   File ".../spark/python/pyspark/sql/functions.py", line 2145, in _udf
>     return udf_obj._wrapped()
>   File ".../spark/python/pyspark/sql/functions.py", line 2099, in _wrapped
>     @functools.wraps(self.func, assigned=assignments)
>   File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/functools.py", line 33, in update_wrapper
>     setattr(wrapper, attr, getattr(wrapped, attr))
> AttributeError: 'functools.partial' object has no attribute '__module__'
> {code}

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
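The traceback shows the root cause: {{functools.wraps}} copies metadata attributes such as {{\_\_module\_\_}} and {{\_\_name\_\_}} from the wrapped callable, but {{functools.partial}} objects do not carry all of them, so {{getattr}} raises {{AttributeError}}. Below is a minimal sketch of one way to guard the wrapping, copying only the attributes the callable actually has; the names {{safe_wraps}} and {{wrapper}} are illustrative, not necessarily what the actual patch uses:

{code}
import functools
from functools import partial

def safe_wraps(func):
    # functools.partial objects lack __name__ (and, on Python 2,
    # __module__), so restrict the copied attributes to those that exist.
    assignments = tuple(
        a for a in functools.WRAPPER_ASSIGNMENTS if hasattr(func, a)
    )
    return functools.wraps(func, assigned=assignments)

partial_func = partial(lambda x: x, x=1)

@safe_wraps(partial_func)
def wrapper(*args, **kwargs):
    return partial_func(*args, **kwargs)

print(wrapper())  # prints 1 instead of raising AttributeError
{code}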