Github user zero323 commented on the issue:
https://github.com/apache/spark/pull/16533
@rdblue Unified `udf` which supports all cases (`udf(f)`, decorator with
parentheses and without, decorator with `returnType`) could be implemented like
this:
```python
def _udf(f, returnType=StringType()):
    """Creates a :class:`Column` expression representing a user defined
    function (UDF).

    .. note:: The user-defined functions must be deterministic. Due to
        optimization, duplicate invocations may be eliminated or the
        function may even be invoked more times than it is present in
        the query.

    :param f: python function
    :param returnType: a :class:`pyspark.sql.types.DataType` object

    >>> from pyspark.sql.types import IntegerType
    >>> slen = udf(lambda s: len(s), IntegerType())
    >>> df.select(slen(df.name).alias('slen')).collect()
    [Row(slen=5), Row(slen=3)]

    >>> @udf
    ... def to_upper(s):
    ...     if s is not None:
    ...         return s.upper()
    ...

    >>> @udf(returnType=IntegerType())
    ... def add_one(x):
    ...     if x is not None:
    ...         return x + 1
    ...
    """
    return UserDefinedFunction(f, returnType)


@functools.wraps(_udf)
def udf(f=None, **kwargs):
    """A decorator version of pyspark.sql.functions.udf"""
    returnType = kwargs.get("returnType", StringType())
    if f is None:
        return functools.partial(_udf, returnType=returnType)
    else:
        return _udf(f=f, returnType=returnType)
```
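The dispatch logic can be exercised outside Spark. Below is a minimal, self-contained sketch of the same pattern, using a hypothetical `FakeUDF` class as a stand-in for `UserDefinedFunction` (names other than `udf`/`_udf` are illustrative, not from pyspark):

```python
import functools

# Hypothetical stand-in for pyspark's UserDefinedFunction, only to
# demonstrate the dispatch logic without a Spark dependency.
class FakeUDF(object):
    def __init__(self, func, returnType):
        self.func = func
        self.returnType = returnType

    def __call__(self, *args):
        return self.func(*args)

def _udf(f, returnType="string"):
    return FakeUDF(f, returnType)

@functools.wraps(_udf)
def udf(f=None, **kwargs):
    returnType = kwargs.get("returnType", "string")
    if f is None:
        # Called as @udf(returnType=...): return a decorator.
        return functools.partial(_udf, returnType=returnType)
    # Called as udf(f) or as a bare @udf decorator.
    return _udf(f, returnType=returnType)

# All three call styles resolve to a FakeUDF:
plain = udf(lambda s: len(s))      # udf(f)

@udf                               # bare decorator
def to_upper(s):
    return s.upper()

@udf(returnType="int")             # decorator with returnType
def add_one(x):
    return x + 1
```

All three objects wrap the original function and carry a `returnType`, which is the property the unified `udf` is after.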
We can use `functools.wraps` to update the docstring. There are two problems
with this design:
- [Once
again](https://github.com/apache/spark/pull/16534#issuecomment-271735937), in
legacy Python we won't get a meaningful argument list.
- It will break pre-existing code which considers `returnType` to be a
positional argument (including a bunch of tests).
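For reference, the argument-list issue is specific to legacy Python: in Python 3, `functools.wraps` sets `__wrapped__`, so `inspect.signature` recovers the original parameters, whereas Python 2's `inspect.getargspec` only ever sees the wrapper's `(f=None, **kwargs)`. A minimal sketch (Python 3, with a trivial stand-in body):

```python
import functools
import inspect

def _udf(f, returnType="string"):
    return f

@functools.wraps(_udf)
def udf(f=None, **kwargs):
    return f

# inspect.signature follows __wrapped__ back to _udf,
# so the original parameter list is visible again.
sig = str(inspect.signature(udf))
```

Here `sig` shows `_udf`'s parameters rather than `(f=None, **kwargs)`; there is no equivalent mechanism in Python 2's `getargspec`.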
Do you think this is an acceptable trade-off?