Github user zero323 commented on the issue:

    https://github.com/apache/spark/pull/16533

@rdblue Good question. I vaguely remember I had some motivation to avoid this, but now I cannot recall why. In general I really liked the initial approach because it allowed us to write:

```python
@udf(IntegerType())
def identity(x):
    return x
```

instead of

```python
@udf(returnType=IntegerType())
def identity(x):
    return x
```

There is another trick, used by some libraries that depend heavily on decorators:

```python
@functools.wraps(_udf)
def udf(f=None, returnType=StringType()):
    """A decorator version of pyspark.sql.functions.udf"""
    if f is None or isinstance(f, DataType):
        # If a DataType is passed positionally, treat it as the return type.
        return functools.partial(_udf, returnType=f or returnType)
    else:
        return _udf(f=f, returnType=returnType)
```

but it smells. I think that:

```python
@functools.wraps(_udf)
def udf(f=None, returnType=StringType()):
    """A decorator version of pyspark.sql.functions.udf"""
    if isinstance(f, DataType):
        raise TypeError(
            "returnType with decorator should be provided as a keyword argument")
    if f is None:
        return functools.partial(_udf, returnType=returnType)
    else:
        return _udf(f=f, returnType=returnType)
```

could be an acceptable trade-off. It doesn't break the current API and clearly communicates possible issues.
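For anyone following along without a Spark checkout, here is a minimal, self-contained sketch of the second variant. The `DataType` subclasses and the `_udf` constructor are hypothetical stand-ins (the real ones live in `pyspark.sql.types` and `pyspark.sql.functions`); the point is only to show the dual-use decorator behavior — bare use, keyword `returnType`, and the `TypeError` on a positional type:

```python
import functools

# Hypothetical stand-ins for pyspark.sql.types classes, so the
# pattern can be exercised without a Spark installation.
class DataType: ...
class StringType(DataType): ...
class IntegerType(DataType): ...

def _udf(f, returnType=StringType()):
    # Stand-in for the real UDF constructor: just tag the function
    # with its declared return type and hand it back.
    f.returnType = returnType
    return f

def udf(f=None, returnType=StringType()):
    """A decorator version of a udf constructor.

    Accepts either a function (bare ``@udf``) or a keyword-only
    ``returnType`` (``@udf(returnType=...)``); rejects a positional
    DataType with a clear error.
    """
    if isinstance(f, DataType):
        raise TypeError(
            "returnType with decorator should be provided as a keyword argument")
    if f is None:
        # Called as @udf(returnType=...): return a decorator.
        return functools.partial(_udf, returnType=returnType)
    # Called as bare @udf: decorate directly with the default type.
    return _udf(f=f, returnType=returnType)

@udf(returnType=IntegerType())
def identity(x):
    return x
```

With this, `identity(1)` returns `1` and `identity.returnType` is an `IntegerType`, while `@udf(IntegerType())` fails immediately with the `TypeError` above instead of silently ignoring the type.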