GitHub user zero323 commented on the issue:

    https://github.com/apache/spark/pull/17831
  
    @gatorsmile This sounds reasonable, but I am not sure I fully understand your concerns.
    
    If anything, this brings PySpark closer to the Scala API. At the moment we have
    
    ```
    registerFunction(self, name: str, f: Callable[[T], U], returnType: DataType) -> None: ...
    ```
    
    and we would move to:
    
    ```
    registerFunction(self, name: str, f: Callable[[T], U], returnType: DataType) -> Callable[[Column, ...], Column]: ...
    ```
    
    This, as pointed out by @holdenk, matches the `register` API for `Function0` .. `Function22`.
    
    If you're planning breaking changes in the Scala API, that may render this PR obsolete, but we don't commit here to any particular implementation. The only promise is that registering a UDF for SQL applications returns an object which can be used with the `DataFrame` API. I believe that is a reasonable requirement for any upcoming API.

