Thanks, Anton. Thank you, Wenchen, Dongjoon, Ryan, Serge, Allison, and
anyone else in the discussion I may have missed.

I believe we have reached a consensus on the design, or are close to one.

If you have any further comments, please let us know.

If not, I will start a vote in a few days.

Thank you.

On Thu, May 9, 2024 at 6:12 PM Anton Okolnychyi <aokolnyc...@gmail.com> wrote:
>
> Thanks to everyone who commented on the design doc. I updated the proposal 
> and it is ready for another look. I hope we can converge and move forward 
> with this effort!
>
> - Anton
>
> On Fri, Apr 19, 2024 at 15:54 Anton Okolnychyi <aokolnyc...@gmail.com> wrote:
>>
>> Hi folks,
>>
>> I'd like to start a discussion on SPARK-44167 that aims to enable catalogs 
>> to expose custom routines as stored procedures. I believe this functionality 
>> will enhance Spark’s ability to interact with external connectors and allow 
>> users to perform more operations in plain SQL.
>>
>> SPIP [1] contains proposed API changes and parser extensions. Any feedback 
>> is more than welcome!
>>
>> Unlike the initial proposal for stored procedures with Python [2], this one 
>> focuses on exposing pre-defined stored procedures via the catalog API. This 
>> approach is inspired by a similar functionality in Trino and avoids the 
>> challenges of supporting user-defined routines discussed earlier [3].
>>
>> Liang-Chi was kind enough to shepherd this effort. Thanks!
>>
>> - Anton
>>
>> [1] - 
>> https://docs.google.com/document/d/1rDcggNl9YNcBECsfgPcoOecHXYZOu29QYFrloo2lPBg/
>> [2] - 
>> https://docs.google.com/document/d/1ce2EZrf2BxHu7TjfGn4TgToK3TBYYzRkmsIVcfmkNzE/
>> [3] - https://lists.apache.org/thread/lkjm9r7rx7358xxn2z8yof4wdknpzg3l
>>
>>
>>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org