Hi,

If the underlying table changes (DDL), then — if I recall correctly from
RDBMSs such as Oracle — the stored procedure is invalidated because it is a
compiled object. How will this be handled here? Does it follow the same
mechanism?
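
For reference, the Oracle behaviour I have in mind looks roughly like this
(a sketch from memory; exact behaviour depends on the Oracle version and the
kind of DDL):

```sql
-- Oracle tracks dependencies of compiled objects on tables.
CREATE TABLE t (id NUMBER);

CREATE OR REPLACE PROCEDURE read_t AS
  n NUMBER;
BEGIN
  SELECT COUNT(*) INTO n FROM t;
END;
/

-- DDL on the dependency invalidates the procedure.
ALTER TABLE t ADD (name VARCHAR2(100));

-- The procedure is now flagged INVALID and is recompiled
-- automatically on its next invocation.
SELECT object_name, status FROM user_objects
WHERE object_name = 'READ_T';
```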

Thanks

Mich Talebzadeh,
Technologist | Architect | Data Engineer | Generative AI | FinCrime
London
United Kingdom


   View my LinkedIn profile
<https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>


 https://en.everybodywiki.com/Mich_Talebzadeh



*Disclaimer:* The information provided is correct to the best of my
knowledge but of course cannot be guaranteed. It is essential to note that,
as with any advice: "one test result is worth one-thousand expert opinions"
(Wernher von Braun <https://en.wikipedia.org/wiki/Wernher_von_Braun>).


On Sat, 20 Apr 2024 at 02:34, Anton Okolnychyi <aokolnyc...@gmail.com>
wrote:

> Hi folks,
>
> I'd like to start a discussion on SPARK-44167 that aims to enable catalogs
> to expose custom routines as stored procedures. I believe this
> functionality will enhance Spark’s ability to interact with external
> connectors and allow users to perform more operations in plain SQL.
>
> SPIP [1] contains proposed API changes and parser extensions. Any feedback
> is more than welcome!
>
> Unlike the initial proposal for stored procedures with Python [2], this
> one focuses on exposing pre-defined stored procedures via the catalog API.
> This approach is inspired by a similar functionality in Trino and avoids
> the challenges of supporting user-defined routines discussed earlier [3].
>
> Liang-Chi was kind enough to shepherd this effort. Thanks!
>
> - Anton
>
> [1] -
> https://docs.google.com/document/d/1rDcggNl9YNcBECsfgPcoOecHXYZOu29QYFrloo2lPBg/
> [2] -
> https://docs.google.com/document/d/1ce2EZrf2BxHu7TjfGn4TgToK3TBYYzRkmsIVcfmkNzE/
> [3] - https://lists.apache.org/thread/lkjm9r7rx7358xxn2z8yof4wdknpzg3l
>
