Thanks Shuyi!

I left some comments there. I think the SQL DDL design and the Flink-Hive
integration/external catalog enhancements will need to work closely with
each other. I hope we are well aligned on the direction of the two designs,
and I look forward to working with you all on both!

Bowen


On Thu, Nov 1, 2018 at 10:57 PM Shuyi Chen <suez1...@gmail.com> wrote:

> Hi everyone,
>
> SQL DDL support has been a long-standing ask from the community. Flink SQL
> currently supports only DML (e.g., SELECT and INSERT statements). In its
> current form, Flink SQL users still need to define/create table sources and
> sinks programmatically in Java/Scala. Also, without DDL support, the SQL
> Client does not allow dynamic creation of tables, types, or functions with
> SQL, which adds friction to its adoption.
>
> I drafted a design doc [1] with a few other community members that proposes
> the design and implementation for adding DDL support in Flink. The initial
> design covers DDL for tables, views, types, libraries, and functions. It
> would be great to get feedback on the design from the community, and to
> align it with the latest efforts on the unified SQL connector API [2] and
> the Flink-Hive integration [3].
>
> Any feedback is highly appreciated.
>
> Thanks
> Shuyi Chen
>
> [1]
>
> https://docs.google.com/document/d/1TTP-GCC8wSsibJaSUyFZ_5NBAHYEB1FVmPpP7RgDGBA/edit?usp=sharing
> [2]
>
> https://docs.google.com/document/d/1Yaxp1UJUFW-peGLt8EIidwKIZEWrrA-pznWLuvaH39Y/edit?usp=sharing
> [3]
>
> https://docs.google.com/document/d/1SkppRD_rE3uOKSN-LuZCqn4f7dz0zW5aa6T_hBZq5_o/edit?usp=sharing
> --
> "So you have to trust that the dots will somehow connect in your future."
>