Thanks Feng. The catalog modification listener is only used to report
read-only DDL information to other components or systems.

> 1. Will an exception thrown by the listener affect the normal execution
> process?

Users need to handle exceptions in the listener themselves. Many DDLs,
such as DROP TABLE and ALTER TABLE, cannot be rolled back, so Flink cannot
handle these exceptions on behalf of the listener. If the listener throws
an exception, the operation will exit, but the DDL that has already been
executed will still take effect.
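
For illustration, here is a minimal sketch of such a listener that handles
its own failures. It assumes the interface shape proposed in the FLIP
(CatalogModificationListener with onEvent(CatalogModificationEvent)); the
package, the logger and the reportToExternalSystem helper are assumptions
for the example only:

import org.apache.flink.table.catalog.listener.CatalogModificationEvent;
import org.apache.flink.table.catalog.listener.CatalogModificationListener;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

/**
 * Sketch: report catalog modifications to an external metadata system and
 * swallow reporting failures so they never interrupt the client operation.
 */
public class ReportingCatalogListener implements CatalogModificationListener {

    private static final Logger LOG =
            LoggerFactory.getLogger(ReportingCatalogListener.class);

    @Override
    public void onEvent(CatalogModificationEvent event) {
        try {
            // Push the event to an external system such as DataHub or Atlas.
            reportToExternalSystem(event);
        } catch (Exception e) {
            // Handle the failure here; if it were rethrown, the client
            // operation would exit, but the executed DDL would still succeed.
            LOG.warn("Failed to report catalog modification event", e);
        }
    }

    // Hypothetical helper standing in for the user's own reporting logic.
    private void reportToExternalSystem(CatalogModificationEvent event) {
        // ...
    }
}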

> 2. What is the order of execution? Is the listener executed first or are
> specific operations executed first?  If I want to perform DDL permission
> verification(such as integrating with Ranger based on the listener) , is
> that possible?

The listener is notified to report the catalog modification only after the
DDL has succeeded, so you cannot do permission verification for DDL in the
listener. As mentioned above, Flink will not roll back the DDL even when
the listener throws an exception. I think permission verification is
another issue and can be discussed separately.
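
To make the order concrete, here is a purely conceptual sketch of the flow,
not Flink's actual internals; the Event stand-in and the method names are
made up for illustration:

import java.util.List;
import java.util.function.Consumer;

/** Conceptual sketch of the notification order; not Flink's real code. */
public class NotificationOrderSketch {

    /** Stand-in for the proposed CatalogModificationEvent. */
    public static class Event {}

    /**
     * The DDL runs first; listeners are only notified after it has
     * succeeded, so a listener cannot veto or roll back the DDL.
     */
    public static void executeDdlThenNotify(
            Runnable ddl, Event event, List<Consumer<Event>> listeners) {
        // 1. Execute the DDL. If it fails, listeners are never notified.
        ddl.run();

        // 2. Notify listeners only after success. An exception thrown here
        //    makes the client operation exit, but the DDL already took effect.
        for (Consumer<Event> listener : listeners) {
            listener.accept(event);
        }
    }
}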


Best,
Shammon FY

On Tue, May 30, 2023 at 1:07 AM Feng Jin <jinfeng1...@gmail.com> wrote:

> Hi, Shammon
>
> Thanks for driving this FLIP, [Support Customized Job Meta Data Listener]
> will make it easier for Flink to collect lineage information.
> I fully agree with the overall solution and have a small question:
>
> 1. Will an exception thrown by the listener affect the normal execution
> process?
>
> 2. What is the order of execution? Is the listener executed first or are
> specific operations executed first?  If I want to perform DDL permission
> verification(such as integrating with Ranger based on the listener) , is
> that possible?
>
>
> Best,
> Feng
>
> On Fri, May 26, 2023 at 4:09 PM Shammon FY <zjur...@gmail.com> wrote:
>
> > Hi devs,
> >
> > We would like to bring up a discussion about FLIP-294: Support Customized
> > Job Meta Data Listener[1]. We have had several discussions with Jark Wu,
> > Leonard Xu, Dong Lin, Qingsheng Ren and Poorvank about the functions and
> > interfaces, and thanks for their valuable advice.
> > The overall job and connector information is divided into metadata and
> > lineage; this FLIP focuses on metadata, and lineage will be discussed in
> > another FLIP in the future. In this FLIP we want to add a customized
> > listener in Flink to report catalog modifications to external metadata
> > systems such as datahub[2] or atlas[3]. Users can view the specific
> > information of connectors such as source and sink for Flink jobs in these
> > systems, including fields, watermarks, partitions, etc.
> >
> > Looking forward to hearing from you, thanks.
> >
> >
> > [1]
> >
> >
> https://cwiki.apache.org/confluence/display/FLINK/FLIP-294%3A+Support+Customized+Job+Meta+Data+Listener
> > [2] https://datahub.io/
> > [3] https://atlas.apache.org/#/
> >
>
