Re: [2.0] How to handle on-going feature development in Flink 2.0?

2024-07-05 Thread Matthias Pohl


Re: [2.0] How to handle on-going feature development in Flink 2.0?

2024-06-26 Thread Zakelly Lan
+1 for a preview before the formal release. It would help us find issues in
advance.


Best,
Zakelly

Re: [2.0] How to handle on-going feature development in Flink 2.0?

2024-06-26 Thread Jingsong Li
+1 to release a preview version.

Best,
Jingsong


Re: [2.0] How to handle on-going feature development in Flink 2.0?

2024-06-25 Thread Jark Wu
I also think this should not block new feature development.
Having "nice-to-have" and "must-have" tags on the FLIPs is a good idea.

For the downstream projects, I think we need to release a 2.0 preview
version one or two months before the formal release. This leaves some time
for the downstream projects to integrate and provide feedback, so we can
fix problems (e.g. unexpected breaking changes, Java versions) before 2.0.

Best,
Jark

On Wed, 26 Jun 2024 at 09:39, Xintong Song  wrote:

> I also don't think we should block new feature development until 2.0. From
> my understanding, the new major release is no different from the regular
> minor releases for new features.
>
> I think tracking new features, either as nice-to-have items or in a
> separate list, is necessary. It helps us understand what's going on in the
> release cycle, and what to announce and promote. Maybe we should start a
> discussion on updating the 2.0 item list, to 1) collect new items that are
> proposed / initiated after the original list being created and 2) to remove
> some items that are no longer suitable. I'll discuss this with the other
> release managers first.
>
> For the connectors and operators, I think it depends on whether they depend
> on any deprecated APIs or internal implementations of Flink. Ideally,
> all @Public APIs and @PublicEvolving APIs that we plan to change / remove
> should have been deprecated in 1.19 and 1.20 respectively. That means if
> the connectors and operators only use non-deprecated @Puclib
> and @PublicEvolving APIs in 1.20, hopefully there should not be any
> problems upgrading to 2.0.
>
> Best,
>
> Xintong
>
>
>
> On Wed, Jun 26, 2024 at 5:20 AM Becket Qin  wrote:
>
> > Thanks for the question, Matthias.
> >
> > My two cents, I don't think we are blocking new feature development. My
> > understanding is that the community will just prioritize removing
> > deprecated APIs in the 2.0 dev cycle. Because of that, it is possible
> that
> > some new feature development may slow down a little bit since some
> > contributors may be working on the must-have features for 2.0. But policy
> > wise, I don't see a reason to block the new feature development for the
> 2.0
> > release feature plan[1].
> >
> > Process wise, I like your idea of adding the new features as nice-to-have
> > in the 2.0 feature list.
> >
> > Re: David,
> > Given it is a major version bump. It is possible that some of the
> > downstream projects (e.g. connectors, Paimon, etc) will have to see if a
> > major version bump is also needed there. And it is probably going to be
> > decisions made on a per-project basis.
> > Regarding the Java version specifically, this probably worth a separate
> > discussion. According to a recent report[2] on the state of Java, it
> might
> > be a little early to drop support for Java 11. We can discuss this
> > separately.
> >
> > Thanks,
> >
> > Jiangjie (Becket) Qin
> >
> > [1] https://cwiki.apache.org/confluence/display/FLINK/2.0+Release
> > [2]
> >
> >
> https://newrelic.com/sites/default/files/2024-04/new-relic-state-of-the-java-ecosystem-report-2024-04-30.pdf
> >
> > On Tue, Jun 25, 2024 at 4:58 AM David Radley 
> > wrote:
> >
> > > Hi,
> > > I think this is a great question. I am not sure if this has been
> covered
> > > elsewhere, but it would be good to be clear how this effects the
> > connectors
> > > and operator repos, with potentially v1 and v2 oriented new featuresI
> > > suspect this will be a connector by connector investigation. I am
> > thinking
> > > connectors with Hadoop eco-system dependencies (e.g. Paimon) which may
> > not
> > > work nicely with Java 17,
> > >
> > >  Kind regards, David.
> > >
> > >
> > > From: Matthias Pohl 
> > > Date: Tuesday, 25 June 2024 at 09:57
> > > To: dev@flink.apache.org 
> > > Cc: Xintong Song , martijnvis...@apache.org <
> > > martijnvis...@apache.org>, imj...@gmail.com ,
> > > becket@gmail.com 
> > > Subject: [EXTERNAL] [2.0] How to handle on-going feature development in
> > > Flink 2.0?
> > > Hi 2.0 release managers,
> > > With the 1.20 release branch being cut [1], master is now referring to
> > > 2.0-SNAPSHOT. I remember that, initially, the community had the idea of
> > > keeping the 2.0 release as small as possible focusing on API changes
> [2].
> > >
> > > What does this mean for new features? I guess blocking them un

Re: [2.0] How to handle on-going feature development in Flink 2.0?

2024-06-25 Thread Xintong Song
I also don't think we should block new feature development until 2.0. From
my understanding, the new major release is no different from the regular
minor releases for new features.

I think tracking new features, either as nice-to-have items or in a
separate list, is necessary. It helps us understand what's going on in the
release cycle, and what to announce and promote. Maybe we should start a
discussion on updating the 2.0 item list, to 1) collect new items that are
proposed / initiated after the original list being created and 2) to remove
some items that are no longer suitable. I'll discuss this with the other
release managers first.

For the connectors and operators, I think it depends on whether they depend
on any deprecated APIs or internal implementations of Flink. Ideally,
all @Public and @PublicEvolving APIs that we plan to change / remove
should have been deprecated in 1.19 and 1.20, respectively. That means if
the connectors and operators only use non-deprecated @Public
and @PublicEvolving APIs in 1.20, hopefully there should not be any
problems upgrading to 2.0.
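As an editorial aside, the deprecate-then-remove path Xintong describes can be sketched in a few lines. Everything below is hypothetical for illustration: the @Public annotation is a simplified stand-in for Flink's real stability annotation (which lives in flink-annotations), and ExecutionConfigSketch with its accessor methods is an invented example, not an actual Flink class.

```java
// Simplified stand-in for Flink's stability annotation; the real
// org.apache.flink.annotation.Public carries retention metadata.
@interface Public {}

@Public
class ExecutionConfigSketch {
    private final long checkpointInterval = 60_000L;

    /**
     * Hypothetical old accessor, deprecated in a 1.x release so that
     * connectors compiling against 1.20 see the warning before the
     * method is removed in 2.0.
     *
     * @deprecated use {@link #getCheckpointIntervalMillis()} instead
     */
    @Deprecated
    long getInterval() {
        // Delegate to the replacement so both paths stay in sync
        // for as long as the old API exists.
        return getCheckpointIntervalMillis();
    }

    /** Replacement API that survives into 2.0. */
    long getCheckpointIntervalMillis() {
        return checkpointInterval;
    }
}

class DeprecationDemo {
    public static void main(String[] args) {
        ExecutionConfigSketch cfg = new ExecutionConfigSketch();
        // While both methods exist, they must agree.
        System.out.println(cfg.getInterval() == cfg.getCheckpointIntervalMillis()); // prints "true"
    }
}
```

A downstream project that only calls getCheckpointIntervalMillis() in 1.20 then compiles unchanged against 2.0, which is exactly the upgrade property described above.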

Best,

Xintong





Re: [2.0] How to handle on-going feature development in Flink 2.0?

2024-06-25 Thread Becket Qin
Thanks for the question, Matthias.

My two cents: I don't think we are blocking new feature development. My
understanding is that the community will simply prioritize removing
deprecated APIs in the 2.0 dev cycle. Because of that, some new feature
development may slow down a little, since some contributors will be working
on the must-have features for 2.0. But policy-wise, I don't see a reason to
block new feature development because of the 2.0 release feature plan[1].

Process-wise, I like your idea of adding the new features as nice-to-have
in the 2.0 feature list.

Re: David,
Given that this is a major version bump, it is possible that some of the
downstream projects (e.g. connectors, Paimon, etc.) will have to see whether
a major version bump is also needed there, and those are probably going to
be decisions made on a per-project basis.
Regarding the Java version specifically, this is probably worth a separate
discussion. According to a recent report[2] on the state of Java, it might
be a little early to drop support for Java 11. We can discuss this
separately.

Thanks,

Jiangjie (Becket) Qin

[1] https://cwiki.apache.org/confluence/display/FLINK/2.0+Release
[2]
https://newrelic.com/sites/default/files/2024-04/new-relic-state-of-the-java-ecosystem-report-2024-04-30.pdf



Re: [2.0] How to handle on-going feature development in Flink 2.0?

2024-06-25 Thread David Radley
Hi,
I think this is a great question. I am not sure if this has been covered
elsewhere, but it would be good to be clear how this affects the connector
and operator repos, with potentially v1- and v2-oriented new features. I
suspect this will be a connector-by-connector investigation. I am thinking
of connectors with Hadoop eco-system dependencies (e.g. Paimon), which may
not work nicely with Java 17.

 Kind regards, David.
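As an editorial aside, a connector worried about this kind of mismatch can at least fail fast with a runtime guard. This is a hedged sketch, not an existing Flink or connector utility: JavaVersionGuard and the required-version constant are invented for illustration. Runtime.version() has existed since Java 9, and the parsing below assumes the Java 9+ version-string scheme (it would misread legacy strings like "1.8.0_392").

```java
// Hypothetical guard a connector might use to detect a JVM older than
// what its Hadoop-based dependencies were validated against.
class JavaVersionGuard {

    /**
     * Extracts the feature (major) version from Java 9+ style version
     * strings such as "17.0.9", "11", or "21+35".
     */
    static int featureVersion(String versionString) {
        String head = versionString.split("[._+-]")[0];
        return Integer.parseInt(head);
    }

    /** True when the running feature version meets the requirement. */
    static boolean supports(int required, int actual) {
        return actual >= required;
    }

    public static void main(String[] args) {
        // Runtime.version() reports the JVM actually executing us.
        int running = Runtime.version().feature();
        System.out.println("Running on Java " + running + "; Java 17 features "
                + (supports(17, running) ? "usable" : "unavailable"));
    }
}
```

The same check could run once at connector startup and produce a clear error message instead of a late, obscure NoClassDefFoundError from a dependency compiled for a newer bytecode level.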


From: Matthias Pohl 
Date: Tuesday, 25 June 2024 at 09:57
To: dev@flink.apache.org 
Cc: Xintong Song , martijnvis...@apache.org 
, imj...@gmail.com , 
becket@gmail.com 
Subject: [EXTERNAL] [2.0] How to handle on-going feature development in Flink 
2.0?
Hi 2.0 release managers,
With the 1.20 release branch being cut [1], master is now referring to
2.0-SNAPSHOT. I remember that, initially, the community had the idea of
keeping the 2.0 release as small as possible focusing on API changes [2].

What does this mean for new features? I guess blocking them until 2.0 is
released is not a good option. Shall we treat new features as
"nice-to-have" items as documented in the 2.0 release overview [3] and
merge them to master like it was done for minor releases in the past? Do
you want to add a separate section in the 2.0 release overview [3] to list
these new features (e.g. FLIP-461 [4]) separately? That might help to
manage planned 2.0 deprecations/API removal and new features separately. Or
do you have a different process in mind?

Apologies if this was already discussed somewhere. I didn't manage to find
anything related to this topic.

Best,
Matthias

[1] https://lists.apache.org/thread/mwnfd7o10xo6ynx0n640pw9v2opbkm8l
[2] https://lists.apache.org/thread/b8w5cx0qqbwzzklyn5xxf54vw9ymys1c
[3] https://cwiki.apache.org/confluence/display/FLINK/2.0+Release
[4]
https://cwiki.apache.org/confluence/display/FLINK/FLIP-461%3A+Synchronize+rescaling+with+checkpoint+creation+to+minimize+reprocessing+for+the+AdaptiveScheduler

Unless otherwise stated above:

IBM United Kingdom Limited
Registered in England and Wales with number 741598
Registered office: PO Box 41, North Harbour, Portsmouth, Hants. PO6 3AU


[2.0] How to handle on-going feature development in Flink 2.0?

2024-06-25 Thread Matthias Pohl
Hi 2.0 release managers,
With the 1.20 release branch being cut [1], master now refers to
2.0-SNAPSHOT. I remember that, initially, the community had the idea of
keeping the 2.0 release as small as possible, focusing on API changes [2].

What does this mean for new features? I guess blocking them until 2.0 is
released is not a good option. Shall we treat new features as
"nice-to-have" items, as documented in the 2.0 release overview [3], and
merge them to master as was done for minor releases in the past? Do
you want to add a separate section in the 2.0 release overview [3] to list
these new features (e.g. FLIP-461 [4]) separately? That might help to
manage planned 2.0 deprecations/API removals and new features separately. Or
do you have a different process in mind?

Apologies if this was already discussed somewhere. I didn't manage to find
anything related to this topic.

Best,
Matthias

[1] https://lists.apache.org/thread/mwnfd7o10xo6ynx0n640pw9v2opbkm8l
[2] https://lists.apache.org/thread/b8w5cx0qqbwzzklyn5xxf54vw9ymys1c
[3] https://cwiki.apache.org/confluence/display/FLINK/2.0+Release
[4]
https://cwiki.apache.org/confluence/display/FLINK/FLIP-461%3A+Synchronize+rescaling+with+checkpoint+creation+to+minimize+reprocessing+for+the+AdaptiveScheduler