Hi,

Many thanks to @Timo for initiating the discussion!

I would also +1 for providing some information to users via annotations
or documentation in advance, so that users are not surprised when we
actually remove the legacy code.
If we eventually decide to change a functionality that users can notice,
perhaps one prerequisite is that Flink has already provided a replacement
for it, so that users can migrate their applications easily. We might then
also consider having a dedicated documentation page that lists the
functionalities to be changed and explains how users can do the migration.

To decide whether to remove some legacy code, we might also consider
running a survey, like the one we did for Mesos support [1], to see how
the functionality is actually used.

Best,
 Yun


[1] 
https://lists.apache.org/thread.html/r139b11190a6d1f09c9e44d5fa985fd8d310347e66d2324ec1f0c2d87%40%3Cuser.flink.apache.org%3E



 ------------------ Original Mail ------------------
Sender: Piotr Nowojski <pnowoj...@apache.org>
Send Date: Mon Jan 18 18:23:36 2021
Recipients: dev <dev@flink.apache.org>
Subject: Re: [DISCUSS] Dealing with deprecated and legacy code in Flink
Hi Timo,

Thanks for starting this discussion. I'm not sure how we should approach
this topic and what should be our final recommendation, but definitely
clearing up a couple of things would be helpful.

For starters, I agree it would be good to have some more information,
besides just "@Deprecated" annotations. Is it possible to extend the
annotations with information like:
- from which version it was deprecated
- when it is planned to be removed (we could always mark `2.0` as "never"
;) )
- maybe add some pre/post-release step of verifying that the removal has
actually happened?
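
As a rough illustration of the idea (purely a sketch, nothing like this
exists in Flink today, and all names below are made up), such an extended
annotation could look roughly like this:

import java.lang.annotation.Documented;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

/**
 * Hypothetical annotation recording when an API was deprecated and when
 * its removal is planned. Illustrative only, not existing Flink code.
 */
@Documented
@Retention(RetentionPolicy.RUNTIME)
public @interface DeprecatedSince {

    /** Flink version in which the API was deprecated, e.g. "1.12". */
    String since();

    /** Version in which removal is planned; "2.0" could stand for "never". */
    String plannedRemoval() default "2.0";

    /** Fully qualified name of the replacement API, if one exists. */
    String replacement() default "";
}

A "replacement" attribute like this would also make it easy to point users
to a migration path directly in the code and in the generated docs.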

On the other hand, I think it's very important to maintain backward
compatibility with Flink as much as possible. As a developer I don't
like dealing with this, but as a user I hate dealing with incompatible
upgrades even more. So all in all, I would be in favour of putting more
effort not into deprecating and removing APIs, but into making sure that
they are stable.

Stephan Ewen also raised a point some time ago that, in the recent past, we
have developed a habit of marking everything as `@Experimental` or
`@PublicEvolving` and leaving it that way forever. Maybe we should also
include deadlines (2 releases since introduction?) for changing
`@Experimental`/`@PublicEvolving` into `@Public` in these kinds of
guidelines/automated checks?
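
To make the "automated checks" part a bit more concrete: assuming the
stability annotations carried a "since" version (which they do not today),
a release-time check could be as simple as the following sketch (class and
method names are hypothetical):

public final class ApiStabilityCheck {

    /** Number of minor releases after which @PublicEvolving should be revisited. */
    private static final int GRACE_PERIOD_MINOR_RELEASES = 2;

    /** Returns true if the annotated API has exceeded its grace period. */
    static boolean isOverdue(String sinceVersion, String currentVersion) {
        return minorOf(currentVersion) - minorOf(sinceVersion) > GRACE_PERIOD_MINOR_RELEASES;
    }

    private static int minorOf(String version) {
        // Expects versions of the form "major.minor", e.g. "1.12".
        return Integer.parseInt(version.split("\\.")[1]);
    }

    public static void main(String[] args) {
        // Example: an interface marked @PublicEvolving since 1.10, current release 1.13.
        System.out.println(isOverdue("1.10", "1.13")); // prints "true" -> time to promote or change it
    }
}

Of course a real check would scan the classpath for the annotations instead
of hard-coding versions, but the deadline logic itself stays this simple.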

Piotrek

Fri, 15 Jan 2021 at 13:56 Timo Walther <twal...@apache.org> wrote:

> Hi everyone,
>
> I would like to start a discussion how we treat deprecated and legacy
> code in Flink in the future. During the last years, our code base has
> grown quite a bit and a couple of interfaces and components have been
> reworked on the way.
>
> I'm sure each component has a few legacy parts that are waiting for
> removal. Apart from keeping outdated APIs around for a couple of releases
> until users have updated their code, it is also often easier to just put
> a @Deprecated annotation and postpone the actual change.
>
> When looking at the current code, we have duplicated SQL planners,
> duplicated APIs (DataSet/DataStream), duplicated source/sink interfaces,
> outdated connectors (Elasticsearch 5?) and dependencies (Scala 2.11?).
>
> I'm wondering whether we should come up with some legacy/deprecation
> guidelines for the future.
>
> Some examples:
>
> - I could imagine new Flink-specific annotations for documenting (in
> code) in which version an interface was deprecated and when the planned
> removal should take place.
> - Or guidelines that we drop a connector when the external project does
> not maintain the version for 6 months etc.
>
> Predictable removal dates should also help users not to be surprised when
> a connector or Scala version is no longer supported.
>
> What do you think? I'm very happy to hear more opinions.
>
> Regards,
> Timo
>
