Hi Martijn,

Thanks for your reply and attention.

1. As I read in Nick's report here
https://issues.apache.org/jira/browse/FLINK-13414?focusedCommentId=17257763&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-17257763
the Scala maintainers were blocked because Flink's source code could not
easily migrate from Scala 2.11 to newer versions. One strong reason is the
extensive use of Scala macros in the Flink Scala API, which is why a few
Scala users eventually developed 3rd-party Flink wrappers on top of the
Java API once that became possible.

2. A Scala wrapper is still needed because of the Scala type system and
object serialization in Flink. You cannot easily serialize a Scala product
type by ONLY using the Java API, and Scala collection types also differ
from the standard Java collections (see the short sketch at the end of
this mail). If that were not the case, I would of course not have started
this discussion and would simply keep using the Java API from Scala. The
same separation of Scala and Java classes can be found in the Akka and
Apache Spark code bases.

3. Another point I did not mention in the first email: the Scala code
examples in the Flink docs are much more readable thanks to the concise
language syntax. It would be very helpful to keep them in Flink and make
sure they work with Scala 2.13 and Scala 3. We would need to guarantee
that if a user takes a Scala code example from the Flink docs, it works
with the latest Scala version without any issue. Otherwise, Scala users
will run into problems unless they use an extra Scala wrapper on top of
the Java API. If that Scala wrapper is not an official part of the Flink
project, then it will be unsafe to use Scala at all. Günter mentioned
this in his reply as well.
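
To illustrate point 2, here is a minimal sketch of what happens when a
Scala case class (a product type) meets the plain Java API. The Click
class and the job name are invented for the example, and the wrapper
imports in the comments are only my recollection of findify/flink-scala-api,
so please double-check them against the library:

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment

// Hypothetical event type, used only for this illustration.
case class Click(userId: String, count: Long)

object SerializationSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // Plain Java API: the TypeExtractor cannot analyze a Scala case class
    // as a POJO (no default constructor, no setters), so it falls back to
    // a Kryo-backed GenericType serializer, which is slower and fragile
    // when the schema evolves.
    val clicks = env.fromElements(Click("alice", 1L), Click("bob", 2L))
    clicks.print()

    // With a Scala wrapper such as findify/flink-scala-api, a
    // TypeInformation for Click is instead derived at compile time from
    // the case class structure, e.g. (imports are an assumption, please
    // verify against the library):
    //   import io.findify.flink.api._
    //   import io.findify.flinkadt.api._
    //   val typedClicks = env.fromElements(Click("alice", 1L), Click("bob", 2L))

    env.execute("serialization-sketch")
  }
}

The same applies to Scala collections and Option: the Java type system
treats them as generic (Kryo) types as well.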

Best regards,
Alexey

On Mon, Apr 17, 2023 at 9:27 AM Martijn Visser <martijnvis...@apache.org>
wrote:

> Hi Alexey,
>
> > Taking into account my Scala experience for the last 8 years, I predict
> these wrappers will eventually be abandoned, unless such a Scala library is
> a part of some bigger community like ASF.
>
> For the past couple of years, there have been no maintainers for Scala in
> the Flink community. It was one of the reasons to deprecate the Scala APIs.
> Given that the wrappers don't seem to have taken off outside of Flink, why
> would moving them under the ASF resolve this?
>
> > Also, non-official Scala API will lead people to play safe and choose
> Java API only, even if they didn't want that at the beginning.
>
> Why would that be a problem? Wouldn't the fact that there are no
> maintainers for the Scala wrappers indicate that Scala users are actually
> fine with using the Java APIs, since otherwise there would have been
> improvements made to the Scala wrappers?
>
> Best regards,
>
> Martijn
>
> On Sun, Apr 16, 2023 at 11:47 AM David Morávek <d...@apache.org> wrote:
>
>> cc dev@f.a.o
>>
>> On Sun, Apr 16, 2023 at 11:42 AM David Morávek <d...@apache.org> wrote:
>>
>> > Hi Alexey,
>> >
>> > I'm a bit skeptical because, looking at the project, I see a couple of
>> red
>> > flags:
>> >
>> > - The project is inactive. The last release and commit are both from
>> > last May.
>> > - The project has not been adapted for the last two Flink versions,
>> which
>> > signals a lack of users.
>> > - All commits are by a single person, which could mean that there is no
>> > community around the project.
>> > - There was no external contribution (except the Scala bot).
>> > - There is no fork of the project (except the Scala bot).
>> >
>> > >  As far as I know, Findify does not want to or cannot maintain this library.
>> >
>> > Who are the users of the library? I'd assume Findify no longer uses it
>> if
>> > they're abandoning it.
>> >
>> > > which would be similar to the StateFun
>> >
>> > We're currently dealing with a lack of maintainers for StateFun, so we
>> > should have a solid building ground around the project to avoid the same
>> > issue.
>> >
>> >
>> > I think there is value in having a modern Scala API, but we should have
>> a
>> > bigger plan to address the future of Flink Scala APIs than importing an
>> > unmaintained library and calling it a day. I suggest starting a thread
>> on
>> > the dev ML and concluding the overall plan first.
>> >
>> > Best,
>> > D.
>> >
>> > On Sun, Apr 16, 2023 at 10:48 AM guenterh.lists <
>> guenterh.li...@bluewin.ch>
>> > wrote:
>> >
>> >> Hello Alexey
>> >>
>> >> Thank you for your initiative and your suggestion!
>> >>
>> >> I can only fully support the following statements in your email:
>> >>
>> >>  >Taking into account my Scala experience for the last 8 years, I
>> >> predict these wrappers will eventually be abandoned, unless such a
>> Scala
>> >> library is a part of some bigger community like ASF.
>> >>  >Also, non-official Scala API will lead people to play safe and choose
>> >> Java API only, even if they didn't want that at the beginning.
>> >>
>> >> The second sentence describes my current state.
>> >>
>> >>  From my point of view it would be very unfortunate if the Flink project
>> >> lost the Scala API and thus the integration of the concise, flexible
>> >> and future-oriented language constructs of the Scala language (and the
>> >> further development of version 3).
>> >>
>> >> Documentation of the API is essential. I would be interested in
>> >> supporting these efforts.
>> >>
>> >> Best wishes
>> >>
>> >> Günter
>> >>
>> >>
>> >> On 13.04.23 15:39, Alexey Novakov via user wrote:
>> >> > Hello Flink PMCs and Flink Scala Users,
>> >> >
>> >> > I would like to propose an idea to take the 3rd party Scala API
>> >> > findify/flink-scala-api <https://github.com/findify/flink-scala-api>
>> >> > project into the Apache Flink organization.
>> >> >
>> >> > *Motivation *
>> >> >
>> >> > The Scala-free Flink idea was finally implemented by the 1.15 release
>> >> and
>> >> > allowed Flink users to bring their own Scala version and use it via
>> the
>> >> > Flink Java API. See blog-post here: Scala Free in One Fifteen
>> >> > <https://flink.apache.org/2022/02/22/scala-free-in-one-fifteen/>.
>> Also,
>> >> > the existing Flink Scala API will be deprecated, because it is too hard
>> to
>> >> > upgrade it to Scala 2.13 or 3.
>> >> >
>> >> > Taking into account my Scala experience for the last 8 years, I
>> predict
>> >> > these wrappers will eventually be abandoned, unless such a Scala
>> >> library is
>> >> > a part of some bigger community like ASF.
>> >> > Also, non-official Scala API will lead people to play safe and choose
>> >> Java
>> >> > API only, even if they didn't want that at the beginning.
>> >> >
>> >> > https://github.com/findify/flink-scala-api has already advanced and
>> >> > implemented Scala support for 2.13 and 3 versions on top of Flink
>> Java
>> >> API.
>> >> > As far as I know, Findify does not want to or does not have the capacity
>> >> > to maintain
>> >> > this library. I propose to fork this great library and create a new
>> >> Flink
>> >> > project with its own version and build process (SBT, not Maven),
>> which
>> >> > would be similar to the StateFun or FlinkML projects.
>> >> >
>> >> > *Proposal *
>> >> >
>> >> > 1. Create a fork of findify/flink-scala-api and host in Apache Flink
>> Git
>> >> > space (PMCs please advise).
>> >> > 2. I and Roman
>> >> > <
>> >>
>> https://issues.apache.org/jira/secure/ViewProfile.jspa?name=rgrebennikov>
>> >> > would
>> >> > be willing to maintain this library in future for the next several
>> >> years.
>> >> > Further, we believe it will live on its own.
>> >> > 3. Flink Docs: PMCs, we need your guidelines here. One way I see is
>> to
>> >> > create new documentation in a similar way as StateFun docs.
>> >> Alternatively,
>> >> > we could just fix existing Flink Scala code examples to make sure
>> they
>> >> work
>> >> > with the new wrapper. In any case, I see docs will be upgraded/fixed
>> >> > gradually.
>> >> >
>> >> > I hope you will find this idea interesting and worth going forward.
>> >> >
>> >> > P.S. The irony here is that findify/flink-scala-api was also a fork
>> of
>> >> > Flink Scala-API some time ago, so we have a chance to close the loop
>> :-)
>> >> >
>> >> > Best regards.
>> >> > Alexey
>> >> >
>> >> --
>> >> Günter Hipler
>> >> https://openbiblio.social/@vog61
>> >> https://twitter.com/vog61
>> >>
>> >>
>>
>
