Re: dependency error with latest Kafka connector

2023-11-25 Thread guenterh.lists

Thanks Gordon!

I didn't know the name of the repository
https://repository.apache.org/content/repositories/orgapacheflink-1675/
So I learned something new as well.

Yes, with the new version I can add the dependency
"org.apache.flink" % "flink-connector-kafka" % "3.0.2-1.18",


and compile it without any errors.

Günter


On 25.11.23 17:40, Tzu-Li (Gordon) Tai wrote:

Hi Günter,

With Maven you'd list the staged repository holding the RC artifacts as a
repository:

```
<repositories>
  <repository>
    <id>test_kafka_rc</id>
    <name>Apache Flink Kafka Connector v3.0.2</name>
    <url>https://repository.apache.org/content/repositories/orgapacheflink-1675/</url>
  </repository>
</repositories>
```

With SBT, I think the equivalent is using Resolvers [1]:

```
resolvers += "Apache Flink Kafka Connector v3.0.2" at
  "https://repository.apache.org/content/repositories/orgapacheflink-1675/"
```
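
Putting the two together, a minimal build.sbt sketch for testing the RC could
look like this (the Scala version and the flink-clients dependency are
illustrative assumptions, not from Gordon's instructions; the connector
version string is the one Günter confirms above):

```
// minimal sketch, assuming Scala 3.3.0 and Flink 1.18.0
ThisBuild / scalaVersion := "3.3.0"

// staged repository holding the 3.0.2 RC artifacts
resolvers += "Apache Flink Kafka Connector v3.0.2" at
  "https://repository.apache.org/content/repositories/orgapacheflink-1675/"

libraryDependencies ++= Seq(
  "org.apache.flink" % "flink-clients" % "1.18.0" % Provided,
  "org.apache.flink" % "flink-connector-kafka" % "3.0.2-1.18"
)
```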

Hope that helps!

Best,
Gordon

[1] https://www.scala-sbt.org/1.x/docs/Resolvers.html

On Sat, Nov 25, 2023 at 12:55 AM guenterh.lists wrote:


Hi Gordon,

thanks for working on it.

How can I reference the repository for the new artifact? Referencing
3.0.2-18 I get an unresolved dependency error.

Thanks for a hint.

Günter

sbt:flink_essential_swrapper> compile
[info] Updating
[info] Resolved  dependencies
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run last update for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error
downloading org.apache.flink:flink-connector-kafka:3.0.2-18
[error]   Not found
[error]   Not found
[error]   not found:

/home/swissbib/.ivy2/local/org.apache.flink/flink-connector-kafka/3.0.2-18/ivys/ivy.xml
[error]   not found:

https://repo1.maven.org/maven2/org/apache/flink/flink-connector-kafka/3.0.2-18/flink-connector-kafka-3.0.2-18.pom


On 24.11.23 18:30, Tzu-Li (Gordon) Tai wrote:

Hi all,

I've cherry-picked FLINK-30400 onto v3.0 branch of flink-connector-kafka.

Treating this thread as justification to start a vote for 3.0.2 RC #1
immediately so we can get out a new release ASAP. Please see the vote
thread here [1].

@guenterh.lists Would you be able to test this
RC and see if the issue is resolved for you? It should work simply by
having a dependency on flink-streaming-java and flink-clients for 1.18.0,
as well as flink-connector-kafka 3.0.2-18. The flink-connector-base
dependency you added in the end as a workaround should not be needed.

Thanks,
Gordon

[1] https://lists.apache.org/thread/34zb5pnymfltrz607wqcb99h7675zdpj

On Fri, Nov 24, 2023 at 5:16 AM Leonard Xu  wrote:


 - built a fat uber jar from quickstart with Flink 1.18.0 for
 flink-streaming-java and flink-clients, and flink-connector-kafka version
 3.0.1-1.18
 - then submitted to local Flink cluster 1.18.0. Things worked as
 expected and the job ran fine.

Hey, @Gordon
I guess things may work as expected when you submit your fat jar job to the
cluster, because flink-connector-base (1.18.0 in this case) is included in
the flink-dist jar [1], which will appear on your classpath. But it may hit
this issue when you run in a local IDE environment; maybe you can run a
local test to verify this.
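
In sbt, the explicit base-connector workaround for local/IDE runs described
here amounts to one extra line, sketched under the assumption of Flink 1.18.0:

```
// only needed for local/IDE runs until the fixed connector is out;
// on a cluster, flink-dist already ships flink-connector-base
libraryDependencies += "org.apache.flink" % "flink-connector-base" % "1.18.0"
```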

In the end, I think we need to backport FLINK-30400 to the Flink Kafka
connector 3.0 branch and prepare a 3.0.2 soon.

Best,
Leonard
[1]


https://github.com/apache/flink/blob/977463cce3ea0f88e2f184c30720bf4e8e97fd4a/flink-dist/pom.xml#L156
--
Günter Hipler
https://openbiblio.social/@vog61
https://twitter.com/vog61



--
Günter Hipler
https://openbiblio.social/@vog61
https://twitter.com/vog61



Re: dependency error with latest Kafka connector

2023-11-25 Thread guenterh.lists

Hi Gordon,

thanks for working on it.

How can I reference the repository for the new artifact? Referencing
3.0.2-18 I get an unresolved dependency error.


Thanks for a hint.

Günter

sbt:flink_essential_swrapper> compile
[info] Updating
[info] Resolved  dependencies
[warn]
[warn]     Note: Unresolved dependencies path:
[error] stack trace is suppressed; run last update for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error 
downloading org.apache.flink:flink-connector-kafka:3.0.2-18

[error]   Not found
[error]   Not found
[error]   not found: 
/home/swissbib/.ivy2/local/org.apache.flink/flink-connector-kafka/3.0.2-18/ivys/ivy.xml
[error]   not found: 
https://repo1.maven.org/maven2/org/apache/flink/flink-connector-kafka/3.0.2-18/flink-connector-kafka-3.0.2-18.pom



On 24.11.23 18:30, Tzu-Li (Gordon) Tai wrote:

Hi all,

I've cherry-picked FLINK-30400 onto v3.0 branch of flink-connector-kafka.

Treating this thread as justification to start a vote for 3.0.2 RC #1
immediately so we can get out a new release ASAP. Please see the vote
thread here [1].

@guenterh.lists  Would you be able to test this
RC and see if the issue is resolved for you? It should work simply by
having a dependency on flink-streaming-java and flink-clients for 1.18.0,
as well as flink-connector-kafka 3.0.2-18. The flink-connector-base
dependency you added in the end as a workaround should not be needed.

Thanks,
Gordon

[1] https://lists.apache.org/thread/34zb5pnymfltrz607wqcb99h7675zdpj

On Fri, Nov 24, 2023 at 5:16 AM Leonard Xu  wrote:



- built a fat uber jar from quickstart with Flink 1.18.0 for
flink-streaming-java and flink-clients, and flink-connector-kafka version
3.0.1-1.18
- then submitted to local Flink cluster 1.18.0. Things worked as
expected and the job ran fine.

Hey, @Gordon
I guess things may work as expected when you submit your fat jar job to the
cluster, because flink-connector-base (1.18.0 in this case) is included in
the flink-dist jar [1], which will appear on your classpath. But it may hit
this issue when you run in a local IDE environment; maybe you can run a
local test to verify this.

In the end, I think we need to backport FLINK-30400 to the Flink Kafka
connector 3.0 branch and prepare a 3.0.2 soon.

Best,
Leonard
[1]
https://github.com/apache/flink/blob/977463cce3ea0f88e2f184c30720bf4e8e97fd4a/flink-dist/pom.xml#L156


--
Günter Hipler
https://openbiblio.social/@vog61
https://twitter.com/vog61



Re: dependency error with latest Kafka connector

2023-11-23 Thread guenterh.lists

Hi Danny

thanks for taking a look into it and for the hint.

Your assumption is correct - it compiles when the base connector is
excluded.


In sbt:
"org.apache.flink" % "flink-connector-kafka" % "3.0.1-1.18" 
exclude("org.apache.flink", "flink-connector-base"),


Günter


On 23.11.23 14:24, Danny Cranmer wrote:

Hey all,

I believe this is because of FLINK-30400 [1]. Looking at the pom I cannot see
any other dependencies that would cause a problem. To work around this, can
you try to remove that dependency from your build?


<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka</artifactId>
  <version>3.0.1-1.18</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-base</artifactId>
    </exclusion>
  </exclusions>
</dependency>


Alternatively you can add it in:


<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-base</artifactId>
  <version>1.18.0</version>
</dependency>

Sorry I am not sure how to do this in Scala SBT.
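
For sbt, the equivalent exclusion (which Günter confirms above) would be a
one-liner along these lines, a sketch rather than something from this mail:

```
// sbt counterpart of the Maven exclusions block above
libraryDependencies += "org.apache.flink" % "flink-connector-kafka" % "3.0.1-1.18" exclude("org.apache.flink", "flink-connector-base")
```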

Agree we should get this fixed and push a 3.0.2 Kafka connector.

Thanks,
Danny

[1] https://issues.apache.org/jira/browse/FLINK-30400

On Thu, Nov 23, 2023 at 12:39 PM Leonard Xu  wrote:


Hi, Günterh

It seems like a bug to me that the 3.0.1-1.18 Flink Kafka connector uses a
Flink 1.17 dependency, which leads to your issue.

I guess we need to propose a new release of the Kafka connector to fix this
issue.

CC: Gordon, Danny, Martijn

Best,
Leonard

On Nov 14, 2023 at 6:53 PM, Alexey Novakov via user wrote:

Hi Günterh,

It looks like a problem with the Kafka connector release.
https://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka/3.0.1-1.18
Compile dependencies are still pointing to Flink 1.17.

The release manager has already been contacted about this, or will be
contacted soon.

Best regards,
Alexey

On Mon, Nov 13, 2023 at 10:42 PM guenterh.lists wrote:


Hello

I'm getting a dependency error when using the latest Kafka connector in
a Scala project.

Using the 1.17.1 Kafka connector, compilation is OK.

With

"org.apache.flink" % "flink-connector-kafka" % "3.0.1-1.18"

I get
[error] (update) sbt.librarymanagement.ResolveException: Error
downloading org.apache.flink:flink-connector-base:
[error]   Not found
[error]   Not found
[error]   not found:

/home/swissbib/.ivy2/local/org.apache.flink/flink-connector-base/ivys/ivy.xml
[error]   not found:

https://repo1.maven.org/maven2/org/apache/flink/flink-connector-base//flink-connector-base-.pom

Seems Maven packaging is not correct.

My sbt build file:

ThisBuild / scalaVersion := "3.3.0"
val flinkVersion = "1.18.0"
val postgresVersion = "42.2.2"

lazy val root = (project in file(".")).settings(
  name := "flink-scala-proj",
  libraryDependencies ++= Seq(
    "org.flinkextended" %% "flink-scala-api" % "1.17.1_1.1.0",
    "org.apache.flink" % "flink-clients" % flinkVersion % Provided,
    "org.apache.flink" % "flink-connector-files" % flinkVersion % Provided,

    "org.apache.flink" % "flink-connector-kafka" % "1.17.1",
    //"org.apache.flink" % "flink-connector-kafka" % "3.0.1-1.18",

    //"org.apache.flink" % "flink-connector-jdbc" % "3.1.1-1.17",
    //"org.postgresql" % "postgresql" % postgresVersion,
    "org.apache.flink" % "flink-connector-files" % flinkVersion % Provided,
    //"org.apache.flink" % "flink-connector-base" % flinkVersion % Provided
  )
)



Thanks!

--
Günter Hipler
https://openbiblio.social/@vog61
https://twitter.com/vog61



--
Günter Hipler
https://openbiblio.social/@vog61
https://twitter.com/vog61



Re: dependency error with latest Kafka connector

2023-11-17 Thread guenterh.lists

Same behaviour and assumption on my side, Alexey. Thanks for testing it.

Günter

On 17.11.23 09:40, Alexey Novakov via user wrote:

I would expect *flink-connector-kafka:3.0.1-1.18* to point to
*org.apache.flink:flink-connector-base:1.18.0*, not to *1.17.0*.

However, SBT compiles my project fine using the following versioning:

val flinkVersion = "1.18.0"
val flinkVersion17 = "1.17.1"

val flinkDependencies = Seq(
"org.flinkextended" %% "flink-scala-api" % s"${flinkVersion17}_1.1.0", //
will support soon 1.18 too
"org.apache.flink" % "flink-connector-base" % flinkVersion % Provided,
"org.apache.flink" % "flink-connector-kafka" % "3.0.1-1.18" % Provided,
"org.apache.flink" % "flink-connector-files" % flinkVersion % Provided
)

Best regards,
Alexey

On Tue, Nov 14, 2023 at 1:45 PM Alexis Sarda-Espinosa <
sarda.espin...@gmail.com> wrote:


Isn't it expected that it points to 1.17? That version of the Kafka
connector is meant to be compatible with both Flink 1.17 and 1.18, right?
So the older version should be specified so that the consumer can decide
which Flink version to compile against, otherwise the build tool could
silently update the compile-only dependencies, no?

Regards,
Alexis.

On Tue, Nov 14, 2023 at 11:54 AM Alexey Novakov via user <
user@flink.apache.org> wrote:


Hi Günterh,

It looks like a problem with the Kafka connector release.
https://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka/3.0.1-1.18
Compile dependencies are still pointing to Flink 1.17.

The release manager has already been contacted about this, or will be
contacted soon.

Best regards,
Alexey

On Mon, Nov 13, 2023 at 10:42 PM guenterh.lists <
guenterh.li...@bluewin.ch> wrote:


Hello

I'm getting a dependency error when using the latest Kafka connector in
a Scala project.

Using the 1.17.1 Kafka connector, compilation is OK.

With

"org.apache.flink" % "flink-connector-kafka" % "3.0.1-1.18"

I get
[error] (update) sbt.librarymanagement.ResolveException: Error
downloading org.apache.flink:flink-connector-base:
[error]   Not found
[error]   Not found
[error]   not found:

/home/swissbib/.ivy2/local/org.apache.flink/flink-connector-base/ivys/ivy.xml
[error]   not found:

https://repo1.maven.org/maven2/org/apache/flink/flink-connector-base//flink-connector-base-.pom

Seems Maven packaging is not correct.

My sbt build file:

ThisBuild / scalaVersion := "3.3.0"
val flinkVersion = "1.18.0"
val postgresVersion = "42.2.2"

lazy val root = (project in file(".")).settings(
  name := "flink-scala-proj",
  libraryDependencies ++= Seq(
    "org.flinkextended" %% "flink-scala-api" % "1.17.1_1.1.0",
    "org.apache.flink" % "flink-clients" % flinkVersion % Provided,
    "org.apache.flink" % "flink-connector-files" % flinkVersion % Provided,

    "org.apache.flink" % "flink-connector-kafka" % "1.17.1",
    //"org.apache.flink" % "flink-connector-kafka" % "3.0.1-1.18",

    //"org.apache.flink" % "flink-connector-jdbc" % "3.1.1-1.17",
    //"org.postgresql" % "postgresql" % postgresVersion,
    "org.apache.flink" % "flink-connector-files" % flinkVersion % Provided,
    //"org.apache.flink" % "flink-connector-base" % flinkVersion % Provided
  )
)



Thanks!

--
Günter Hipler
https://openbiblio.social/@vog61
https://twitter.com/vog61



--
Günter Hipler
https://openbiblio.social/@vog61
https://twitter.com/vog61



dependency error with latest Kafka connector

2023-11-13 Thread guenterh.lists

Hello

I'm getting a dependency error when using the latest Kafka connector in 
a Scala project.


Using the 1.17.1 Kafka connector, compilation is OK.

With

"org.apache.flink" % "flink-connector-kafka" % "3.0.1-1.18"

I get
[error] (update) sbt.librarymanagement.ResolveException: Error 
downloading org.apache.flink:flink-connector-base:

[error]   Not found
[error]   Not found
[error]   not found: 
/home/swissbib/.ivy2/local/org.apache.flink/flink-connector-base/ivys/ivy.xml
[error]   not found: 
https://repo1.maven.org/maven2/org/apache/flink/flink-connector-base//flink-connector-base-.pom


Seems Maven packaging is not correct.

My sbt build file:

ThisBuild / scalaVersion := "3.3.0"
val flinkVersion = "1.18.0"
val postgresVersion = "42.2.2"

lazy val root = (project in file(".")).settings(
  name := "flink-scala-proj",
  libraryDependencies ++= Seq(
    "org.flinkextended" %% "flink-scala-api" % "1.17.1_1.1.0",
    "org.apache.flink" % "flink-clients" % flinkVersion % Provided,
    "org.apache.flink" % "flink-connector-files" % flinkVersion % Provided,

  "org.apache.flink" % "flink-connector-kafka" % "1.17.1",
  //"org.apache.flink" % "flink-connector-kafka" % "3.0.1-1.18",

  //"org.apache.flink" % "flink-connector-jdbc" % "3.1.1-1.17",
  //"org.postgresql" % "postgresql" % postgresVersion,
  "org.apache.flink" % "flink-connector-files" % flinkVersion % Provided,
  //"org.apache.flink" % "flink-connector-base" % flinkVersion % Provided
  )
)



Thanks!

--
Günter Hipler
https://openbiblio.social/@vog61
https://twitter.com/vog61



Re: [Discussion] - Take findify/flink-scala-api under Flink umbrella

2023-04-16 Thread guenterh.lists

Hello Alexey

Thank you for your initiative and your suggestion!

I can only fully support the following statements in your email:

>Taking into account my Scala experience for the last 8 years, I 
predict these wrappers will eventually be abandoned, unless such a Scala 
library is a part of some bigger community like ASF.
>Also, non-official Scala API will lead people to play safe and choose 
Java API only, even if they didn't want that at the beginning.


The second sentence describes my current state.

From my point of view it would be very unfortunate if the Flink project
were to lose the Scala API, and thus the integration of the concise,
flexible and future-oriented language constructs of the Scala language
(and the further development of version 3).


Documentation of the API is essential. I would be interested in supporting
these efforts.


Best wishes

Günter


On 13.04.23 15:39, Alexey Novakov via user wrote:

Hello Flink PMCs and Flink Scala Users,

I would like to propose an idea to take the 3rd party Scala API
findify/flink-scala-api 
project into the Apache Flink organization.

*Motivation *

The Scala-free Flink idea was finally implemented by the 1.15 release and
allowed Flink users to bring their own Scala version and use it via the
Flink Java API. See blog-post here: Scala Free in One Fifteen
. Also,
existing Flink Scala API will be deprecated, because it is too hard to
upgrade it to Scala 2.13 or 3.

Taking into account my Scala experience for the last 8 years, I predict
these wrappers will eventually be abandoned, unless such a Scala library is
a part of some bigger community like ASF.
Also, non-official Scala API will lead people to play safe and choose Java
API only, even if they didn't want that at the beginning.

https://github.com/findify/flink-scala-api has already advanced and
implemented Scala support for 2.13 and 3 versions on top of Flink Java API.
As far as I know, Findify does not want to, or does not have the capacity to,
maintain this library. I propose to fork this great library and create a new Flink
project with its own version and build process (SBT, not Maven), which
would be similar to the StateFun or FlinkML projects.

*Proposal *

1. Create a fork of findify/flink-scala-api and host in Apache Flink Git
space (PMCs please advise).
2. Roman and I would be willing to maintain this library in the future for
the next several years. Further, we believe it will live on its own.
3. Flink Docs: PMCs, we need your guidelines here. One way I see is to
create new documentation in a similar way as StateFun docs. Alternatively,
we could just fix existing Flink Scala code examples to make sure they work
with the new wrapper. In any case, I see docs will be upgraded/fixed
gradually.

I hope you will find this idea interesting and worth going forward.

P.S. The irony here is that findify/flink-scala-api was also a fork of
Flink Scala-API some time ago, so we have a chance to close the loop :-)

Best regards.
Alexey


--
Günter Hipler
https://openbiblio.social/@vog61
https://twitter.com/vog61



Re: videos Flink Forward San Francisco 2022

2022-10-12 Thread guenterh.lists

Thanks for your open feedback Jun - I appreciate it

Very best wishes from Basel

Günter

On 11.10.22 18:12, Jun Qin wrote:

Hi

Totally agree, rest assured that it was some venue limitations and 
some post-pandemic organizational challenges that meant no videos this 
year. Thanks a lot for the feedback and please let's stay positive and 
not draw the wrong conclusions.


Thanks
Jun

On Oct 10, 2022, at 2:39 PM, guenterh.lists wrote:


Really very sad - as far as I know this is happening for the first time.
Is this the attitude of the new Ververica?


Hopefully Immerok will carry on the open mentality of data Artisans.

Günter

On 10.10.22 11:26, Martijn Visser wrote:

Hi Günter,

I've understood that only the keynotes were recorded and not the 
other sessions.


Best regards,

Martijn

On Sun, Oct 9, 2022 at 4:10 PM guenterh.lists wrote:


Sorry if this question was already posted

By now only a few videos of the conference were published
(mainly the
keynotes)
https://www.youtube.com/playlist?list=PLDX4T_cnKjD10qp1y2B4sLNW5KL_P6RuB

Are the other presentations not going to be published?

Günter




--
Günter Hipler
University library Leipzig


Re: videos Flink Forward San Francisco 2022

2022-10-10 Thread guenterh.lists
Really very sad - as far as I know this is happening for the first time.
Is this the attitude of the new Ververica?


Hopefully Immerok will carry on the open mentality of data Artisans.

Günter

On 10.10.22 11:26, Martijn Visser wrote:

Hi Günter,

I've understood that only the keynotes were recorded and not the other 
sessions.


Best regards,

Martijn

On Sun, Oct 9, 2022 at 4:10 PM guenterh.lists wrote:


Sorry if this question was already posted

By now only a few videos of the conference were published (mainly the
keynotes)
https://www.youtube.com/playlist?list=PLDX4T_cnKjD10qp1y2B4sLNW5KL_P6RuB

Are the other presentations not going to be published?

Günter


videos Flink Forward San Francisco 2022

2022-10-09 Thread guenterh.lists

Sorry if this question was already posted

By now only a few videos of the conference were published (mainly the 
keynotes)

https://www.youtube.com/playlist?list=PLDX4T_cnKjD10qp1y2B4sLNW5KL_P6RuB

Are the other presentations not going to be published?

Günter



Re: [DISCUSS] FLIP-265 Deprecate and remove Scala API support

2022-10-09 Thread guenterh.lists

Hi Martijn

I do not maintain a large production application based on Flink, so it 
would not be a problem for me to convert existing implementations to 
whatever API.


I am working in the area of cultural heritage, which is mainly about the 
processing of structured (meta)-data (scientific libraries, archives and 
museums).
My impression: People without much background/experience with Java 
implementations find it easier to get into the functional mindset as 
supported in Scala. That's why I think it would be very unfortunate if 
the use of Scala in Flink becomes more and more limited or neglected.


I think using the Java API from Scala is a possible way in my
environment too.


In the last weeks I tried to port the examples from the "Flink Course" 
of Daniel Ciorcilan (https://rockthejvm.com/p/flink - he mainly offers 
Scala courses), which are exclusively based on the native Scala API, to 
the Java API. This has worked without any problems as far as I can see. 
So far I haven't tried any examples based on the Table API or streaming 
workflows in batch mode (which would be important for our environment).


My main trouble: So far I don't know enough about the limitations of
using the Java API in a Scala implementation and what they mean. My
current understanding: the limitation is mainly in deriving the type
information in generic APIs with Scala types. For me it would be very
significant and helpful if there were more information, descriptions
and examples about this topic.
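
To make the question concrete, here is a minimal sketch of the kind of code
I mean, using the plain Java DataStream API from Scala (the Book case class
and the whole setup are made up for illustration): the Java type extractor
cannot derive the result type of a Scala lambda, so it has to be supplied
explicitly, and a case class is handled as a generic (Kryo) type rather
than a POJO.

```
import org.apache.flink.api.common.typeinfo.{TypeHint, TypeInformation}
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment

// hypothetical record type, only for illustration
case class Book(title: String, year: Int)

object JavaApiFromScala {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env
      .fromElements(Book("a", 2020), Book("b", 2021))
      .map(b => b.copy(year = b.year + 1))
      // the Java type extractor cannot see through the Scala lambda above,
      // so the result type is passed explicitly; the case class falls back
      // to generic (Kryo) serialization since Flink does not treat it as a POJO
      .returns(TypeInformation.of(new TypeHint[Book]() {}))
      .print()
    env.execute("java-api-from-scala")
  }
}
```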


So far, unfortunately, I have had too little time to look into a wrapper like
flink-scala-api (https://github.com/findify/flink-scala-api), and the
current alternative is probably going to be deprecated in the future
(https://github.com/ariskk/flink4s/issues/17#issuecomment-1125806808).


Günter


On 04.10.22 13:58, Martijn Visser wrote:

Hi Márton,

You're making a good point. I originally wanted to include the User
mailing list to get their feedback but forgot to do so. I'll do
some more outreach via other channels as well.


@Users of Flink, I've made a proposal to deprecate and remove Scala 
API support in a future version of Flink. Your feedback on this topic 
is very much appreciated.


Regarding the large Scala codebase for Flink, a potential alternative 
could be to have a wrapper for all Java APIs that makes them available 
as Scala APIs. However, this still requires Scala maintainers and I 
don't think that we currently have those in our community. The easiest 
solution for them would be to use the Java APIs directly. Yes it would 
involve work, but we won't actually be able to remove the Scala APIs 
until Flink 2.0 so there's still time for that :)


Best regards,

Martijn

On Tue, Oct 4, 2022 at 1:26 AM Márton Balassi wrote:


Hi Martijn,

Thanks for compiling the FLIP. I agree with the sentiment that Scala poses
considerable maintenance overhead and key improvements (like 2.13 or 2.12.8
support) are hanging stale. With that said, before we make this move we
should attempt to understand the userbase affected.
A quick Slack and user mailing list search does return quite a few results
for Scala (admittedly, a cursory look at them suggests that many of them
have to do with missing features in Scala that exist in Java, or Scala
versions). I would love to see some polls on this topic; we could also use
the Flink twitter handle to ask the community about this.

I am aware of users having large existing Scala codebases for Flink. This
move would pose a very large effort on them, as they would need to rewrite
much of their existing code. What are the alternatives in your opinion,
Martijn?

On Tue, Oct 4, 2022 at 6:22 AM Martijn Visser

wrote:

> Hi everyone,
>
> I would like to open a discussion thread on FLIP-265 Deprecate
and remove
> Scala API support. Please take a look at
>
>

https://cwiki.apache.org/confluence/display/FLINK/FLIP-265+Deprecate+and+remove+Scala+API+support
> and provide your feedback.
>
> Best regards,
>
> Martijn
> https://twitter.com/MartijnVisser82
> https://github.com/MartijnVisser
>


--
Günter Hipler
University library Leipzig


Re: use of Scala versions >= 2.13 in Flink 1.15

2021-12-07 Thread guenterh.lists

Hi Chesnay,

thanks for the info - this is really good news for us.

I set up a playground using the snapshot from yesterday [1] and a really
quick and short job using Scala 2.13 [2].


The job starts and returns correct results. Even the use of a case class 
against the Java API is possible.


Then I made a second try with the same job (compiled with Scala 2.13.6) 
running on a Flink 1.14 cluster which was again successful.


My question:
Is this compilation with Scala versions >= 2.13 already part of 1.14, or
is my example so small and simple that binary incompatibilities between
the versions don't matter?


Günter


[1] 
https://gitlab.com/guenterh/flink_1_15_scala_2_13/-/tree/main/flink-1.15-SNAPSHOT
[2] 
https://gitlab.com/guenterh/flink_1_15_scala_2_13/-/blob/main/flink_scala_213/build.sbt#L12

https://gitlab.com/guenterh/flink_1_15_scala_2_13/-/blob/main/flink_scala_213/src/main/scala/de/ub/unileipzig/Job.scala#L8


On 06.12.21 13:59, Chesnay Schepler wrote:
With regards to the Java APIs, you will definitely be able to use the 
Java DataSet/DataStream APIs from Scala without any restrictions 
imposed by Flink. This is already working with the current SNAPSHOT 
version.


As we speak we are also working to achieve the same for the Table API; 
we expect to achieve that but with some caveats (i.e., if you use the 
Python API or the Hive connector then you still need to use the Scala 
version provided by Flink).


As for the Scala APIs, we haven't really decided yet how this will 
work in the future. However, one of the big benefits of the Scala-free 
runtime is that it should now be easier for us to release the APIs for 
more Scala versions.


On 06/12/2021 11:47, guenterh.lists wrote:

Dear list,

there have been some discussions and activities in the last months
about a Scala-free runtime, which should make it possible to use newer
Scala versions (>= 2.13 / 3.x) on the application side.


Stephan Ewen announced the implementation is on the way [1] and
Martijn Visser mentioned in the ask-me-anything session on version
1.14 that it is planned to make this possible in the upcoming 1.15
version (~ next February) [2].


This would be very nice for our recently started project, where we
are discussing which tools and infrastructure to use. "Personally" I
would prefer that people with less experience on the JVM could make
their start and gain first experiences with a "pythonized" Scala,
using the latest versions of the language (2.13.x or maybe 3.x).


My question: Do you think your plans to provide the possibility of a
Scala-free runtime with the upcoming version are still realistic?


Out of curiosity: If you can make this possible and applications with
current Scala versions are going to use the Java APIs of Flink, what's
the future of the current Scala API of Flink, where you have to decide
to use either Scala 2.11 or <2.12.8?

Is this then still possible as an alternative?

Thanks for some hints for our planning and decisions

Günter




[1] https://twitter.com/data_fly/status/1415012793347149830
[2] https://www.youtube.com/watch?v=wODmlow0ip0




--
Günter Hipler
University library Leipzig



use of Scala versions >= 2.13 in Flink 1.15

2021-12-06 Thread guenterh.lists

Dear list,

there have been some discussions and activities in the last months about
a Scala-free runtime, which should make it possible to use newer Scala
versions (>= 2.13 / 3.x) on the application side.


Stephan Ewen announced the implementation is on the way [1] and Martijn
Visser mentioned in the ask-me-anything session on version 1.14 that it
is planned to make this possible in the upcoming 1.15 version (~ next
February) [2].


This would be very nice for our recently started project, where we are
discussing which tools and infrastructure to use. "Personally" I would
prefer that people with less experience on the JVM could make their
start and gain first experiences with a "pythonized" Scala, using the
latest versions of the language (2.13.x or maybe 3.x).


My question: Do you think your plans to provide the possibility of a
Scala-free runtime with the upcoming version are still realistic?


Out of curiosity: If you can make this possible and applications with
current Scala versions are going to use the Java APIs of Flink, what's
the future of the current Scala API of Flink, where you have to decide to
use either Scala 2.11 or <2.12.8?

Is this then still possible as an alternative?

Thanks for some hints for our planning and decisions

Günter




[1] https://twitter.com/data_fly/status/1415012793347149830
[2] https://www.youtube.com/watch?v=wODmlow0ip0

--
Günter Hipler
university library Leipzig



Re: docker based taskmanager can't connect to job/resource manager

2021-05-12 Thread guenterh.lists

Hi Guowei,

thanks for your reply! This information was still missing. The presenter 
mentioned the documentation but I hadn't found it. So your link to the 
specific place is valuable too.


Günter

On 13.05.21 06:09, Guowei Ma wrote:

Hi,
I have not tried it. But from the documentation [1] it seems that you might
need to add "jobmanager.rpc.address: jobmanager" to
the FLINK_PROPERTIES before creating the network.


[1]
https://ci.apache.org/projects/flink/flink-docs-master/docs/deployment/resource-providers/standalone/docker/


Best,
Guowei
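
Applied to the commands quoted below, Guowei's suggestion amounts to one
extra entry in FLINK_PROPERTIES; a sketch, with everything else unchanged:

```
FLINK_PROPERTIES="jobmanager.rpc.address: jobmanager
jobmanager.memory.process.size: 2048m
parallelism.default: 4
"
# jobmanager and taskmanager are then started exactly as below, both with
# --env FLINK_PROPERTIES="${FLINK_PROPERTIES}" on the flink-network network
```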


On Thu, May 13, 2021 at 3:56 AM guenterh.lists <guenterh.li...@bluewin.ch> wrote:


Hi,

I'm trying to start a mini cluster following the explanations
given in a
flink forward presentation [1]

Starting a jobmanager task is possible

FLINK_PROPERTIES="jobmanager.memory.process.size: 2048m
parallelism.default: 4
"
docker network create flink-network

docker run  \
--rm   \
--name=jobmanager  \
--network flink-network \
-p 8081:8081  \
--env FLINK_PROPERTIES="${FLINK_PROPERTIES}"  \
flink:1.13.0-scala_2.12-java11 jobmanager


Unfortunately the taskmanager process can't connect

docker run  \
--rm   \
--name=taskmanager  \
--network flink-network \
--env FLINK_PROPERTIES="${FLINK_PROPERTIES}"  \
flink:1.13.0-scala_2.12-java11 taskmanager

2021-05-12 19:43:11,396 INFO
org.apache.flink.runtime.net.ConnectionUtils [] - Failed
to connect from address '/172.20.0.3': Connection refused (Connection
refused)
2021-05-12 19:44:26,082 WARN
akka.remote.transport.netty.NettyTransport [] - Remote
connection to [null] failed with java.net.ConnectException: Connection
refused: 5e8efb79f191/172.20.0.3:6123
2021-05-12 19:44:26,084 INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor [] - Could
not resolve ResourceManager address
akka.tcp://flink@5e8efb79f191:6123/user/rpc/resourcemanager_*, retrying
in 1 ms: Could not connect to rpc endpoint under address
akka.tcp://flink@5e8efb79f191:6123/user/rpc/resourcemanager_*.
2021-05-12 19:44:26,084 WARN
akka.remote.ReliableDeliverySupervisor [] -
Association with remote system [akka.tcp://flink@5e8efb79f191:6123] has
failed, address is now gated for [50] ms. Reason: [Association failed
with [akka.tcp://flink@5e8efb79f191:6123]] Caused by:
[java.net.ConnectException: Connection refused:
5e8efb79f191/172.20.0.3:6123]

and the dashboard (of the jobmanager task) doesn't show the
taskmanager
process (as I would expect)

Any hints? - Thanks!

Günter


[1]
https://www.youtube.com/watch?v=VVh6ikd-l9s&list=PLDX4T_cnKjD054YExbUOkr_xdYknVPQUm&index=45
"Flink's New Dockerfile: One File to Rule Them All"



docker based taskmanager can't connect to job/resource manager

2021-05-12 Thread guenterh.lists

Hi,

I'm trying to start a mini cluster following the explanations given in a 
flink forward presentation [1]


Starting a jobmanager task is possible

FLINK_PROPERTIES="jobmanager.memory.process.size: 2048m
parallelism.default: 4
"
docker network create flink-network

docker run  \
--rm   \
--name=jobmanager  \
--network flink-network \
-p 8081:8081  \
--env FLINK_PROPERTIES="${FLINK_PROPERTIES}"  \
flink:1.13.0-scala_2.12-java11 jobmanager


Unfortunately the taskmanager process can't connect

docker run  \
--rm   \
--name=taskmanager  \
--network flink-network \
--env FLINK_PROPERTIES="${FLINK_PROPERTIES}"  \
flink:1.13.0-scala_2.12-java11 taskmanager

2021-05-12 19:43:11,396 INFO 
org.apache.flink.runtime.net.ConnectionUtils [] - Failed 
to connect from address '/172.20.0.3': Connection refused (Connection 
refused)
2021-05-12 19:44:26,082 WARN 
akka.remote.transport.netty.NettyTransport   [] - Remote 
connection to [null] failed with java.net.ConnectException: Connection 
refused: 5e8efb79f191/172.20.0.3:6123
2021-05-12 19:44:26,084 INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor   [] - Could 
not resolve ResourceManager address 
akka.tcp://flink@5e8efb79f191:6123/user/rpc/resourcemanager_*, retrying 
in 1 ms: Could not connect to rpc endpoint under address 
akka.tcp://flink@5e8efb79f191:6123/user/rpc/resourcemanager_*.
2021-05-12 19:44:26,084 WARN 
akka.remote.ReliableDeliverySupervisor   [] - 
Association with remote system [akka.tcp://flink@5e8efb79f191:6123] has 
failed, address is now gated for [50] ms. Reason: [Association failed 
with [akka.tcp://flink@5e8efb79f191:6123]] Caused by: 
[java.net.ConnectException: Connection refused: 
5e8efb79f191/172.20.0.3:6123]


and the dashboard (of the jobmanager task) doesn't show the taskmanager 
process (as I would expect)


Any hints? - Thanks!

Günter


[1] 
https://www.youtube.com/watch?v=VVh6ikd-l9s&list=PLDX4T_cnKjD054YExbUOkr_xdYknVPQUm&index=45

"Flink's New Dockerfile: One File to Rule Them All"