No problem, glad to hear that it's working now!

With release candidates, we always publish the URL for the staged artifacts in
the release candidate vote thread so that you can point your code at them and
compile against them for testing purposes.
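
As a rough sbt sketch, pointing a build at this RC's staging repository looks
something like the following (the repository URL and the connector coordinates
are the ones quoted further down in this thread; the resolver name is
arbitrary):

```
// build.sbt (sketch): resolve the RC artifacts from the Apache staging repository
resolvers += "Apache Flink Kafka Connector 3.0.2 RC" at "https://repository.apache.org/content/repositories/orgapacheflink-1675/"

// RC version of the connector, matching the coordinates mentioned in this thread
libraryDependencies += "org.apache.flink" % "flink-connector-kafka" % "3.0.2-1.18"
```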

It would be great to have your +1 on the vote thread for the 3.0.2 Kafka connector release.

Best,
Gordon

On Sat, Nov 25, 2023, 10:14 bobobabo <bobob...@bluewin.ch> wrote:

> Thanks Gordon!
>
> I didn't know the name of the staging repository,
> https://repository.apache.org/content/repositories/orgapacheflink-1675/
> so that's another thing learned.
>
> Yes, with the new version I can add the dependency
> "org.apache.flink" % "flink-connector-kafka" % "3.0.2-1.18",
>
>
> and compile it without any errors.
>
> Günter
>
>
> On 25.11.23 17:40, Tzu-Li (Gordon) Tai wrote:
> > Hi Günter,
> >
> > With Maven you'd list the staged repository holding the RC artifacts as a
> > repository:
> >
> > ```
> > <repositories>
> >     <repository>
> >        <id>test_kafka_rc</id>
> >        <name>Apache Flink Kafka Connector v3.0.2</name>
> >        <url>https://repository.apache.org/content/repositories/orgapacheflink-1675/</url>
> >     </repository>
> > </repositories>
> > ```
> >
> > With SBT, I think the equivalent is using Resolvers [1]:
> >
> > ```
> > resolvers += "Apache Flink Kafka Connector v3.0.2" at "https://repository.apache.org/content/repositories/orgapacheflink-1675/"
> >
> > Hope that helps!
> >
> > Best,
> > Gordon
> >
> > [1] https://www.scala-sbt.org/1.x/docs/Resolvers.html
> >
> >>> On Sat, Nov 25, 2023 at 12:55 AM guenterh.lists <guenterh.li...@bluewin.ch>
> >>> wrote:
> >
> >> Hi Gordon,
> >>
> >> thanks for working on it.
> >>
> >> How can I reference the repository for the new artifact? When referencing
> >> 3.0.2-18 I get an unresolved dependency error.
> >>
> >> Thanks for a hint.
> >>
> >> Günter
> >>
> >> sbt:flink_essential_swrapper> compile
> >> [info] Updating
> >> [info] Resolved  dependencies
> >> [warn]
> >> [warn]     Note: Unresolved dependencies path:
> >> [error] stack trace is suppressed; run last update for the full output
> >> [error] (update) sbt.librarymanagement.ResolveException: Error
> >> downloading org.apache.flink:flink-connector-kafka:3.0.2-18
> >> [error]   Not found
> >> [error]   Not found
> >> [error]   not found:
> >> /home/swissbib/.ivy2/local/org.apache.flink/flink-connector-kafka/3.0.2-18/ivys/ivy.xml
> >> [error]   not found:
> >> https://repo1.maven.org/maven2/org/apache/flink/flink-connector-kafka/3.0.2-18/flink-connector-kafka-3.0.2-18.pom
> >>
> >>
> >> On 24.11.23 18:30, Tzu-Li (Gordon) Tai wrote:
> >>> Hi all,
> >>>
> >>> I've cherry-picked FLINK-30400 onto the v3.0 branch of
> >>> flink-connector-kafka.
> >>>
> >>> Treating this thread as justification to start a vote for 3.0.2 RC #1
> >>> immediately so we can get out a new release ASAP. Please see the vote
> >>> thread here [1].
> >>>
> >>> @guenterh.lists <guenterh.li...@bluewin.ch> Would you be able to test this
> >>> RC and see if the issue is resolved for you? It should work simply by
> >>> having a dependency on flink-streaming-java and flink-clients for 1.18.0,
> >>> as well as flink-connector-kafka 3.0.2-1.18. The flink-connector-base
> >>> dependency you added in the end as a workaround should not be needed.
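> >>>
> >>> In sbt terms, a rough sketch of that setup (the staging repository URL is
> >>> the one published in the vote thread [1]; the resolver name and the
> >>> flinkVersion helper are just placeholders, and the version strings are the
> >>> ones for this RC):
> >>>
> >>> ```
> >>> // Sketch only: test-build dependencies for this RC, per the description above
> >>> val flinkVersion = "1.18.0"
> >>> resolvers += "Flink Kafka Connector RC staging" at "https://repository.apache.org/content/repositories/orgapacheflink-1675/"
> >>> libraryDependencies ++= Seq(
> >>>   "org.apache.flink" % "flink-streaming-java"  % flinkVersion,
> >>>   "org.apache.flink" % "flink-clients"         % flinkVersion,
> >>>   // no explicit flink-connector-base -- it should come in transitively
> >>>   "org.apache.flink" % "flink-connector-kafka" % "3.0.2-1.18"
> >>> )
> >>> ```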
> >>>
> >>> Thanks,
> >>> Gordon
> >>>
> >>> [1] https://lists.apache.org/thread/34zb5pnymfltrz607wqcb99h7675zdpj
> >>>
> >>> On Fri, Nov 24, 2023 at 5:16 AM Leonard Xu <xbjt...@gmail.com> wrote:
> >>>
> >>>>      - built a fat uber jar from quickstart with Flink 1.18.0 for
> >>>>      flink-streaming-java and flink-clients, and flink-connector-kafka
> >>>>      version 3.0.1-1.18
> >>>>      - then submitted to local Flink cluster 1.18.0. Things worked as
> >>>>      expected and the job ran fine.
> >>>>
> >>>> Hey @Gordon,
> >>>> I guess things may work as expected when you submit your fat jar job to
> >>>> the cluster, because flink-connector-base (1.18.0 in this case) is
> >>>> included in the flink-dist jar [1], which will be on your classpath
> >>>> there, but it may hit an issue when you run in a local IDE environment;
> >>>> maybe you can run a local test to verify this.
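> >>>>
> >>>> For the local IDE case, the workaround mentioned earlier in the thread
> >>>> (declaring flink-connector-base explicitly) would look roughly like this
> >>>> as an sbt sketch, with the version assumed to match the Flink release:
> >>>>
> >>>> ```
> >>>> // Workaround sketch for local IDE runs only: put flink-connector-base on
> >>>> // the classpath explicitly, since flink-dist is not present there.
> >>>> libraryDependencies += "org.apache.flink" % "flink-connector-base" % "1.18.0"
> >>>> ```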
> >>>>
> >>>> In the end, I think we need to backport FLINK-30400 to the Flink Kafka
> >>>> connector 3.0 branch and prepare a 3.0.2 soon.
> >>>>
> >>>> Best,
> >>>> Leonard
> >>>> [1]
> >>>> https://github.com/apache/flink/blob/977463cce3ea0f88e2f184c30720bf4e8e97fd4a/flink-dist/pom.xml#L156
> >> --
> >> Günter Hipler
> >> https://openbiblio.social/@vog61
> >> https://twitter.com/vog61
> >>
> >>
>
