Yes, Calcite uses apiguardian. To answer your question Aljoscha, no, I do
not use it directly. It's a dependency of the shaded Calcite version inside
the blink JAR.

On Thu, Nov 5, 2020 at 11:02 AM Timo Walther <twal...@apache.org> wrote:

> Hi Yuval,
>
> this error is indeed weird.
>
> @Aljoscha: I think Calcite uses apiguardian.
>
> When I saw the initial error, it looked like there are different Apache
> Calcite versions in the classpath. I'm wondering if this is a pure SBT
> issue, because I'm sure that other users would have reported this error
> earlier.
>
> Regards,
> Timo
>
>
> On 02.11.20 16:00, Aljoscha Krettek wrote:
> > But you're not using apiguardian yourself or have it as a dependency
> > before this, right?
> >
> > Best,
> > Aljoscha
> >
> > On 02.11.20 14:59, Yuval Itzchakov wrote:
> >> Yes, I'm using SBT.
> >>
> >> I managed to resolve this by adding:
> >>
> >>   "org.apiguardian" % "apiguardian-api" % "1.1.0"
> >>
> >> to the dependency list. Perhaps this dependency needs to be shaded as
> >> well in flink-core?
> >>
> >> My SBT looks roughly like this:
> >>
> >> lazy val flinkVersion = "1.11.2"
> >>
> >> libraryDependencies ++= Seq(
> >>   "org.apache.flink" %% "flink-table-planner-blink"    % flinkVersion,
> >>   "org.apache.flink" %% "flink-table-runtime-blink"    % flinkVersion,
> >>   "org.apache.flink" %% "flink-table-api-scala-bridge" % flinkVersion,
> >>   "org.apache.flink" %  "flink-s3-fs-hadoop"           % flinkVersion,
> >>   "org.apache.flink" %% "flink-container"              % flinkVersion,
> >>   "org.apache.flink" %% "flink-connector-kafka"        % flinkVersion,
> >>   "org.apache.flink" %  "flink-connector-base"         % flinkVersion,
> >>   "org.apache.flink" %  "flink-table-common"           % flinkVersion,
> >>   "org.apache.flink" %% "flink-cep"                    % flinkVersion,
> >>   "org.apache.flink" %% "flink-scala"                  % flinkVersion % "provided",
> >>   "org.apache.flink" %% "flink-streaming-scala"        % flinkVersion % "provided",
> >>   "org.apache.flink" %  "flink-json"                   % flinkVersion % "provided",
> >>   "org.apache.flink" %  "flink-avro"                   % flinkVersion % "provided",
> >>   "org.apache.flink" %% "flink-parquet"                % flinkVersion % "provided",
> >>   "org.apache.flink" %% "flink-runtime-web"            % flinkVersion % "provided",
> >>   "org.apache.flink" %% "flink-runtime"                % flinkVersion % "test" classifier "tests",
> >>   "org.apache.flink" %% "flink-streaming-java"         % flinkVersion % "test" classifier "tests",
> >>   "org.apache.flink" %% "flink-test-utils"             % flinkVersion % "test",
> >> )
> >>
> >> On Mon, Nov 2, 2020 at 3:21 PM Aljoscha Krettek <aljos...@apache.org>
> >> wrote:
> >>
> >>> @Timo and/or @Jark, have you seen this problem before?
> >>>
> >>> @Yuval, I'm assuming you're using sbt as a build system, is that
> >>> correct? Could you maybe also post a snippet of your build file that
> >>> shows the dependency setup, or maybe the whole file(s)?
> >>>
> >>> Best,
> >>> Aljoscha
> >>>
> >>> On 01.11.20 13:34, Yuval Itzchakov wrote:
> >>>> Hi,
> >>>>
> >>>> While trying to compile an application with a dependency on
> >>>> flink-table-planner-blink_2.12-1.11.2, I receive the following error
> >>>> message during compilation:
> >>>>
> >>>> scalac: While parsing annotations in
> >>>> /Library/Caches/Coursier/v1/https/repo1.maven.org/maven2/org/apache/flink/flink-table-planner-blink_2.12/1.11.2/flink-table-planner-blink_2.12-1.11.2.jar(org/apache/calcite/sql/SqlKind.class),
> >>>> could not find EXPERIMENTAL in enum <none>.
> >>>> This is likely due to an implementation restriction: an annotation
> >>>> argument cannot refer to a member of the annotated class
> >>>> (scala/bug#7014).
> >>>>
> >>>> Has anyone encountered this issue?
> >>>>

--
Best Regards,
Yuval Itzchakov.