I prefer using the Flink-bundled shaded Hadoop jar, such as
https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar,
typically dropped into Flink's lib/ directory so Hadoop is provided by the
cluster classpath instead of the job jar. That may help.

L. C. Hsieh <vii...@apache.org> wrote on Sunday, August 22, 2021 at 1:40 AM:

>
> BTW, I checked the dependency tree, and the flink-iceberg demo has only one
> Hadoop common dependency, so I'm not sure why Flink throws such an exception.
> Based on the Flink docs, I suppose the Flink binary doesn't include Hadoop
> dependencies, right?
>
> Based on the exception, it looks like when FlinkCatalogFactory (from Iceberg)
> calls HadoopUtils.getHadoopConfiguration (from Flink), their classloaders
> are different and refer to different Hadoop Configuration Class objects.
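>
> For what it's worth, here is a minimal diagnostic sketch (illustrative only;
> it assumes the Iceberg Flink module and flink-runtime are both on the
> classpath) that could be dropped into the job's main() before the failing
> call, to show whether the two sides see the same Configuration class:
>
>     ClassLoader icebergLoader =
>         org.apache.iceberg.flink.FlinkCatalogFactory.class.getClassLoader();
>     ClassLoader flinkLoader =
>         org.apache.flink.runtime.util.HadoopUtils.class.getClassLoader();
>     // Load (without initializing) Hadoop's Configuration as each side sees it.
>     // Class.forName throws a checked ClassNotFoundException if one side
>     // cannot see the class at all, so declare `throws Exception` on main().
>     Class<?> viaIceberg =
>         Class.forName("org.apache.hadoop.conf.Configuration", false, icebergLoader);
>     Class<?> viaFlink =
>         Class.forName("org.apache.hadoop.conf.Configuration", false, flinkLoader);
>     System.out.println("Iceberg side: " + viaIceberg.getClassLoader());
>     System.out.println("Flink side:   " + viaFlink.getClassLoader());
>     System.out.println("Same Class object: " + (viaIceberg == viaFlink));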
>
> I'm not familiar with Flink, so I'm wondering which step in the testing is
> wrong. It is a pretty simple test to verify Iceberg and Flink.
>
> On 2021/08/21 08:50:05, L. C. Hsieh <vii...@apache.org> wrote:
> >
> > Thanks for replying.
> >
> > I'm using Flink 1.12.x, and I think Iceberg 0.12 actually uses Flink 1.12.
> >
> > Once I upgraded Iceberg from 0.11.0 to 0.12.0 for the Java application, I
> > got a new exception, shown below:
> >
> > java.lang.LinkageError: loader constraint violation: when resolving method
> > "org.apache.flink.runtime.util.HadoopUtils.getHadoopConfiguration(Lorg/apache/flink/configuration/Configuration;)Lorg/apache/hadoop/conf/Configuration;"
> > the class loader (instance of org/apache/flink/util/ChildFirstClassLoader)
> > of the current class, org/apache/iceberg/flink/FlinkCatalogFactory, and the
> > class loader (instance of sun/misc/Launcher$AppClassLoader) for the
> > method's defining class, org/apache/flink/runtime/util/HadoopUtils, have
> > different Class objects for the type org/apache/hadoop/conf/Configuration
> > used in the signature
> >     at org.apache.iceberg.flink.FlinkCatalogFactory.clusterHadoopConf(FlinkCatalogFactory.java:152)
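> >
> > For context, the error means two different classloaders each defined their
> > own copy of org/apache/hadoop/conf/Configuration, and the JVM refuses to
> > link a method signature that mixes them. A tiny standalone Java sketch
> > (hypothetical, not Flink-specific) reproduces the same mechanism:
> >
> >     import java.net.URL;
> >     import java.net.URLClassLoader;
> >
> >     public class LinkageDemo {
> >         public static void main(String[] args) throws Exception {
> >             // The classpath entry (dir or jar) this class was loaded from.
> >             URL cp = LinkageDemo.class.getProtectionDomain()
> >                     .getCodeSource().getLocation();
> >             // Two isolated loaders (null parent) over the same entry.
> >             ClassLoader a = new URLClassLoader(new URL[] {cp}, null);
> >             ClassLoader b = new URLClassLoader(new URL[] {cp}, null);
> >             Class<?> fromA = Class.forName("LinkageDemo", false, a);
> >             Class<?> fromB = Class.forName("LinkageDemo", false, b);
> >             // Same bytes, same name, but distinct Class objects; when two
> >             // such types meet in one method signature, the JVM throws a
> >             // LinkageError like the one above.
> >             System.out.println(fromA == fromB); // prints false
> >         }
> >     }
> >
> > (This suggests the job's fat jar and the cluster classpath each carry their
> > own Hadoop classes; keeping Hadoop out of one of them should avoid the clash.)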
> >
> >
> > On 2021/08/21 08:11:33, Manong Karl <abc549...@gmail.com> wrote:
> > > Iceberg v0.11 and v0.12 are not compatible with Flink v1.13.x.
> > >
> > > L. C. Hsieh <vii...@apache.org> wrote on Saturday, August 21, 2021 at 3:52 PM:
> > >
> > > > Hi, I'm testing using Flink to write an Iceberg table. I run a Flink
> > > > native K8s cluster locally and submit a simple Java program that writes
> > > > out an Iceberg table (https://github.com/spancer/flink-iceberg-demo),
> > > > but I got an exception:
> > > >
> > > > java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
> > > >     at org.apache.iceberg.flink.FlinkCatalogFactory.clusterHadoopConf(FlinkCatalogFactory.java:148)
> > > >     at org.apache.iceberg.flink.TableLoader.fromHadoopTable(TableLoader.java:49)
> > > >     at com.coomia.iceberg.test.IcebergReadWriteTest.main(IcebergReadWriteTest.java:89)
> > > >
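> > > > For reference, the failing call per the stack trace is roughly the
> > > > following (a sketch; the warehouse path is a placeholder):
> > > >
> > > >     // Per the trace, the NoClassDefFoundError is raised inside this
> > > >     // call: TableLoader.fromHadoopTable() goes through
> > > >     // FlinkCatalogFactory.clusterHadoopConf(), which needs
> > > >     // org.apache.hadoop.conf.Configuration resolvable at runtime.
> > > >     org.apache.iceberg.flink.TableLoader loader =
> > > >         org.apache.iceberg.flink.TableLoader.fromHadoopTable(
> > > >             "hdfs:///path/to/warehouse/db/table"); // placeholder path
> > > >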
> > > > The uploaded jar is a fat jar. I also checked the uploaded application
> > > > jar; it has the Configuration class. So I don't know what is wrong
> > > > there. Any idea or suggestion? Thanks.
> > > >
> > >
> >
>
