Need help when using kubernetes.jobmanager.annotations

2021-07-27 Thread Manong Karl
Hi Team, I have set "kubernetes.jobmanager.annotations", but I can't find
these annotations on the k8s Deployment, while they do show up on the
JobManager pod. Is this by design, or was it just missed?
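
For reference, a minimal sketch of how the option can be set programmatically
(the annotation keys and values below are just placeholders; the same option
can also be supplied as a -D dynamic property when deploying with the CLI):

import org.apache.flink.configuration.Configuration;

public class AnnotationsExample {
    public static void main(String[] args) {
        // Placeholder annotations; the value is a comma-separated list of key:value pairs.
        Configuration conf = new Configuration();
        conf.setString("kubernetes.jobmanager.annotations",
                "prometheus.io/scrape:true,team:data-platform");
        System.out.println(conf.getString("kubernetes.jobmanager.annotations", ""));
    }
}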


Re: Flink k8 HA mode + checkpoint management

2021-08-03 Thread Manong Karl
Can you please share your configs? I'm using native Kubernetes without HA
and there are no issues. I'm curious how this happens. AFAIK the job ID is
generated randomly.
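
For the first question quoted below (a new unique "checkpoint path" per
update), one option is to point state.checkpoints.dir at a per-deployment
directory so a fresh, non-restored restart never collides with old chk-*
directories. A minimal sketch, assuming the bucket and deployment id are
supplied by your deployment tooling:

import org.apache.flink.configuration.Configuration;

public class UniqueCheckpointPath {
    public static void main(String[] args) {
        // Hypothetical deployment identifier, e.g. a release tag or timestamp.
        String deploymentId = args.length > 0 ? args[0] : "release-2021-08-04";
        Configuration conf = new Configuration();
        // Each fresh deployment writes checkpoints under its own directory,
        // so a constant job ID cannot collide with an older chk-1.
        conf.setString("state.checkpoints.dir",
                "s3://my-bucket/flink-checkpoints/" + deploymentId);
        System.out.println(conf.getString("state.checkpoints.dir", ""));
    }
}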


Harsh Shah wrote on Wednesday, August 4, 2021 at 2:44 AM:

> Hello,
>
> I am trying to use Flink HA mode inside Kubernetes, in standalone mode.
> The Job ID is always constant, "". In
> situations where we restart the job (not from a checkpoint or savepoint),
> we see errors like
> """
>
> Caused by: org.apache.hadoop.fs.FileAlreadyExistsException:
> '/flink-checkpoints//chk-1/_metadata' already exists
>
> """
> where checkpoints have not been created since the restart of Job .
>
> My question:
> * Is the recommended way to set a new unique "checkpoint path" every time
> we update the job and restart the necessary k8s resources (i.e., when not
> restarting from a checkpoint or savepoint)? Or should we GC checkpoints
> during deletion and reload from a savepoint if required? Looking for a
> standard recommendation.
> * Is there a way I can override the JobID to be unique and indicate it is
> a complete restart in HA mode?
>
>
> Thanks,
> Harsh
>


Re: Got NoClassDefFoundError on Flink native K8S cluster

2021-08-21 Thread Manong Karl
Iceberg v0.11 and v0.12 are not compatible with Flink v1.13.x.

L. C. Hsieh wrote on Saturday, August 21, 2021 at 3:52 PM:

> Hi, I'm testing using Flink to write an Iceberg table. I run a Flink native K8S
> cluster locally and submit a simple Java program that writes out an Iceberg
> table (https://github.com/spancer/flink-iceberg-demo), but got an
> exception:
>
> java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration at
> org.apache.iceberg.flink.FlinkCatalogFactory.clusterHadoopConf(FlinkCatalogFactory.java:148)
> at
> org.apache.iceberg.flink.TableLoader.fromHadoopTable(TableLoader.java:49)
> at
> com.coomia.iceberg.test.IcebergReadWriteTest.main(IcebergReadWriteTest.java:89)
>
> The uploaded jar is a fat jar. I also checked the uploaded application jar;
> it has the Configuration class. So I don't know what is wrong there. Any idea
> or suggestion? Thanks.
>


Re: Got NoClassDefFoundError on Flink native K8S cluster

2021-08-22 Thread Manong Karl
I prefer using the Flink-bundled Hadoop, such as
https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar,
which may help.
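
If it is unclear which copy of the Hadoop classes the job actually sees at
runtime, a quick diagnostic sketch (run from the job's main method) is to
print where org.apache.hadoop.conf.Configuration is loaded from:

public class HadoopClassCheck {
    public static void main(String[] args) throws Exception {
        // Reports the classloader and the jar that provide Hadoop's Configuration,
        // which helps tell the fat-jar copy apart from one on Flink's own classpath.
        Class<?> clazz = Class.forName("org.apache.hadoop.conf.Configuration");
        System.out.println("Loaded by: " + clazz.getClassLoader());
        java.security.CodeSource src = clazz.getProtectionDomain().getCodeSource();
        System.out.println("From: " + (src != null ? src.getLocation() : "unknown"));
    }
}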

L. C. Hsieh wrote on Sunday, August 22, 2021 at 1:40 AM:

>
> BTW, I checked the dependency tree; the flink-iceberg demo only has one
> Hadoop common dependency, so I'm not sure why Flink throws such an exception.
> Based on the Flink docs, I suppose the Flink binary doesn't include Hadoop
> dependencies, right?
>
> Based on the exception, it looks like when FlinkCatalogFactory (from Iceberg)
> calls HadoopUtils.getHadoopConfiguration (from Flink), their classloaders are
> different and refer to different Hadoop Configuration class objects.
>
> I'm not familiar with Flink, so I'm wondering which step is wrong during the
> testing. It is a pretty simple test to verify Iceberg and Flink.
>
> On 2021/08/21 08:50:05, L. C. Hsieh  wrote:
> >
> > Thanks for replying.
> >
> > I'm using Flink 1.12.x. And I think Iceberg 0.12 uses Flink 1.12 actually.
> >
> > Once I upgraded Iceberg from 0.11.0 to 0.12.0 for the Java application, I
> > got a new exception as below:
> >
> > java.lang.LinkageError: loader constraint violation: when resolving method
> > "org.apache.flink.runtime.util.HadoopUtils.getHadoopConfiguration(Lorg/apache/flink/configuration/Configuration;)Lorg/apache/hadoop/conf/Configuration;"
> > the class loader (instance of org/apache/flink/util/ChildFirstClassLoader)
> > of the current class, org/apache/iceberg/flink/FlinkCatalogFactory, and the
> > class loader (instance of sun/misc/Launcher$AppClassLoader) for the
> > method's defining class, org/apache/flink/runtime/util/HadoopUtils, have
> > different Class objects for the type org/apache/hadoop/conf/Configuration
> > used in the signature at
> > org.apache.iceberg.flink.FlinkCatalogFactory.clusterHadoopConf(FlinkCatalogFactory.java:152)
> >
> >
> > On 2021/08/21 08:11:33, Manong Karl wrote:
> > > Iceberg v0.11 and v0.12 are not compatible with Flink v1.13.x.
> > >
> > > L. C. Hsieh wrote on Saturday, August 21, 2021 at 3:52 PM:
> > >
> > > > Hi, I'm testing using Flink to write an Iceberg table. I run a Flink
> > > > native K8S cluster locally and submit a simple Java program that writes
> > > > out an Iceberg table (https://github.com/spancer/flink-iceberg-demo),
> > > > but got an exception:
> > > >
> > > > java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration at
> > > > org.apache.iceberg.flink.FlinkCatalogFactory.clusterHadoopConf(FlinkCatalogFactory.java:148)
> > > > at org.apache.iceberg.flink.TableLoader.fromHadoopTable(TableLoader.java:49)
> > > > at com.coomia.iceberg.test.IcebergReadWriteTest.main(IcebergReadWriteTest.java:89)
> > > >
> > > > The uploaded jar is a fat jar. I also checked the uploaded application
> > > > jar; it has the Configuration class. So I don't know what is wrong
> > > > there. Any idea or suggestion? Thanks.
> > > >
> > >
> >
>
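
For the LinkageError quoted above, one possible workaround (a sketch, assuming
Hadoop is already available on Flink's own classpath) is to resolve the Hadoop
packages parent-first, so the child-first user-code classloader and Flink's
classpath agree on a single Configuration class:

import org.apache.flink.configuration.Configuration;

public class ParentFirstHadoop {
    public static void main(String[] args) {
        // Equivalent to the flink-conf.yaml entry:
        //   classloader.parent-first-patterns.additional: org.apache.hadoop.
        Configuration conf = new Configuration();
        conf.setString("classloader.parent-first-patterns.additional", "org.apache.hadoop.");
        System.out.println(conf.getString("classloader.parent-first-patterns.additional", ""));
    }
}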