It's a known issue that various connectors/wrappers/etc. did not respect the schema lifecycle.

This was fixed in 1.16.0 in https://issues.apache.org/jira/browse/FLINK-28807.

For earlier versions, you will have to lazily initialize the mapper in the serialize() method instead.
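A minimal sketch of that lazy-initialization pattern. Note the stand-ins: a real implementation would implement org.apache.flink.api.common.serialization.SerializationSchema<MyPojo> and hold a com.fasterxml.jackson.databind.ObjectMapper; here plain stdlib types are used so the sketch is self-contained, and the class name LazyJsonSchema is hypothetical.

```java
import java.nio.charset.StandardCharsets;

// Sketch of the workaround for Flink versions before 1.16.0, where open()
// on the schema is not reliably called by the sink (FLINK-28807).
public class LazyJsonSchema {

    // transient: the mapper is rebuilt on the task manager at runtime
    // rather than being serialized with the job graph.
    private transient Object mapper;

    public byte[] serialize(String element) {
        if (mapper == null) {
            // open() may never run before 1.16.0, so initialize on first
            // use here instead. Real code: mapper = new ObjectMapper();
            mapper = new Object();
        }
        // Real code: return ((ObjectMapper) mapper).writeValueAsBytes(element);
        return ("\"" + element + "\"").getBytes(StandardCharsets.UTF_8);
    }
}
```

The null check runs on every record, but after the first call it is a cheap branch; the important part is that the field is transient and rebuilt on the worker, exactly as open() would have done.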

On 24/01/2023 11:52, Peter Schrott wrote:
Hi Flink-User!


I recently updated a Flink job from Flink version 1.13 to 1.15 (managed by AWS). The Flink Job is written in Java.

I found out that the Kinesis Producer was deprecated in favour of the Kinesis Streams Sink [1]. When upgrading to the new sink I stumbled upon a problem with a custom serialisation schema. I am using a custom implementation of SerializationSchema to serialise result POJOs to JSON using Jackson's ObjectMapper. This ObjectMapper is initialised and set up in the open() method of the schema. The problem is that this open() method is never called initially.

I have not found any bug report or indications towards this issue. Is this known, or am I just “holding it wrong” (aka missing something)?

I created a minimal reproducible example in my GitHub repo: https://github.com/peterschrott/flink-sink-open


Best & Thanks,
Peter


[1] https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/connectors/datastream/kinesis/#kinesis-producer
