[ https://issues.apache.org/jira/browse/FLINK-20379?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Martijn Visser resolved FLINK-20379.
------------------------------------
    Resolution: Fixed

This ticket was still open, but the PR was merged. I've adjusted the fixVersion to 1.13.0.
Commit: d408241eebc5cc9743823598860ede71026e9306

> Update KafkaRecordDeserializationSchema to enable reuse of
> DeserializationSchema and KafkaDeserializationSchema
> ---------------------------------------------------------------------------------------------------------------
>
>                 Key: FLINK-20379
>                 URL: https://issues.apache.org/jira/browse/FLINK-20379
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / Kafka
>    Affects Versions: 1.12.0
>            Reporter: Stephan Ewen
>            Priority: Major
>              Labels: auto-deprioritized-critical, auto-unassigned, pull-request-available
>             Fix For: 1.13.0
>
>
> The new Kafka connector defines its own deserialization schema and is
> incompatible with the existing library of deserializers.
> That means users cannot use any of Flink's formats (Avro, JSON, CSV,
> Protobuf, Confluent Schema Registry, ...) with the new Kafka connector.
> I think we should change the new Kafka connector to use the existing
> deserialization classes, so that all formats can be used and users can
> reuse their deserializer implementations.
> It would also be good to support the existing KafkaDeserializationSchema;
> otherwise all users would need to migrate their sources again.

--
This message was sent by Atlassian Jira
(v8.20.1#820001)
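For context, the merged fix exposes factory methods on KafkaRecordDeserializationSchema so that an existing DeserializationSchema (via valueOnly) or KafkaDeserializationSchema (via of) can be plugged into the new KafkaSource directly. The sketch below illustrates that reuse; it assumes the Flink 1.13 flink-connector-kafka API, and the broker address and topic name are placeholders, not values from this ticket.

```java
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.reader.deserializer.KafkaRecordDeserializationSchema;

public class ReuseDeserializerExample {
    public static void main(String[] args) {
        // Wrap an existing DeserializationSchema (value-only). Any of Flink's
        // formats (Avro, JSON, CSV, ...) that implement DeserializationSchema
        // can be passed here instead of SimpleStringSchema.
        KafkaRecordDeserializationSchema<String> deserializer =
                KafkaRecordDeserializationSchema.valueOnly(new SimpleStringSchema());

        // An existing KafkaDeserializationSchema can likewise be reused with
        // KafkaRecordDeserializationSchema.of(myKafkaDeserializationSchema).

        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092") // placeholder address
                .setTopics("my-topic")                 // placeholder topic
                .setDeserializer(deserializer)
                .build();
    }
}
```

This avoids the migration cost the ticket mentions: deserializers written against the old connector's interfaces keep working with the new source.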