There is out-of-the-box support, but just for a Kafka source:

https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kafka.html#the-deserializationschema


   - There is also a version of this schema available that can look up the
   writer's schema (the schema that was used to write the record) in the
   Confluent Schema Registry
   <https://docs.confluent.io/current/schema-registry/docs/index.html>.
   Using this deserialization schema, the record will be read with the schema
   retrieved from the Schema Registry and transformed to a statically provided
   reader's schema (obtained either through
   ConfluentRegistryAvroDeserializationSchema.forGeneric(...) or
   ConfluentRegistryAvroDeserializationSchema.forSpecific(...)).
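
For completeness, here is a rough sketch of wiring that registry-aware
deserializer into a Kafka source with forGeneric(...). The topic name,
bootstrap servers, registry URL, and reader's schema below are placeholders,
not anything from the docs:

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

import java.util.Properties;

public class RegistryKafkaSourceSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
            StreamExecutionEnvironment.getExecutionEnvironment();

        // Statically provided reader's schema (placeholder). Records are
        // decoded with the writer's schema fetched from the registry and
        // then evolved to this schema.
        Schema readerSchema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Event\",\"fields\":["
            + "{\"name\":\"id\",\"type\":\"string\"}]}");

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder
        props.setProperty("group.id", "example-group");           // placeholder

        FlinkKafkaConsumer<GenericRecord> consumer = new FlinkKafkaConsumer<>(
            "events", // placeholder topic
            ConfluentRegistryAvroDeserializationSchema.forGeneric(
                readerSchema,
                "http://localhost:8081"), // placeholder registry URL
            props);

        env.addSource(consumer).print();
        env.execute("confluent-registry-source-sketch");
    }
}

forSpecific(MyRecord.class, registryUrl) works the same way when you have
Avro-generated classes on the classpath instead of GenericRecord.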


On Wed, Sep 11, 2019 at 1:48 PM Lasse Nedergaard <lassenederga...@gmail.com>
wrote:

> Hi.
> Does Flink have out-of-the-box support for Kafka Schema Registry for both
> sources and sinks?
> If not, does anyone know of an implementation we can build on, so we can
> help make it generally available in a future release?
>
> Med venlig hilsen / Best regards
> Lasse Nedergaard
>
>
