[ 
https://issues.apache.org/jira/browse/FLINK-14108?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16939901#comment-16939901
 ] 

 Lasse Nedergaard commented on FLINK-14108:
-------------------------------------------

Sure. I'm on the road next week and at Flink Forward the week after. I will try 
to find the time for it.

> Support for Confluent Kafka schema registry for Avro serialisation 
> -------------------------------------------------------------------
>
>                 Key: FLINK-14108
>                 URL: https://issues.apache.org/jira/browse/FLINK-14108
>             Project: Flink
>          Issue Type: New Feature
>          Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
>    Affects Versions: 1.10.0
>            Reporter:  Lasse Nedergaard
>            Assignee:  Lasse Nedergaard
>            Priority: Minor
>
> The current implementation in flink-avro-confluent-registry supports 
> deserialization with schema lookup in the Confluent Kafka schema registry. 
> I would like support for serialization as well, following the same structure 
> as the deserialization side. With this feature it would be possible to use the 
> Confluent schema registry in a sink that writes Avro to Kafka and registers 
> the schema it uses at the same time.
> The test in TestAvroConsumerConfluent needs to be updated together with its 
> comment, as the comment indicates that the Confluent schema registry is used 
> for writing, while the example code actually uses SimpleStringSchema.
> We have a running version that we would like to give back to the community. 
> A rough sketch of the intended sink wiring is included below.
>  
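> To make the intent concrete, here is a minimal sketch of how such a 
> serialization schema could be wired into a Kafka sink. The class name 
> ConfluentRegistryAvroSerializationSchema and its forSpecific(...) signature 
> are only assumptions that mirror the existing 
> ConfluentRegistryAvroDeserializationSchema.forSpecific(Class, String); 
> UserEvent stands in for any generated Avro SpecificRecord class, and the 
> topic, subject and registry URL are just example values.
> {code:java}
> import java.util.Properties;
>
> import org.apache.flink.api.common.serialization.SerializationSchema;
> import org.apache.flink.streaming.api.datastream.DataStream;
> import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;
>
> public class ConfluentAvroSinkSketch {
>
>     // UserEvent is a placeholder for a generated Avro SpecificRecord class.
>     static void attachAvroRegistrySink(DataStream<UserEvent> events, Properties producerConfig) {
>
>         // Proposed factory method, analogous to the existing
>         // ConfluentRegistryAvroDeserializationSchema.forSpecific(Class, String):
>         SerializationSchema<UserEvent> serializationSchema =
>             ConfluentRegistryAvroSerializationSchema.forSpecific(
>                 UserEvent.class, "user-events-value", "http://registry:8081");
>
>         // Serializing through this schema would also register the writer schema
>         // under the given subject in the Confluent schema registry.
>         events.addSink(new FlinkKafkaProducer<>("user-events", serializationSchema, producerConfig));
>     }
> }
> {code}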



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
