To: dev@flink.apache.org
Subject: [EXTERNAL] Re: Additional metadata available for Kafka serdes
Hi David!
I think passing the headers as a map (as opposed to
ConsumerRecord/ProducerRecord) is a great idea that should work. That way
the core Flink package doesn't take on Kafka dependencies; it seems like
they're meant to be decoupled anyway. The one bonus that using the Record
objects has is
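For illustration, here is a minimal sketch of what a map-based, header-aware deserialization hook could look like. The interface name, method signature, and the `apicurio.value.globalId` header key are all hypothetical placeholders, not actual Flink or Apicurio API:

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class HeaderAwareDeserializeSketch {

    // Hypothetical hook: headers arrive as a plain Map, so the core
    // interface needs no Kafka types (ConsumerRecord etc.) on its classpath.
    interface HeaderAwareDeserializer<T> {
        T deserialize(byte[] message, Map<String, byte[]> headers);
    }

    public static void main(String[] args) {
        // Toy deserializer that reads a (hypothetical) schema-id header
        // and prefixes the decoded payload with it.
        HeaderAwareDeserializer<String> schema = (message, headers) -> {
            byte[] schemaId = headers.get("apicurio.value.globalId");
            String id = schemaId == null
                    ? "none"
                    : new String(schemaId, StandardCharsets.UTF_8);
            return id + ":" + new String(message, StandardCharsets.UTF_8);
        };

        Map<String, byte[]> headers = new HashMap<>();
        headers.put("apicurio.value.globalId",
                "42".getBytes(StandardCharsets.UTF_8));
        System.out.println(schema.deserialize(
                "payload".getBytes(StandardCharsets.UTF_8), headers));
    }
}
```

The connector would be the only place that touches Kafka's Headers type, copying each header into the map before invoking the schema.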
Hi ,
I am currently prototyping an Avro Apicurio format that I hope to raise as a
FLIP very soon (hopefully by early next week). In my prototyping, I am
passing the Kafka headers content through as a map to the DeserializationSchema
and have extended the SerializationSchema to pass back
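Assuming the serialization side mirrors the deserialization idea (i.e. the schema passes headers back alongside the serialized bytes for the connector to attach to the outgoing record), it could be sketched roughly like this. The type and member names are hypothetical, not the actual FLIP proposal:

```java
import java.nio.charset.StandardCharsets;
import java.util.Map;

public class HeaderAwareSerializeSketch {

    // Hypothetical result carrier: the serialized bytes plus headers the
    // Kafka connector would copy onto the outgoing ProducerRecord.
    record SerializedWithHeaders(byte[] value, Map<String, byte[]> headers) {}

    interface HeaderAwareSerializer<T> {
        SerializedWithHeaders serialize(T element);
    }

    public static void main(String[] args) {
        // Toy serializer that emits the payload bytes plus a
        // (hypothetical) schema-id header for the registry lookup.
        HeaderAwareSerializer<String> schema = element ->
                new SerializedWithHeaders(
                        element.getBytes(StandardCharsets.UTF_8),
                        Map.of("apicurio.value.globalId",
                                "42".getBytes(StandardCharsets.UTF_8)));

        SerializedWithHeaders out = schema.serialize("payload");
        System.out.println(new String(out.value(), StandardCharsets.UTF_8));
        System.out.println(out.headers().containsKey("apicurio.value.globalId"));
    }
}
```

As on the read side, only the connector would deal in Kafka header types; the format itself stays Kafka-free.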