RE: Additional metadata available for Kafka serdes

2024-03-20 Thread David Radley
To: dev@flink.apache.org Subject: Re: Additional metadata available for Kafka serdes

Re: Additional metadata available for Kafka serdes

2024-03-14 Thread Balint Bene
Hi David! I think passing the headers as a map (as opposed to ConsumerRecord/ProducerRecord) is a great idea that should work. That way the core Flink package doesn't have Kafka dependencies; it seems like they're meant to be decoupled anyway. The one bonus that using the Record objects has is ...
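
As a rough illustration of the idea Balint describes (purely a sketch under assumed names, not the actual proposal; HeaderAwareDeserializationSchema and KafkaHeaderAdapter are made up for this example), the format-facing API could accept headers as a plain java.util.Map, while only connector-side code ever touches Kafka types:

    import java.io.IOException;
    import java.io.Serializable;
    import java.util.LinkedHashMap;
    import java.util.Map;

    // Hypothetical "core" interface: headers arrive as a plain Java map, so this
    // side of the API has no compile-time dependency on the Kafka client.
    public interface HeaderAwareDeserializationSchema<T> extends Serializable {
        T deserialize(byte[] message, Map<String, byte[]> headers) throws IOException;
    }

    // Connector-side adapter (illustrative): only this class references Kafka types.
    class KafkaHeaderAdapter {
        static Map<String, byte[]> toMap(org.apache.kafka.common.header.Headers headers) {
            Map<String, byte[]> map = new LinkedHashMap<>();
            for (org.apache.kafka.common.header.Header h : headers) {
                // Kafka allows repeated header keys; a flat map keeps only the last value.
                map.put(h.key(), h.value());
            }
            return map;
        }
    }

One trade-off of the map form is visible in the adapter: Kafka permits repeated header keys, so either the map value must become a list or a last-value-wins rule has to be accepted.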

Re: Additional metadata available for Kafka serdes

2024-03-14 Thread David Radley
Hi, I am currently prototyping an Avro Apicurio format that I hope to raise as a FLIP very soon (hopefully by early next week). In my prototyping, I am passing the Kafka header content through as a map to the DeserializationSchema and have extended the SerializationSchema to pass back ...
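
For the serialization direction David mentions, one way to "pass back" header content (again only a hedged sketch with invented names, not the API from the prototype or the eventual FLIP) is to let the format return the payload together with a map of headers for the Kafka sink to attach, e.g. a schema-id header for the registry:

    import java.io.Serializable;
    import java.util.Map;

    // Hypothetical extension of the serialization side: the format hands back
    // both the serialized bytes and any headers it wants written to the record.
    public interface HeaderProducingSerializationSchema<T> extends Serializable {

        Result serialize(T element);

        // Simple value holder for the payload plus the headers to attach.
        final class Result {
            private final byte[] payload;
            private final Map<String, byte[]> headers;

            public Result(byte[] payload, Map<String, byte[]> headers) {
                this.payload = payload;
                this.headers = headers;
            }

            public byte[] getPayload() {
                return payload;
            }

            public Map<String, byte[]> getHeaders() {
                return headers;
            }
        }
    }

The Kafka sink would then copy the returned map into the outgoing record's headers, so the Kafka dependency stays on the connector side, mirroring the deserialization sketch above.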