Hi Jason,

The topic is passed to the *FlinkKafkaConsumer* constructor, followed by
the *KafkaDeserializationSchema* and then the *Properties*.

https://ci.apache.org/projects/flink/flink-docs-release-1.9/api/java/org/apache/flink/streaming/connectors/kafka/FlinkKafkaConsumer.html

new FlinkKafkaConsumer(kafkaTopic, new MessageDeserializer, kafkaProperties)
...
class MessageDeserializer extends KafkaDeserializationSchema[GenericRecord] {
  ...
}
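
In case it helps, here is a rough, untested sketch of what a complete
implementation might look like. Everything concrete in it is a placeholder
(the Avro schema string, topic name and Kafka properties), it assumes the
message value is plain Avro binary, and the GenericRecordAvroTypeInfo used
in getProducedType comes from the flink-avro dependency:

import java.util.Properties

import org.apache.avro.Schema
import org.apache.avro.generic.{GenericDatumReader, GenericRecord}
import org.apache.avro.io.DecoderFactory
import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.formats.avro.typeutils.GenericRecordAvroTypeInfo
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.{FlinkKafkaConsumer, KafkaDeserializationSchema}
import org.apache.kafka.clients.consumer.ConsumerRecord

// Decodes the Kafka record value as Avro binary into a GenericRecord.
// The Avro schema JSON is supplied by the caller (placeholder below).
class MessageDeserializer(schemaJson: String)
    extends KafkaDeserializationSchema[GenericRecord] {

  // Avro Schema and reader are not serializable, so build them lazily on the task side.
  @transient private lazy val schema: Schema = new Schema.Parser().parse(schemaJson)
  @transient private lazy val reader = new GenericDatumReader[GenericRecord](schema)

  // record.key() and record.value() are the two byte arrays in
  // ConsumerRecord[Array[Byte], Array[Byte]]; only the value is decoded here.
  override def deserialize(
      record: ConsumerRecord[Array[Byte], Array[Byte]]): GenericRecord = {
    val decoder = DecoderFactory.get().binaryDecoder(record.value(), null)
    reader.read(null, decoder)
  }

  // Return true only when an element should end the stream; normally false.
  override def isEndOfStream(nextElement: GenericRecord): Boolean = false

  // Tells Flink how to handle GenericRecord downstream (needs flink-avro).
  override def getProducedType: TypeInformation[GenericRecord] =
    new GenericRecordAvroTypeInfo(schema)
}

object KafkaGenericRecordJob {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    val kafkaProperties = new Properties()
    kafkaProperties.setProperty("bootstrap.servers", "localhost:9092") // placeholder
    kafkaProperties.setProperty("group.id", "my-consumer-group")       // placeholder

    val avroSchemaJson = "..." // placeholder: your Avro schema as JSON

    // The topic, the deserialization schema and the properties all go to the
    // FlinkKafkaConsumer constructor; the schema never needs to know the topic.
    val consumer = new FlinkKafkaConsumer[GenericRecord](
      "my-topic", new MessageDeserializer(avroSchemaJson), kafkaProperties)

    env.addSource(consumer).print()
    env.execute("kafka generic record example")
  }
}

On the consumer side the schema only has to turn the key/value bytes into
your type; the topic and properties are supplied to the consumer itself.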


On Thu, Jan 23, 2020 at 1:20 AM Jason Kania <jason.ka...@ymail.com> wrote:

> Hello,
>
> I was looking for documentation in 1.9.1 on how to create implementations
> of the KafkaSerializationSchema and KafkaDeserializationSchema
> interfaces. I have created implementations in the past for the
> SerializationSchema and DeserializationSchema interfaces. Unfortunately, I
> can find no examples, and the code contains no documentation for this
> purpose; some information appears to be missing.
>
> Can someone please answer the following:
>
> 1) When creating a ProducerRecord with the 
> KafkaSerializationSchema.serialize()
> method, how is the topic String supposed to be obtained by the implementing
> class? All of the constructors require that the topic be specified, but the
> topic is not passed in. Is there another interface that should be
> implemented to get the topic or get a callback? Or is it expected that the
> topic has to be fixed in the interface's implementation class? Some of the
> constructors also ask for a partition. Again, where is this information
> expected to come from?
>
> 2) The interfaces specify that ConsumerRecord<byte[], byte[]> is received
> and ProducerRecord<byte[], byte[]> is to be generated. What do the two
> byte arrays in the type definitions refer to?
>
> Thanks,
>
> Jason
>
