I am using Kafka Connect in source mode, i.e. using it to send events to Kafka 
topics.

With the key.converter and value.converter properties set to 
org.apache.kafka.connect.storage.StringConverter, I can attach a consumer to the 
topics and see the events in a readable form.  This is helpful and reassuring, 
but it is not the representation my downstream consumers need - they require 
the events to be Avro-encoded.
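
For reference, this is roughly the converter configuration in my worker 
properties file (the file name and everything around these two lines is 
illustrative only):

    # connect-worker.properties (excerpt)
    key.converter=org.apache.kafka.connect.storage.StringConverter
    value.converter=org.apache.kafka.connect.storage.StringConverter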

It seems that, to write Avro-encoded events to Kafka, these properties need 
to be set to io.confluent.kafka.serializers.KafkaAvroSerializer.  Is this 
correct?
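
Concretely, I mean something like the following in the worker properties (this 
is the setting I'm asking about, not something I've verified works):

    key.converter=io.confluent.kafka.serializers.KafkaAvroSerializer
    value.converter=io.confluent.kafka.serializers.KafkaAvroSerializer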

I am not using the Confluent platform, merely the standard Apache Kafka 0.10 
download, and have been unable to work out how to obtain these classes from a 
Maven repository.  http://docs.confluent.io/3.0.0/app-development.html#java 
suggests that they are available via:

    <dependency>
        <groupId>io.confluent</groupId>
        <artifactId>kafka-avro-serializer</artifactId>
        <version>3.0.0</version>
    </dependency>

But that doesn't appear to be the case.  The class exists in 
https://raw.githubusercontent.com/confluentinc/schema-registry/master/avro-converter/src/main/java/io/confluent/connect/avro/AvroConverter.java, 
but it seems to use the Schema Registry, which is something I'd rather avoid.
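
I did wonder whether the artifact is simply published to Confluent's own 
repository rather than Maven Central; if so, I assume the POM would also need 
something like the following (the repository URL is my assumption, taken from 
the Confluent docs):

    <repositories>
        <repository>
            <id>confluent</id>
            <url>http://packages.confluent.io/maven/</url>
        </repository>
    </repositories>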

I'd be grateful for any pointers on the simplest way of getting Avro-encoded 
events written to Kafka from a Kafka Connect source connector/task.

Also, in the task which creates the SourceRecords, I'm passing 
Schema.BYTES_SCHEMA as the fourth argument to the constructor, but I'm not 
clear what this achieves - some light shed on that would also be helpful.
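
For context, the record construction in my task looks roughly like this (the 
topic name, partition/offset maps and payload are placeholders):

    import java.util.Collections;
    import java.util.Map;
    import org.apache.kafka.connect.data.Schema;
    import org.apache.kafka.connect.source.SourceRecord;

    public class ExampleRecordBuilder {
        // The partition/offset maps, topic name and payload are illustrative only.
        static SourceRecord buildRecord(byte[] payload) {
            Map<String, String> sourcePartition = Collections.singletonMap("source", "my-source");
            Map<String, Long> sourceOffset = Collections.singletonMap("position", 42L);
            // Schema.BYTES_SCHEMA is the fourth constructor argument (the value schema) -
            // this is the argument I'm asking about.
            return new SourceRecord(sourcePartition, sourceOffset, "my-topic",
                    Schema.BYTES_SCHEMA, payload);
        }
    }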

Many thanks,
David
