Hi, I have read the Confluent Kafka Connect HDFS docs <http://docs.confluent.io/2.0.0/connect/connect-hdfs/docs/index.html>, but I don't want to use Confluent's Schema Registry.
I produce Avro-encoded bytes to Kafka using my own Avro serializer rather than KafkaAvroSerializer <https://github.com/confluentinc/schema-registry/blob/master/avro-serializer/src/main/java/io/confluent/kafka/serializers/KafkaAvroSerializer.java>, which seems to be closely tied to Confluent's Schema Registry. Now I want to read those Avro-encoded bytes from Kafka and save them as Parquet on HDFS, using an Avro schema located on the classpath, for instance /META-INF/avro/xxx.avsc. Any ideas on how to write such a Parquet sink? - Kidong Lee.
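For context, here is a minimal sketch of the kind of pipeline I have in mind, assuming parquet-avro's AvroParquetWriter and plain binary-encoded Avro values (no Confluent magic byte / schema-id prefix). The schema, field names, and output path below are illustrative only; in the real sink the schema would be parsed from the classpath resource and the payload would come from the consumed Kafka record:

```java
// Sketch: decode raw Avro bytes with a fixed schema, then write Parquet
// via parquet-avro. All names and paths here are illustrative.
import java.io.ByteArrayOutputStream;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.avro.AvroParquetWriter;
import org.apache.parquet.hadoop.ParquetWriter;

public class AvroToParquetSketch {
    // In the real sink this would be loaded from the classpath, e.g.:
    //   new Schema.Parser().parse(
    //       AvroToParquetSketch.class.getResourceAsStream("/META-INF/avro/xxx.avsc"));
    private static final Schema SCHEMA = new Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"Event\","
        + "\"fields\":[{\"name\":\"id\",\"type\":\"long\"}]}");

    public static void main(String[] args) throws Exception {
        // Simulate a Kafka record value: plain binary-encoded Avro,
        // as produced by a custom serializer (no schema-registry framing).
        GenericRecord rec = new GenericData.Record(SCHEMA);
        rec.put("id", 42L);
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        BinaryEncoder enc = EncoderFactory.get().binaryEncoder(bos, null);
        new GenericDatumWriter<GenericRecord>(SCHEMA).write(rec, enc);
        enc.flush();
        byte[] payload = bos.toByteArray();

        // Sink side: decode the bytes with the classpath schema...
        BinaryDecoder dec = DecoderFactory.get().binaryDecoder(payload, null);
        GenericRecord decoded =
            new GenericDatumReader<GenericRecord>(SCHEMA).read(null, dec);

        // ...and write a Parquet file (a local path here for the sketch;
        // an hdfs:// URI in production).
        Path out = new Path("/tmp/events.parquet");
        try (ParquetWriter<GenericRecord> writer = AvroParquetWriter
                .<GenericRecord>builder(out)
                .withSchema(SCHEMA)
                .build()) {
            writer.write(decoded);
        }
        System.out.println("wrote id=" + decoded.get("id"));
    }
}
```

A real sink would of course also need Kafka consumer plumbing, file rotation, and offset management, which this sketch leaves out.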