xbaran commented on a change in pull request #526: HIVE-21218: KafkaSerDe
doesn't support topics created via Confluent
URL: https://github.com/apache/hive/pull/526#discussion_r254983960
##########
File path:
kafka-handler/src/test/org/apache/hadoop/hive/kafka/AvroBytesConverterTest.java
##########
@@ -0,0 +1,70 @@
+package org.apache.hadoop.hive.kafka;
+
+import com.google.common.collect.Maps;
+import io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient;
+import io.confluent.kafka.serializers.KafkaAvroSerializer;
+import org.apache.avro.Schema;
+import org.apache.hadoop.hive.serde2.avro.AvroGenericRecordWritable;
+import org.junit.Assert;
+import org.junit.BeforeClass;
+import org.junit.Test;
+
+import java.util.Map;
+
+/**
+ * Tests for the Avro bytes converter, covering records serialized with the Confluent {@link KafkaAvroSerializer}.
+ */
+public class AvroBytesConverterTest {
+
+  private static SimpleRecord simpleRecord1 = SimpleRecord.newBuilder().setId("123").setName("test").build();
+  private static byte[] simpleRecord1AsBytes;
+
+  /**
+   * Emulates the Confluent Avro producer, which prepends a header (one magic byte followed by
+   * a 4-byte int schema ID from the Schema Registry) to the value bytes.
+   */
+  @BeforeClass
+  public static void setUp() {
+    Map<String, String> config = Maps.newHashMap();
+    config.put("schema.registry.url", "http://localhost");
+    KafkaAvroSerializer avroSerializer = new KafkaAvroSerializer(new MockSchemaRegistryClient());
+    avroSerializer.configure(config, false);
Review comment:
Yeah, good idea. Right now, as @b-slim said, keys are treated as bytes. But in
this case we would need to add a lot of configuration to the actual Kafka table
properties.
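
For illustration, a minimal sketch of the consumer-side counterpart to the setUp above:
strip the Confluent wire-format header (assuming the usual layout of one magic byte
followed by a 4-byte schema ID) before handing the remaining bytes to an Avro decoder.
The ConfluentHeaderStripper class and stripHeader method are hypothetical names used
only for this sketch, not the actual AvroBytesConverter API in this PR.

import java.nio.ByteBuffer;

// Hypothetical helper, not the PR's AvroBytesConverter: strips the Confluent
// wire-format header (1 magic byte + 4-byte schema ID) from a serialized value.
final class ConfluentHeaderStripper {

  private static final byte MAGIC_BYTE = 0x0;                // Confluent wire-format marker
  private static final int HEADER_SIZE = 1 + Integer.BYTES;  // magic byte + schema id

  // Returns the raw Avro bytes that follow the Confluent header.
  static byte[] stripHeader(byte[] confluentPayload) {
    if (confluentPayload == null || confluentPayload.length < HEADER_SIZE) {
      throw new IllegalArgumentException("Payload too short for a Confluent header");
    }
    ByteBuffer buffer = ByteBuffer.wrap(confluentPayload);
    if (buffer.get() != MAGIC_BYTE) {
      throw new IllegalArgumentException("Unexpected magic byte");
    }
    buffer.getInt();                                          // schema id from the registry, unused here
    byte[] avroBytes = new byte[buffer.remaining()];
    buffer.get(avroBytes);
    return avroBytes;
  }
}

A converter for Confluent-produced topics needs to do roughly this before decoding the
payload and wrapping the result in an AvroGenericRecordWritable.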
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services