glew02 opened a new issue, #1390:
URL: https://github.com/apache/camel-kafka-connector/issues/1390

   Hi All,
   
   When using camel-google-pubsub-source-kafka-connector 1.0.0, I've 
noticed that the payload is base64 encoded. Is this expected, and is there a 
configuration option to get decoded payloads?
   
   Originally I extended camel-google-pubsub-source-kafka-connector 
1.0.0 using `mvn archetype:generate` with the 
camel-kafka-connector-extensible-archetype, to build the Confluent Avro 
Converter 7.1.0 into the connector, because we want to use Confluent Avro and 
the Confluent Schema Registry. This all worked, and I was able to read data (not 
base64 encoded) out of the topic using an Avro console consumer. 
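   For context, here is a minimal sketch of the connector configuration involved. The connector class name is my assumption based on the 1.0.0 naming convention, and the topic name and registry URL are placeholders:
   ```json
   {
     "name": "pubsub-avro-source",
     "config": {
       "connector.class": "org.apache.camel.kafkaconnector.googlepubsubsource.CamelGooglepubsubsourceSourceConnector",
       "topics": "my-topic",
       "value.converter": "io.confluent.connect.avro.AvroConverter",
       "value.converter.schema.registry.url": "http://localhost:8081"
     }
   }
   ```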
   
   The issue is that the schema automatically registered in Schema Registry 
looks like:
   ```
   {"subject":"my-topic-value","version":1,"id":1,"schema":"\"bytes\""}
   ```
   What we would want to see instead is an actual record schema, not just `"bytes"`.
   
   To inspect the data, I then tried the Apache JsonConverter, which produced 
records like:
   ```
   {"schema":{"type":"bytes","optional":false},"payload":"<BASE64_ENCODE_STRING>"}
   ```
   
   So my understanding is that Schema Registry ends up with a schema of 
`"bytes"` because the data it's getting from the connector is base64 
encoded, so it just looks like raw bytes to it. 
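   To illustrate what I mean, here is a small sketch (with a made-up message body) of how a bytes payload gets wrapped by a schema-enabled JsonConverter, and how decoding the `payload` field recovers the original data:
   ```python
   import base64

   # Hypothetical Pub/Sub message body (placeholder content).
   raw = b'{"orderId": 42, "status": "NEW"}'

   # A schema-enabled JSON converter wraps a bytes value in an envelope,
   # base64-encoding the payload:
   envelope = {
       "schema": {"type": "bytes", "optional": False},
       "payload": base64.b64encode(raw).decode("ascii"),
   }

   # Base64-decoding the "payload" field recovers the original message:
   decoded = base64.b64decode(envelope["payload"])
   assert decoded == raw
   ```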
   
   I also tried the Apache ByteArrayConverter, which does yield the raw 
data, but, as expected, doesn't register any schema with Schema Registry.
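   For completeness, that attempt was just a converter override on the same connector config, roughly:
   ```json
   {
     "value.converter": "org.apache.kafka.connect.converters.ByteArrayConverter"
   }
   ```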

