I just checked, and that patch is in the .8 branch. Thanks for working on
backporting it, Andrew. We'd be happy to commit that work to master.
As for the Kafka contrib project vs. Camus, they are similar but not quite
identical. Camus is intended to be a high-throughput ETL for bulk
ingestion of Kafka data into HDFS.

Hi Vadim,
Sorry for the slow response. If your topics share a common format, you should
be able to implement one decoder to handle all of them. On the other hand,
if your Kafka data differs depending on the topic, you might need a
separate decoder for each topic. I don't recall if we added the a
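To make the single-decoder idea concrete, here's a rough sketch of one decoder that branches on the topic name. The Decoder interface below is a simplified stand-in made up for illustration, not the actual Camus MessageDecoder API, and the topic prefix is hypothetical:

import java.nio.charset.StandardCharsets;
import java.util.Properties;

// Simplified stand-in for illustration only; the real Camus decoder API may differ.
interface Decoder<R> {
    void init(Properties props, String topicName);
    R decode(byte[] payload);
}

// One decoder class that handles several topics by branching on the topic name.
public class MultiTopicDecoder implements Decoder<String> {
    private String topicName;

    @Override
    public void init(Properties props, String topicName) {
        this.topicName = topicName;
    }

    @Override
    public String decode(byte[] payload) {
        // Hypothetical rule: "logs." topics share a text format and get trimmed,
        // everything else is decoded as a plain UTF-8 string.
        String text = new String(payload, StandardCharsets.UTF_8);
        return topicName.startsWith("logs.") ? text.trim() : text;
    }
}

A setup like this keeps a single decoder class configured for the job while still letting per-topic handling diverge where it has to.
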
We can easily make a Camus configuration that would mimic the functionality
of the hadoop consumer in contrib. It may require the addition of a
BinaryWritable decoder and a couple of minor code changes. As for the
producer, we don't have anything in Camus that does what it does. But
maybe we should
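
For the BinaryWritable decoder idea, a minimal pass-through sketch could look something like this. Again, the Decoder interface is a simplified stand-in rather than the real Camus API, and I'm using Hadoop's BytesWritable as the output type:

import java.util.Properties;
import org.apache.hadoop.io.BytesWritable;

// Simplified stand-in decoder interface, for illustration only.
interface Decoder<R> {
    void init(Properties props, String topicName);
    R decode(byte[] payload);
}

// Pass-through decoder: hands each Kafka message to Hadoop as raw bytes,
// similar in spirit to what the contrib hadoop consumer emits.
public class BinaryWritableDecoder implements Decoder<BytesWritable> {
    @Override
    public void init(Properties props, String topicName) {
        // No configuration needed for a raw pass-through.
    }

    @Override
    public BytesWritable decode(byte[] payload) {
        return new BytesWritable(payload);
    }
}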