I had fixated on setting the key/value deserializer classes in the consumer
properties.  Passing the deserializers through the overloaded consumer
constructor is the way to enable schema caching:

CachedSchemaRegistryClient cachedSchemaRegistryClient =
    new CachedSchemaRegistryClient("registry_url", 1000);
KafkaAvroDeserializer kafkaAvroDeserializer =
    new KafkaAvroDeserializer(cachedSchemaRegistryClient);
StringDeserializer stringDeserializer = new StringDeserializer();

final KafkaConsumer<String, Object> consumer =
    new KafkaConsumer<>(consumerProps, stringDeserializer, kafkaAvroDeserializer);
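
For completeness, a minimal poll loop against that consumer (a sketch only:
"my-topic" is a placeholder, and I'm assuming the values were written with
KafkaAvroSerializer, so they come back as GenericRecord):

import java.util.Collections;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;

consumer.subscribe(Collections.singletonList("my-topic"));
ConsumerRecords<String, Object> records = consumer.poll(1000);
for (ConsumerRecord<String, Object> record : records) {
    // Each schema is fetched from the registry once per ID and then served
    // from the CachedSchemaRegistryClient's cache on subsequent records.
    GenericRecord value = (GenericRecord) record.value();
    System.out.println(record.key() + " -> " + value);
}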

In Streams, there is a similar overload for addSource:

TopologyBuilder addSource(String name, Deserializer keyDeserializer,
Deserializer valDeserializer, String... topics)
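
For example, wiring the same cached deserializers into a source node might
look like this (a sketch against the old TopologyBuilder API; "avro-source"
and "my-topic" are placeholder names):

import org.apache.kafka.streams.processor.TopologyBuilder;

TopologyBuilder builder = new TopologyBuilder();
// Reusing the deserializer instances built above means every record read by
// this source node shares the same CachedSchemaRegistryClient cache.
builder.addSource("avro-source", stringDeserializer, kafkaAvroDeserializer,
    "my-topic");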

Kris


On Tue, Oct 3, 2017 at 4:34 PM, Svante Karlsson <svante.karls...@csi.se>
wrote:

> I've implemented the same logic for a C++ client - caching is the only way
> to go since the performance impact of not doing it would be too big. So bet
> on caching on all clients.
>
> 2017-10-03 18:12 GMT+02:00 Damian Guy <damian....@gmail.com>:
>
> > If you are using the Confluent schema registry then they will be cached
> > by the SchemaRegistryClient.
> >
> > Thanks,
> > Damian
> >
> > On Tue, 3 Oct 2017 at 09:00 Ted Yu <yuzhih...@gmail.com> wrote:
> >
> > > I did a quick search in the code base - there doesn't seem to be
> > > caching as you described.
> > >
> > > On Tue, Oct 3, 2017 at 6:36 AM, Kristopher Kane <kkane.l...@gmail.com>
> > > wrote:
> > >
> > > > If using a Byte SerDe and schema registry in the consumer configs of
> > > > a Kafka Streams application, does it cache the Avro schemas by ID and
> > > > version after fetching from the registry once?
> > > >
> > > > Thanks,
> > > >
> > > > Kris
> > > >
> > >
> >
>
