Re: Kafka-Real Time Update

2021-12-30 Thread Dave Klein
That really was a helpful overview, Israel. Might make a good blog post! 😀 Ola, using C# means you can't use Kafka Streams, but you may not need it. The Kafka Consumer API, which is available in C#, might be enough for you. For a good explanation of topics, partitions, and pretty m…

Re: uneven distribution of events across kafka topic partitions for small number of unique keys

2021-11-22 Thread Dave Klein
…sser partitions; the issue is about the duplicate hash caused by the default partitioner for two different strings, which might land the two different keys in the same partition.

> On Sun, Nov 21, 2021 at 9:33 PM Dave Klein wrote:
>
> Another possibility, if you c…
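The collision described here can be reproduced with plain JDK hashing. Note that Kafka's default partitioner actually uses murmur2, not `String.hashCode`; this sketch (class name and keys are illustrative) only demonstrates how two distinct keys can hash to the same partition:

```java
// Two distinct keys landing in the same partition, as described above.
// NOTE: Kafka's default partitioner really uses murmur2, not String.hashCode;
// this only illustrates the hash-collision-modulo-partition-count idea.
public class PartitionCollision {

    // Simplified stand-in for the partitioner's key -> partition mapping.
    static int partitionFor(String key, int numPartitions) {
        return Math.abs(key.hashCode()) % numPartitions;
    }

    public static void main(String[] args) {
        // "Aa" and "BB" are a well-known String.hashCode collision (both 2112),
        // so they land in the same partition for any partition count.
        System.out.println(partitionFor("Aa", 4)); // prints 0
        System.out.println(partitionFor("BB", 4)); // prints 0
    }
}
```

Because the collision is in the hash itself, adding partitions never separates such keys; only a different partitioner (or different keys) would.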

Re: uneven distribution of events across kafka topic partitions for small number of unique keys

2021-11-21 Thread Dave Klein
Another possibility, if you can pause processing, is to create a new topic with the higher number of partitions, then consume from the beginning of the old topic and produce to the new one. Then continue processing as normal and all events will be in the correct partitions. Regards, Dave

> On…
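The pause-and-replay suggestion above can be sketched in memory. A real migration would use a `KafkaConsumer` (seeking to the beginning of the old topic) and a `KafkaProducer` against an actual cluster; here, nested lists stand in for topic partitions, and a hash-mod rule stands in for the real (murmur2-based) partitioner. All names are illustrative:

```java
import java.util.ArrayList;
import java.util.List;

// In-memory sketch of the replay approach: drain the old topic from the
// beginning and produce each record into a new topic with more partitions.
// Real code would use KafkaConsumer/KafkaProducer; nested lists stand in
// for topic partitions so the sketch runs without a cluster.
public class RepartitionSketch {

    // Stand-in for keyed partition assignment (Kafka really uses murmur2).
    static int partitionFor(String key, int numPartitions) {
        return Math.abs(key.hashCode()) % numPartitions;
    }

    // Replay every record from the old topic into a larger new topic.
    static List<List<String>> repartition(List<List<String>> oldTopic, int newCount) {
        List<List<String>> newTopic = new ArrayList<>();
        for (int i = 0; i < newCount; i++) newTopic.add(new ArrayList<>());
        for (List<String> partition : oldTopic) {
            for (String key : partition) {
                newTopic.get(partitionFor(key, newCount)).add(key);
            }
        }
        return newTopic;
    }

    public static void main(String[] args) {
        // Build an "old" topic with 2 partitions from some example keys.
        List<String> keys = List.of("order-1", "order-2", "user-9", "user-10");
        List<List<String>> oldTopic = new ArrayList<>();
        for (int i = 0; i < 2; i++) oldTopic.add(new ArrayList<>());
        for (String k : keys) oldTopic.get(partitionFor(k, 2)).add(k);

        // After replaying into 4 partitions, every key sits exactly where
        // the new layout expects it.
        List<List<String>> newTopic = repartition(oldTopic, 4);
        for (int p = 0; p < 4; p++) {
            for (String k : newTopic.get(p)) {
                if (partitionFor(k, 4) != p) throw new AssertionError(k);
            }
        }
        System.out.println("all keys are in their correct new partitions");
    }
}
```

Within each partition, replay preserves per-key ordering, which is why consumers can carry on as normal afterwards.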

Re: Kafka Streams

2021-06-26 Thread Dave Klein
Yes, Kafka Consumer and Kafka Streams are just libraries. My point is that it's not difficult to switch from one to the other as your needs evolve. There are several ways that Kafka Streams aids in processing. It provides a rich set of functions for transforming, filtering, branchin…
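The operation kinds mentioned here (transforming, filtering, branching) can be illustrated with plain `java.util.stream`, which runs without a cluster; the actual Kafka Streams DSL applies the same declarative style (`filter`, `mapValues`, branching) to unbounded streams of records. Event names below are made up:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Illustration of the operation kinds the Kafka Streams DSL provides
// (filter, mapValues, branching), using plain java.util.stream so it runs
// without a Kafka cluster. Kafka Streams applies this same declarative
// style to unbounded record streams.
public class StreamOpsSketch {

    // Analogous to KStream.filter(...).mapValues(...): keep "paid" events
    // and transform their values.
    static List<String> paidUppercased(List<String> events) {
        return events.stream()
                .filter(e -> e.endsWith(":paid"))
                .map(String::toUpperCase)
                .collect(Collectors.toList());
    }

    // Analogous to branching a stream: route events to one of two outputs
    // based on a predicate.
    static Map<Boolean, List<String>> branchByOrder(List<String> events) {
        return events.stream()
                .collect(Collectors.partitioningBy(e -> e.startsWith("order:")));
    }

    public static void main(String[] args) {
        List<String> events = List.of("order:paid", "order:pending", "refund:paid");
        System.out.println(paidUppercased(events));          // [ORDER:PAID, REFUND:PAID]
        System.out.println(branchByOrder(events).get(true)); // [order:paid, order:pending]
    }
}
```

The practical difference is that Kafka Streams also handles consuming, producing, state, and fault tolerance around these operations, which a hand-rolled consumer loop would have to implement itself.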

Re: Kafka Streams

2021-06-25 Thread Dave Klein
If you are not doing any transformation or filtering of the data before writing it to the db, then you're probably better off with a consumer or Kafka Connect. Kafka Streams shines when you have processing to do on the data as it is consumed. Especially if you do any stateful transformations, s…

Re: Advice for Kafka project in Africa...

2020-09-08 Thread Dave Klein
I'm no expert, but I think Kafka, Kafka Connect, and Kafka Streams could be a great fit for your use case. Use Kafka Connect to pull data from the various data sources into a topic or topics, and then use Kafka Streams to do the joins, enrichment, and/or analysis needed to determine a record…
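The Connect half of this pipeline is configured rather than coded. As a hedged illustration only (the connector name, database, and column are hypothetical, and this assumes the separately installed Confluent JDBC source connector), a standalone-worker connector config might look like:

```properties
# Hypothetical Kafka Connect source connector config (Confluent JDBC
# source connector assumed to be installed on the worker).
name=jdbc-orders-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:mysql://db-host:3306/orders
# Poll for new rows by a monotonically increasing id column.
mode=incrementing
incrementing.column.name=id
# Each table becomes a topic prefixed with "db-", e.g. db-orders.
topic.prefix=db-
```

With rows flowing into topics this way, the Streams application can then handle the joins and enrichment downstream.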