One reason I am doing this is that when the consumer is down for a long time and a backlog builds up (millions of messages), it can still process the high-priority messages promptly after a restart. Your approach is fine during normal operation though, thank you.

On 08/24/2018 04:56 PM, Ryanne Dolan wrote:
Instead of using two topics, consider adding a priority field to your
records, and then use a priority queue in the consumers. For example, each
consumer can have two queues, one for high and one for low priority
records, and can process the low priority queue only when the high priority
queue is empty. That would be essentially what you are describing, but with
a single Kafka topic. Just be careful not to commit offsets of any queued
records until after they are processed.
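
A minimal sketch of that two-queue pattern, assuming a single topic named
"events", a "priority" record header set by the producer, a local broker,
and manual offset commits; all names and settings here are placeholders,
not anything from this thread:

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.header.Header;

import java.time.Duration;
import java.util.*;

public class PriorityQueueConsumer {

    // Cap low-priority work per loop iteration so the consumer keeps polling
    // and freshly arrived high-priority records can jump ahead of a backlog.
    private static final int MAX_LOW_PER_POLL = 100;

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "priority-demo");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("enable.auto.commit", "false"); // offsets are committed manually below

        Deque<ConsumerRecord<String, String>> high = new ArrayDeque<>();
        Deque<ConsumerRecord<String, String>> low = new ArrayDeque<>();
        Map<TopicPartition, Long> processedUpTo = new HashMap<>(); // next offset after processed records

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("events"));
            while (true) {
                ConsumerRecords<String, String> batch = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> rec : batch) {
                    (isHighPriority(rec) ? high : low).add(rec);
                }

                // Drain the high-priority queue completely first.
                while (!high.isEmpty()) {
                    ConsumerRecord<String, String> rec = high.poll();
                    process(rec);
                    processedUpTo.merge(partition(rec), rec.offset() + 1, Math::max);
                }
                // Touch low-priority records only now that no high-priority work
                // is left, and only a bounded number of them before polling again.
                for (int i = 0; i < MAX_LOW_PER_POLL && !low.isEmpty(); i++) {
                    ConsumerRecord<String, String> rec = low.poll();
                    process(rec);
                    processedUpTo.merge(partition(rec), rec.offset() + 1, Math::max);
                }

                // Commit only offsets of processed records: cap each partition's
                // commit position at the lowest offset still sitting in the
                // low-priority queue (the high-priority queue is empty here).
                // A restart may re-deliver some already processed records,
                // i.e. at-least-once semantics.
                Map<TopicPartition, Long> bound = new HashMap<>(processedUpTo);
                for (ConsumerRecord<String, String> rec : low) {
                    bound.merge(partition(rec), rec.offset(), Math::min);
                }
                Map<TopicPartition, OffsetAndMetadata> safe = new HashMap<>();
                bound.forEach((tp, offset) -> safe.put(tp, new OffsetAndMetadata(offset)));
                if (!safe.isEmpty()) {
                    consumer.commitSync(safe);
                }
            }
        }
    }

    // Records are assumed to carry a "priority" header set by the producer.
    private static boolean isHighPriority(ConsumerRecord<String, String> rec) {
        Header h = rec.headers().lastHeader("priority");
        return h != null && "high".equals(new String(h.value()));
    }

    private static TopicPartition partition(ConsumerRecord<String, String> rec) {
        return new TopicPartition(rec.topic(), rec.partition());
    }

    private static void process(ConsumerRecord<String, String> rec) {
        System.out.printf("processed key=%s value=%s%n", rec.key(), rec.value());
    }
}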

Ryanne

On Fri, Aug 24, 2018, 3:55 AM Oliver Kindernay <kinder...@dispecer.sk>
wrote:

Hello,

I need to parallel-process messages with per-key ordering guarantees, so
I have a Kafka topic with keyed messages and multiple partitions. The need
has arisen to process some messages with higher priority, so I thought
of adding a second topic to which the higher-priority messages will be
sent.
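
A rough sketch of the producer side of that two-topic idea; the topic names
"events-high" and "events-low", the broker address, and the priority flag
are made up for illustration:

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class PriorityRouter {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            send(producer, "order-42", "cancel requested", true);   // urgent
            send(producer, "order-42", "comment added", false);     // normal
        }
    }

    // The same key is sent to whichever topic matches the message's priority.
    // With the default partitioner and equal partition counts, a given key
    // hashes to the same partition number in both topics.
    private static void send(KafkaProducer<String, String> producer,
                             String key, String value, boolean highPriority) {
        String topic = highPriority ? "events-high" : "events-low";
        producer.send(new ProducerRecord<>(topic, key, value));
    }
}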

The consumer maintains per-key state in memory and also publishes
changes to a compacted topic. Now, if I add the second topic for the
higher-priority messages, is there some way to guarantee that when one
instance of the consumer subscribes to both topics, it will get all
messages for a particular key from both topics? I want to avoid the
situation where messages for a particular key arrive at one consumer
from the low-priority topic and at another consumer from the
high-priority topic.

Hope it's clear, thank you




