Hi everyone

I have designed an integration between two systems through our Kafka streaming API,
and the requirements are not clear enough to properly choose the number of
partitions/topics.

Here is the use case:

My producer will send 28 different types of events, so I have decided to
create 28 topics (one per event type).

The maximum size of a single message will be 4,096 bytes, and the total volume
will be about 2,469.888 MB/day (roughly 2.5 GB/day).

The retention will be 2 days.
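
For reference, my rough back-of-envelope math, assuming the load is spread
evenly across the day and a replication factor of 3 (both are my own
assumptions, not confirmed requirements):

    # Back-of-envelope sizing; even load across the day and RF=3 are my assumptions.
    daily_volume_mb = 2469.888                      # total produced per day, all topics
    avg_throughput_mb_s = daily_volume_mb / 86_400  # ~0.029 MB/s average
    retention_days = 2
    replication_factor = 3                          # assumption on my side
    storage_mb = daily_volume_mb * retention_days * replication_factor  # ~14,819 MB on disk
    print(f"avg throughput: {avg_throughput_mb_s:.3f} MB/s, storage: {storage_mb:.0f} MB")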

By default I'm thinking of one partition per topic, which, according to
Confluent's recommendation, can handle about 10 MB/second.
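
Comparing that guideline to my average rate, a single partition already seems
far above what I need on pure throughput (a quick sanity check with my numbers
from above):

    # Quick check: average ingest rate vs. the ~10 MB/s-per-partition guideline.
    partition_capacity_mb_s = 10.0           # the per-partition figure mentioned above
    avg_throughput_mb_s = 2469.888 / 86_400  # ~0.029 MB/s across all 28 topics combined
    utilisation = avg_throughput_mb_s / partition_capacity_mb_s
    print(f"utilisation of one partition: {utilisation:.1%}")  # ~0.3%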

However, the requirement on the consumer side is minimum latency (sub 3
seconds), so I'm thinking of creating more leader partitions per topic to
parallelize consumption and achieve the required throughput.
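
The closest thing to a formula I have found is Confluent's rule of thumb of
sizing a topic as max(t/p, t/c), where t is the target throughput, p is the
measured producer throughput into a single partition, and c is the measured
consumer throughput out of a single partition. A small sketch with hypothetical
numbers (the per-partition rates are placeholders I would still have to measure
in our environment):

    import math

    def partitions_needed(target_mb_s, producer_mb_s_per_partition, consumer_mb_s_per_partition):
        # Rule of thumb: partitions = max(t/p, t/c), rounded up.
        return max(
            math.ceil(target_mb_s / producer_mb_s_per_partition),
            math.ceil(target_mb_s / consumer_mb_s_per_partition),
        )

    # Hypothetical example: 5 MB/s target, 10 MB/s per producer, 2 MB/s per consumer -> 3 partitions.
    print(partitions_needed(5.0, 10.0, 2.0))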

Do you know what the best practice or formula is to define this properly?

Thanks
