A Kafka broker never pushes data to a consumer. It is the consumer that
issues a (long-polling) fetch request, and the consumer itself supplies the
offset it wants to read from.

The problem lies in how your consumer handles the, say, 1000 messages it
just fetched. If you process 500 of them and crash without committing the
offsets somewhere (either to Kafka or to some other system), then when you
restart you will begin fetching again from the last committed offset, and
those 500 messages are delivered a second time. Kafka has no notion of an
already consumed message.
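So the usual way to keep the duplicate window small is to disable auto
commit and commit offsets yourself only after a batch has been fully
processed. A minimal sketch with the Java consumer API (the broker address,
topic, and group id below are placeholders):

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class AtLeastOnceConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("group.id", "example-group");           // hypothetical group id
        props.put("enable.auto.commit", "false");         // we commit offsets ourselves
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("example-topic"));
            while (true) {
                // The consumer pulls; the broker never pushes.
                ConsumerRecords<String, String> records =
                        consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // If we crash here, everything after the last committed
                    // offset is fetched and processed again on restart.
                    process(record);
                }
                // Commit only after the whole batch is processed: at-least-once.
                consumer.commitSync();
            }
        }
    }

    private static void process(ConsumerRecord<String, String> record) {
        System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
    }
}

This gives you at-least-once delivery; to eliminate duplicates on the
consuming side you would make processing idempotent, e.g. keyed on the
record offset.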



2015-01-23 7:54 GMT+01:00 Tousif <tousif.pa...@gmail.com>:

> Hi,
>
> I want to know in which situations Kafka sends the same event multiple
> times to a consumer. Is there a consumer-side configuration to tell Kafka
> to send an event only once and stop retries?
>
> --
>
>
> Regards
> Tousif Khazi
>
