Vineet, how are you counting the number of events in Kafka? Have you
checked the Storm worker logs for any errors? And when you say "the
acknowledgement of 190 million events in Storm", are you looking at the
number of acked messages? -Harsha
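
For a quick cross-check on the Kafka side, one option is to diff the
earliest and latest offsets of every partition of the topic. Below is a
minimal Java sketch using a recent kafka-clients consumer API (the broker
address and topic name are placeholders; on the 0.8.x line the
kafka.tools.GetOffsetShell tool reports the same offsets). Note that this
counts what the brokers currently retain, so it matches the published
total only if log retention has not yet deleted any segments.

import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.PartitionInfo;
import org.apache.kafka.common.TopicPartition;

public class TopicMessageCount {
    public static void main(String[] args) {
        String topic = "logstash-events";                  // placeholder topic name
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // placeholder broker
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.ByteArrayDeserializer");

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            // Collect all partitions of the topic.
            List<TopicPartition> partitions = new ArrayList<>();
            for (PartitionInfo p : consumer.partitionsFor(topic)) {
                partitions.add(new TopicPartition(topic, p.partition()));
            }
            // Earliest and latest offsets per partition; the difference is the
            // number of messages currently retained in that partition.
            Map<TopicPartition, Long> earliest = consumer.beginningOffsets(partitions);
            Map<TopicPartition, Long> latest = consumer.endOffsets(partitions);

            long total = 0;
            for (TopicPartition tp : partitions) {
                total += latest.get(tp) - earliest.get(tp);
            }
            System.out.println("Messages retained in " + topic + ": " + total);
        }
    }
}

Comparing that total against Logstash's output count would tell you
whether the 10 million events ever reached Kafka, before digging into the
Storm side.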


On Sun, Feb 15, 2015, at 04:40 AM, Vineet Mishra wrote:
> Hi All,
>
> I have a Kafka Storm topology that ingests events published to Kafka
> and processes that data.
>
> Apart from some latency, everything had been going well, but recently
> I ran into an issue that I haven't been able to solve yet.
>
> I am publishing events from Logstash to Kafka, and a Storm topology
> subscribes to them for further processing. I can see that the source
> record count and the number of events processed by Storm differ by a
> sizeable margin: of the roughly 200 million events to be processed,
> about 10 million are getting lost, since I only see acknowledgements
> for 190 million events in Storm.
>
> I am stuck on this issue and looking for expert advice.
>
> Thanks!
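
One common source of a gap between the published count and the acked
count is tuples that are never explicitly acked, or that fail and time
out (topology.message.timeout.secs): in that case the Kafka spout either
replays them or reports them as failed rather than acked. As a minimal
sketch (class and field names are illustrative, and it assumes the
pre-1.0 backtype.storm packages), a bolt that anchors its output to the
input tuple and explicitly acks or fails it would look like this:

import java.util.Map;

import backtype.storm.task.OutputCollector;
import backtype.storm.task.TopologyContext;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.base.BaseRichBolt;
import backtype.storm.tuple.Fields;
import backtype.storm.tuple.Tuple;
import backtype.storm.tuple.Values;

public class EventCountingBolt extends BaseRichBolt {
    private OutputCollector collector;

    @Override
    public void prepare(Map conf, TopologyContext context, OutputCollector collector) {
        this.collector = collector;
    }

    @Override
    public void execute(Tuple input) {
        try {
            String event = input.getString(0);
            // Anchor the emitted tuple to the input so a downstream failure
            // is reported back to the spout instead of vanishing silently.
            collector.emit(input, new Values(event));
            collector.ack(input);
        } catch (Exception e) {
            // Failing (rather than swallowing the error) lets the Kafka spout
            // replay the offset instead of losing the event.
            collector.fail(input);
        }
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        declarer.declare(new Fields("event"));
    }
}

If the Storm UI shows a non-zero failed count for the spout or the bolts,
the missing 10 million events may be failing or timing out rather than
being dropped outright.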
