If you can't see the image, I uploaded it to dropbox
https://www.dropbox.com/s/gckn4gt7gv26l9w/graph.png
From: Guy Doulberg [mailto:guy.doulb...@perion.com]
Sent: Monday, August 11, 2014 4:58 PM
To: users@kafka.apache.org
Subject: RE: Consume more than produce
Hey
I had an issue i
Sent: Monday, August 04, 2014 2:12 PM
To: users@kafka.apache.org
Subject: RE: Consume more than produce
Hi Daniel
I count once when producing and once when consuming. The timestamp is
calculated once, before producing, and attached to the message, so the
consumer uses the same timestamp to count.
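The counting scheme Guy describes can be sketched roughly like this (the helper names and JSON framing are made up for illustration, not Guy's actual code): the producer stamps each message once, both sides bucket their counts by that embedded timestamp, and the two sets of buckets are compared.

```python
import json
import time
from collections import Counter

produced = Counter()  # minute bucket -> count, kept on the producer side
consumed = Counter()  # minute bucket -> count, kept on the consumer side

def produce(payload):
    """Stamp the message once; the same timestamp travels with it."""
    ts = int(time.time())
    produced[ts // 60] += 1          # count against the embedded timestamp
    return json.dumps({"ts": ts, "payload": payload})  # in reality: send to Kafka

def consume(msg):
    """Count against the timestamp carried in the message, not arrival time."""
    ts = json.loads(msg)["ts"]
    consumed[ts // 60] += 1

for i in range(5):
    consume(produce(i))

# Reconciliation: the per-minute diff is zero only if nothing was lost or duplicated.
diff = {m: consumed[m] - produced[m] for m in produced | consumed}
print(diff)
```

Because both sides key on the producer-assigned timestamp, a message delivered twice would bump the same consumer bucket twice, which is exactly the kind of diff being discussed.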
Sent: Monday, August 04, 2014 12:35 PM
To: users@kafka.apache.org
Subject: Re: Consume more than produce
Hi Guy
In your reconciliation, where was the time stamp coming from? Is it possible
that messages were delivered several times but your calculations only counted
each unique event?
Daniel
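Daniel's hypothesis can be shown with a toy sketch (the event IDs are invented): a counter that increments per delivery and a reconciliation that keys on unique event IDs will disagree whenever a message is redelivered.

```python
from collections import Counter

# Toy delivery stream: event "e2" is redelivered once.
deliveries = ["e1", "e2", "e2", "e3"]

per_event = Counter(deliveries)
total_deliveries = sum(per_event.values())  # what a naive per-delivery counter sees
unique_events = len(per_event)              # what an id-based reconciliation sees

print(total_deliveries, unique_events)
```

If the produce-side count happened to be the unique figure and the consume-side count the per-delivery figure, redelivery alone would make consumption look larger than production.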
ducer ACK value?
>
> In my code I don't have a retry mechanism; does the Kafka producer API have
> a retry mechanism?
>
>
> -----Original Message-----
> From: Guozhang Wang [mailto:wangg...@gmail.com]
> Sent: Friday, August 01, 2014 6:08 PM
> To: users@kafka.apache.org
>
Subject: Re: Consume more than produce
What is the ack value used in the producer?
On Fri, Aug 1, 2014 at 1:28 AM, Guy Doulberg
wrote:
> Hey,
>
>
> After a year or so I have Kafka as my streaming layer in my
> production, I decided it is time to audit, and to test how many events
> do I lose, if I lose events at all.
Do you have producer retries (due to broker failure) in those minutes when
you see a diff?
Thanks,
Jun
On Fri, Aug 1, 2014 at 1:28 AM, Guy Doulberg
wrote:
> Hey,
>
>
> After a year or so I have Kafka as my streaming layer in my production, I
> decided it is time to audit, and to test how many events do I lose, if I
> lose events at all.
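Jun's question points at the usual cause of a consumer counting more than the producer: with retries enabled, a produce request whose acknowledgement is lost is re-sent, and the broker can end up with two copies of a message the application counted once. A toy simulation of that effect (no real Kafka or broker involved; the ack-loss probability is arbitrary):

```python
import random

random.seed(1)

log = []       # stands in for a Kafka partition log
app_count = 0  # events the application believes it produced

def send_with_retries(msg, retries=3):
    """The broker may append the message and still fail to ack in time.
    The producer then retries, duplicating the append."""
    for _ in range(retries + 1):
        log.append(msg)                 # broker side: the append itself succeeded
        ack_lost = random.random() < 0.5
        if not ack_lost:
            return                      # producer saw the ack, stops retrying

for i in range(100):
    send_with_retries(i)
    app_count += 1                      # counted once, regardless of retries

print(app_count, len(log))              # len(log) >= app_count
```

Every event still appears in the log at least once; the retries only add duplicates, so the consume-side count can exceed the produce-side count without any event being lost.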
What is the ack value used in the producer?
On Fri, Aug 1, 2014 at 1:28 AM, Guy Doulberg
wrote:
> Hey,
>
>
> After a year or so I have Kafka as my streaming layer in my production, I
> decided it is time to audit, and to test how many events do I lose, if I
> lose events at all.
>
>
> I discovered something interesting which I can't explain.
You have to remember statsd uses UDP and is possibly lossy, which might
account for the errors.
-Steve
On Fri, Aug 1, 2014 at 1:28 AM, Guy Doulberg
wrote:
> Hey,
>
>
> After a year or so I have Kafka as my streaming layer in my production, I
> decided it is time to audit, and to test how many events do I lose, if I
> lose events at all.
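Steve's point in a minimal sketch: a statsd counter increment is a single UDP datagram, fire-and-forget, so an increment can silently vanish in transit and skew either side of the audit. The metric name below is made up; the `name:1|c` counter format is the standard statsd line protocol.

```python
import socket

def statsd_incr(name, host="127.0.0.1", port=8125):
    """Send a statsd counter increment over UDP. No ack, no retry:
    if the datagram is dropped, the increment is simply lost."""
    payload = f"{name}:1|c".encode()
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(payload, (host, port))  # fire-and-forget
    sock.close()
    return payload

print(statsd_incr("kafka.audit.produced"))
```

Note that loss here would make the statsd-reported counts undercount on whichever side drops more datagrams, which is another way the two totals can diverge without Kafka itself losing or duplicating anything.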
Hey,
After a year or so I have Kafka as my streaming layer in my production, I
decided it is time to audit, and to test how many events do I lose, if I lose
events at all.
I discovered something interesting which I can't explain.
The producer produces fewer events than the consumer group consumes.