Hello Kafka experts
The consumer team is reporting an issue while consuming data from the
topic, which they describe as a "Singularity Header" issue.
Can someone please advise on how to resolve this?
The error output looks like this:
Starting offset: 1226716
offset: 1226716 position: 0 CreateTime: 1583780622665 isvalid: true
key
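Without knowing exactly what produces the "Singularity" header, one quick diagnostic is to dump the record headers with a plain byte-array consumer and check whether the header the consuming application expects is actually present and well-formed. This is only a rough sketch; the broker address, group id, and topic name are placeholders:

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.header.Header;

public class HeaderDump {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder: your cluster
        props.put("group.id", "header-debug");               // placeholder group id
        props.put("key.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("your-topic"));        // placeholder topic name
            ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<byte[], byte[]> record : records) {
                System.out.printf("offset=%d%n", record.offset());
                // Print every header key/value so missing or malformed headers stand out.
                for (Header h : record.headers()) {
                    System.out.printf("  header %s = %s%n", h.key(), new String(h.value()));
                }
            }
        }
    }
}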
Thanks, David. That is very useful information. Our data does arrive in
1-minute batches - I'll double-check the history of the metric over time.
If that is the case, I would expect to see it fluctuate when I poll the
metric at a sub-minute interval.
Many thanks,
Marcus
Hi Marcus,
For fetch requests, if the remote time is high, it could be that there is
not enough data to give in a fetch response. This can happen when the
consumer or replica is fully caught up and there is no new incoming data. If
this is the case, remote time will be close to the max wait time the fetcher
asked for in its request (fetch.max.wait.ms for consumers,
replica.fetch.wait.max.ms for follower replicas).
I've recently implemented further monitoring of our Kafka cluster to home
in on where I think we have bottlenecks.
I'm interested in one metric in particular:
*kafka.network:type=RequestMetrics,name=RemoteTimeMs,request={Produce|FetchConsumer|FetchFollower}*
All the docs I've seen accompanying the metric describe it only briefly, so
I'd appreciate any detail on what "remote time" actually measures for each
request type.
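One way to sample this metric without a full monitoring stack is a small JMX client. This is only a sketch, assuming the broker exposes JMX on port 9999; it queries the object name quoted above using the standard Kafka/Yammer histogram attributes (Mean, 99thPercentile):

import javax.management.MBeanServerConnection;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;

public class RemoteTimeProbe {
    public static void main(String[] args) throws Exception {
        // Assumes the broker was started with JMX enabled on port 9999 (adjust to your setup).
        JMXServiceURL url =
            new JMXServiceURL("service:jmx:rmi:///jndi/rmi://broker-host:9999/jmxrmi");
        try (JMXConnector connector = JMXConnectorFactory.connect(url)) {
            MBeanServerConnection mbs = connector.getMBeanServerConnection();
            ObjectName fetchConsumer = new ObjectName(
                "kafka.network:type=RequestMetrics,name=RemoteTimeMs,request=FetchConsumer");
            // RemoteTimeMs is a histogram; Mean and 99thPercentile are two of its JMX attributes.
            System.out.println("mean remote time (ms): " + mbs.getAttribute(fetchConsumer, "Mean"));
            System.out.println("p99 remote time (ms):  " + mbs.getAttribute(fetchConsumer, "99thPercentile"));
        }
    }
}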
I am interested in learning/deducing the maximum consumption rate of a Kafka
consumer in my consumer group. By maximum consumption rate I mean the highest
message arrival rate the consumer can still keep up with; beyond that rate the
consumer falls farther and farther behind and the message lag grows without
bound.
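One rough way to measure that ceiling empirically is to read from a pre-filled topic and count how many records a single consumer can drain per second when nothing throttles it. This is only a sketch; the broker address, group id, and topic name are placeholders:

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ConsumptionRateProbe {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder
        props.put("group.id", "rate-probe");                 // placeholder
        props.put("auto.offset.reset", "earliest");          // start from the beginning of the backlog
        props.put("key.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("your-topic"));        // placeholder topic
            long count = 0;
            long start = System.nanoTime();
            // Drain the backlog as fast as poll() allows for roughly 30 seconds.
            while (System.nanoTime() - start < 30_000_000_000L) {
                ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofMillis(100));
                count += records.count();
            }
            double seconds = (System.nanoTime() - start) / 1_000_000_000.0;
            System.out.printf("consumed %d records in %.1f s (%.0f records/s)%n",
                count, seconds, count / seconds);
        }
    }
}

The consumer also exposes a records-consumed-rate metric (via KafkaConsumer#metrics() or JMX), which gives a similar number without hand-rolled timing, and comparing that rate against the topic's production rate tells you whether lag will keep growing.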