Hi guys
I have a Kafka topic, Films, containing 2000 records.
In a Spring Boot KafkaListener I am listening to that topic,
but I need to process each record one at a time; only after that
should I consume the next record.
How can I handle this scenario? Any ideas?
Thanks, John.
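
(For what it's worth, a minimal sketch of one way to do this in Spring Kafka. With a single consumer thread the listener method is invoked strictly one record at a time, and with manual acknowledgment the offset is committed only after processing finishes. The topic name is taken from the question; the class, method, and group id are hypothetical, and this assumes spring.kafka.listener.ack-mode=MANUAL and spring.kafka.consumer.max-poll-records=1 in application.properties.)

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.support.Acknowledgment;
import org.springframework.stereotype.Component;

@Component
public class FilmListener {

    // concurrency = "1" keeps a single consumer thread, so this method
    // receives records strictly one at a time; the next record is not
    // delivered until this invocation returns.
    @KafkaListener(topics = "Films", groupId = "film-group", concurrency = "1")
    public void onFilm(String film, Acknowledgment ack) {
        process(film);      // per-record business logic
        ack.acknowledge();  // commit only after processing succeeds
    }

    private void process(String film) {
        // ... handle one record ...
    }
}

(Note that even without max.poll.records=1, the listener container hands records to the method sequentially per consumer thread; that setting just limits how many records each poll fetches.)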
Actually, this is a normal consumer-producer application wherein there are
2 consumers (admin consumer and main consumer) consuming messages from 2
different topics.
One of the consumers consumes messages from an admin topic and populates
data in a cache, e.g. let's say agent with agent
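
(A minimal sketch of what that admin-consumer-fills-a-cache pattern might look like; the message above is cut off, so the topic name, the cache type, and keying the cache by the record key are all my assumptions.)

import java.time.Duration;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.concurrent.ConcurrentHashMap;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class AdminCacheConsumer {
    // Shared cache that the main consumer can read while this one populates it.
    private final Map<String, String> agentCache = new ConcurrentHashMap<>();

    public void run(Properties props) {
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("admin-topic")); // hypothetical topic name
            while (true) {
                for (ConsumerRecord<String, String> rec :
                        consumer.poll(Duration.ofMillis(500))) {
                    // Key the cache by the record key (e.g. an agent id).
                    agentCache.put(rec.key(), rec.value());
                }
            }
        }
    }
}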
Hi
I have written a Python consumer using the confluent-kafka package. After a
few hours of running, the consumer dies with the error below:
cimpl.KafkaException:
KafkaError{code=_TIMED_OUT,val=-185,str="FindCoordinator response
error: Local: Timed out"}
Can anyone please help me understand why?
Hi Pushkar,
I’ve been wondering if we should add writable tables to the Streams API. Can
you explain more about your use case and how it would integrate with your
application?
Incidentally, this would also help us provide more concrete advice.
Thanks!
John
Hi,
It's the client that closes the connection; you can see some
disconnectExceptions in the client log. However, it would be nice if there
were a log line like "Closing connection due to request timeout" somewhere.
That would save me some time parsing Wireshark logs, though when you look at that
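
(For reference, a minimal sketch of the two client-side settings involved here, using the Java consumer; the values shown are the client defaults and are only for illustration.)

import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;

public class ClientTimeouts {
    static Properties timeoutProps() {
        Properties props = new Properties();
        // After this long without a broker response, the client fails the
        // in-flight request and closes the connection; that disconnect is
        // what surfaces as disconnectExceptions in the client log.
        props.put(ConsumerConfig.REQUEST_TIMEOUT_MS_CONFIG, "30000");
        // Idle connections are also closed by the client after this interval.
        props.put(ConsumerConfig.CONNECTIONS_MAX_IDLE_MS_CONFIG, "540000");
        return props;
    }
}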