Hi All,
Any suggestions on how I can achieve this?
Thanks
On Fri, Apr 3, 2020 at 12:49 AM Navneeth Krishnan
wrote:
> Hi Boyang,
>
> Basically I don’t want to load all the states upfront. For the local KV store,
> when the very first message arrives I make an HTTP request to an
> external se
Hi,
I want to change the Kafka log level to DEBUG.
I changed the option log4j.rootLogger=DEBUG, stdout, kafkaAppender in the
log4j.properties file,
but in logs/server.log I still see no DEBUG messages. Why?
Thank you.
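(For anyone hitting the same issue: the root logger level alone is usually not enough, because the stock log4j.properties shipped with the broker also defines more specific loggers that override it and default to INFO. A minimal sketch, assuming the default broker config, of the per-logger levels that would also need raising:)

```properties
# Sketch assuming the stock config/log4j.properties shipped with Kafka.
# The root level is overridden by these more specific loggers, which
# default to INFO and must be raised as well for DEBUG to appear:
log4j.rootLogger=DEBUG, stdout, kafkaAppender
log4j.logger.kafka=DEBUG
log4j.logger.org.apache.kafka=DEBUG
```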
Never mind, I found the answer. I had an unexpected cron job firing every 5
minutes and blasting the cluster with connections from 2k+ additional servers.
> On Apr 8, 2020, at 10:46, Jacek Szewczyk wrote:
>
> Hi All,
>
> I am seeing strange behavior for Kafka 2.0.0.3.1.4. My cluster contains 9
> broker
Hi Jacob,
The Kafka code base is huge and the documentation is very broad, so it is
likely that you will notice discrepancies between the actual implementation
of a specific Kafka version (or of ecosystem components) and the reference
documentation.
If you notice such issues
Hello Apache Kafka team,
Comparing the 2.4.1 code of KafkaProducer with the documentation, I noticed
the following difference:
the "send(record, callback)" method internally catches ApiExceptions and
sets them into the returned Future object.
The callback object then handles these exceptions afterwards.
But
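(For illustration, the behavior described above — an error stored in the returned Future and also handed to the callback, rather than thrown to the caller — can be sketched without Kafka at all. The send() below is a hypothetical stand-in built on CompletableFuture, not Kafka's implementation:)

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.function.BiConsumer;

public class SendSketch {
    // Hypothetical stand-in for KafkaProducer.send(record, callback):
    // an API error is NOT thrown to the caller; it is recorded in the
    // returned Future and also passed to the callback.
    static CompletableFuture<String> send(String record,
                                          BiConsumer<String, Exception> callback) {
        Exception apiError = new IllegalStateException("simulated ApiException");
        callback.accept(null, apiError);        // the callback sees the error
        CompletableFuture<String> future = new CompletableFuture<>();
        future.completeExceptionally(apiError); // future.get() rethrows it
        return future;
    }

    public static void main(String[] args) throws InterruptedException {
        CompletableFuture<String> f = send("payload", (metadata, e) ->
                System.out.println("callback got: " + e.getMessage()));
        try {
            f.get();
        } catch (ExecutionException e) {
            System.out.println("future rethrew: " + e.getCause().getMessage());
        }
    }
}
```

So a caller that never inspects the Future (or never registers a callback) can silently miss the error, which is why checking at least one of the two matters.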
Hi Alex,
Your theory sounds plausible. After a rebalance, Streams needs to
restore its stores from the changelog topics. Currently, Streams performs this
restore operation in the same loop that does processing and polls the consumer
for more records. If the restore batches (or the pr
Hi all, I’ve got a Kafka Streams application running in a Kubernetes
environment. The topology on this application has 2 aggregations (and
therefore 2 KTables), both of which can get fairly large – the first is
around 200GB and the second around 500GB. As with any K8s platform, pods
can occasiona
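(One commonly used mitigation for stores this large — sketched here as an assumption about the deployment, not a confirmed fix: keep standby replicas warm and persist the local state directory across pod restarts, so a bounced pod does not have to replay hundreds of GB of changelog. The path below is illustrative:)

```properties
# Kafka Streams config sketch; replica count and path are illustrative.
num.standby.replicas=1
# Point state.dir at a persistent volume so restarts reuse existing state:
state.dir=/var/lib/kafka-streams/state
```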
Hi,
In fact an import was missing. It works now:
import org.apache.kafka.streams.Topology
import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.Serdes._
On Thu, 9 Apr 2020 at 11:49, Nicolae Marasoiu <
nicolae.maras...@ovoenergy.com> wrote:
> Hi,
> We'
Hi,
We're using Kafka Streams to map one topic into another (copying the data
while exchanging formats). Both have Avro values.
I will start with the compile error, and then progress with the code
samples:
could not find implicit value for parameter consumed:
org.apache.kafka.streams.scala.kstream.Consume
**BUMP**
On Tue, Apr 7, 2020 at 10:54 PM nitin agarwal
wrote:
> Hi,
>
> I have a use case where new connectors will keep on adding to existing
> running Kafka Connect cluster. Is there any way in Kafka Connect to submit
> the new connector jar dynamically without restarting the Connect process?
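(Worth separating two things here: a new connector *configuration* can be submitted to a running cluster through the Connect REST API without a restart, but new plugin *jars* on plugin.path are only discovered when a worker starts. A minimal illustrative connector config, using the FileStreamSource connector that ships with Connect and hypothetical names:)

```json
{
  "name": "demo-file-source",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
    "topic": "demo-topic",
    "file": "/tmp/demo.txt"
  }
}
```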