Hi,
Are there any standard benchmarking results available for Kafka Streams? I am
specifically looking for results on stateful operations and the cost of
serialization/deserialization, which could be a limiting factor if the state
per key is large and there are frequent updates. What is the typical si
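To make the scenario concrete, the kind of stateful operation I mean looks
roughly like the sketch below (topic name and serdes are placeholders): every
update reads the current per-key aggregate from the state store (deserialize)
and writes the updated value back to the store and changelog (serialize), so
serde cost grows with the size of the state per key.

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.Materialized;

    public class StateSerdeCostSketch {
        public static void main(String[] args) {
            StreamsBuilder builder = new StreamsBuilder();
            // Hypothetical topic "events": each record triggers a read of the
            // current aggregate for its key (deserialize), an update, and a
            // write back to the store and changelog (serialize).
            builder.stream("events", Consumed.with(Serdes.String(), Serdes.String()))
                   .groupByKey()
                   .aggregate(
                           () -> "",
                           (key, value, aggregate) -> aggregate + value,
                           Materialized.with(Serdes.String(), Serdes.String()));
        }
    }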
Is this Kafka instance for testing purposes?
Although Kafka is cross-platform, it still runs best under Linux, where the OS
buffers and page cache can be used efficiently.
If you can, try running it under WSL2. See this tech blog:
https://www.confluent.io/blog/set-up-and-run-kafka-on-windows-linux-wsl-2/
Lastly, if you
---------- Forwarded message ---------
From: hello
Date: Wed, Jun 9, 2021 at 2:50 PM
Subject: Kafka broker log swap writing error
To: users-sc.1623217635.lobeokoidbfekkpndnbk-leepeter2019=gmail@kafka.apache.org
Whenever I run the Kafka server on Windows 10, it fails to rename
controller.log to con
Sorry, Ivan and Garmes,
I misunderstood the suggestion earlier. I think this will be a great idea
for a KIP.
https://cwiki.apache.org/confluence/display/KAFKA/Kafka+Improvement+Proposals
You were referring to metadata for the actual topic and not its contents.
Sorry about that confusion.
On M
Hi,
Having metadata for topics seems pretty useful. Currently, one has to use
external storage for this (e.g. a database), and the question of keeping
the topic and its metadata in sync arises: when a topic is deleted, how do we
delete its metadata? How do we deal with delete-then-recreate scenarios (well,
we have to
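For illustration, the reconciliation that external storage forces on us today
looks roughly like the sketch below (the metadata store and all names are
hypothetical): periodically list the topics that still exist and drop metadata
for anything that has disappeared; note that listing names alone cannot even
distinguish a delete-then-recreate.

    import java.util.Properties;
    import java.util.Set;
    import java.util.concurrent.ConcurrentHashMap;
    import org.apache.kafka.clients.admin.Admin;
    import org.apache.kafka.clients.admin.AdminClientConfig;

    public class TopicMetadataReconciler {

        // Hypothetical stand-in for the external metadata store (e.g. a database table).
        private final ConcurrentHashMap<String, String> externalMetadata = new ConcurrentHashMap<>();

        public void reconcile(String bootstrapServers) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
            try (Admin admin = Admin.create(props)) {
                // Ask the cluster which topics still exist...
                Set<String> liveTopics = admin.listTopics().names().get();
                // ...and drop metadata rows for topics that were deleted.
                // This cannot tell a delete-then-recreate apart from "still there".
                externalMetadata.keySet().removeIf(topic -> !liveTopics.contains(topic));
            }
        }
    }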
Garmes,
I had similar questions in the past, but @Matthias J. Sax pointed me to this:
https://cwiki.apache.org/confluence/display/KAFKA/KIP-244%3A+Add+Record+Header+support+to+Kafka+Streams+Processor+API
With the headers, you can filter based on the header content and not just
the contents of the record.
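To make that concrete, here is a minimal sketch (the header name and value are
made up) of a Processor that uses the headers accessor added by KIP-244 to
decide whether to forward a record:

    import java.nio.charset.StandardCharsets;
    import org.apache.kafka.common.header.Header;
    import org.apache.kafka.streams.processor.AbstractProcessor;

    public class HeaderFilterProcessor extends AbstractProcessor<String, String> {

        @Override
        public void process(String key, String value) {
            // context().headers() is the accessor KIP-244 added to the Processor API.
            Header header = context().headers().lastHeader("event-type");
            // Forward only records whose hypothetical "event-type" header is "order".
            if (header != null
                    && "order".equals(new String(header.value(), StandardCharsets.UTF_8))) {
                context().forward(key, value);
            }
        }
    }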
Andrew, an offset implies a partition -- an offset is only meaningful
within the context of a particular partition -- so if you are able to log
offsets, you should also be able to log the corresponding partitions.
For example, the RecordMetadata object, which provides the offset of a
written record, also carries its partition.
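As a quick illustration (broker address and topic name are placeholders), a
send callback that logs both coordinates of each written record:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class LogOffsetAndPartition {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("my-topic", "key", "value"),
                        (metadata, exception) -> {
                            if (exception == null) {
                                // RecordMetadata carries both pieces needed to find the
                                // record again later: the partition and the offset.
                                System.out.printf("written to %s-%d at offset %d%n",
                                        metadata.topic(), metadata.partition(), metadata.offset());
                            }
                        });
            }
        }
    }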
Hello all,
I am looking to add a "resend" option to my program where a user can specify an
older message they would like to produce through Kafka again, for whatever
reason. I can get the topic and offset for each message from my logs after
consuming a message, but I do not see a way to get whi
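Concretely, the resend flow I have in mind looks roughly like the sketch below
(topic, partition, and offset are placeholder values): seek back to the logged
coordinates, fetch that one record, and hand it to a producer again.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.TopicPartition;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class ResendSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "resend-tool");
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

            // Hypothetical coordinates taken from the application log.
            TopicPartition tp = new TopicPartition("my-topic", 2);
            long offset = 12345L;

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                // Assign (rather than subscribe) so we can seek to an exact offset.
                consumer.assign(Collections.singletonList(tp));
                consumer.seek(tp, offset);
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(5))) {
                    if (record.offset() == offset) {
                        // Hand record.key()/record.value() to a producer here to resend it.
                        System.out.printf("found %s-%d@%d%n",
                                record.topic(), record.partition(), record.offset());
                    }
                }
            }
        }
    }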