This is an automated email from the ASF dual-hosted git repository.

schofielaj pushed a commit to branch trunk
in repository https://gitbox.apache.org/repos/asf/kafka.git


The following commit(s) were added to refs/heads/trunk by this push:
     new 0ddc69da709 KAFKA-19721: Update streams documentation with KIP-1147 changes (#20606)
0ddc69da709 is described below

commit 0ddc69da7094fcc4163b6beb1056fc46a8e15a68
Author: Jhen-Yung Hsu <[email protected]>
AuthorDate: Thu Oct 2 03:17:47 2025 +0800

    KAFKA-19721: Update streams documentation with KIP-1147 changes (#20606)
    
    Update the ops and streams documentation for the KIP-1147 changes
    (renaming --property to --formatter-property).
    
    Reviewers: Andrew Schofield <[email protected]>
---
 docs/ops.html                               |  2 +-
 docs/streams/developer-guide/datatypes.html | 14 ++++++-------
 docs/streams/quickstart.html                | 32 ++++++++++++++---------------
 3 files changed, 24 insertions(+), 24 deletions(-)
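
At a glance, the KIP-1147 rename changes console-tool invocations as sketched below; the elided arguments (...) stand in for whatever other options a given command uses:

# before (deprecated spelling)
$ bin/kafka-console-consumer.sh ... --property print.key=true

# after (KIP-1147 spelling)
$ bin/kafka-console-consumer.sh ... --formatter-property print.key=true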

diff --git a/docs/ops.html b/docs/ops.html
index 2d050ec76da..fcef480f2c9 100644
--- a/docs/ops.html
+++ b/docs/ops.html
@@ -4315,7 +4315,7 @@ $ bin/kafka-topics.sh --create --topic tieredTopic --bootstrap-server localhost:
 
 <p>Lastly, we can try to consume some data from the beginning and print offset number, to make sure it will successfully fetch offset 0 from the remote storage.</p>
 
-<pre><code class="language-bash">$ bin/kafka-console-consumer.sh --topic tieredTopic --from-beginning --max-messages 1 --bootstrap-server localhost:9092 --property print.offset=true</code></pre>
+<pre><code class="language-bash">$ bin/kafka-console-consumer.sh --topic tieredTopic --from-beginning --max-messages 1 --bootstrap-server localhost:9092 --formatter-property print.offset=true</code></pre>
 
 <p>In KRaft mode, you can disable tiered storage at the topic level, to make the remote logs as read-only logs, or completely delete all remote logs.</p>
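
For anyone following the tiered storage walkthrough, the other DefaultMessageFormatter options take the same renamed flag; a minimal sketch, assuming the local broker and the tieredTopic created earlier in these docs:

# print the partition alongside the offset; print.partition is another
# standard DefaultMessageFormatter property passed via the renamed flag
$ bin/kafka-console-consumer.sh --topic tieredTopic --from-beginning \
    --max-messages 1 --bootstrap-server localhost:9092 \
    --formatter-property print.offset=true \
    --formatter-property print.partition=true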
 
diff --git a/docs/streams/developer-guide/datatypes.html b/docs/streams/developer-guide/datatypes.html
index 2bc2d7d5d0e..6e93de6e1e1 100644
--- a/docs/streams/developer-guide/datatypes.html
+++ b/docs/streams/developer-guide/datatypes.html
@@ -217,15 +217,15 @@ TimeWindowedDeserializer&lt;String&gt; deserializer = new TimeWindowedDeserializ
         <h4>Usage in Command Line</h4>
        <p>When using command-line tools (like <code>bin/kafka-console-consumer.sh</code>), you can configure windowed deserializers by passing the inner class and window size via configuration properties. The property names use a prefix pattern:</p>
        <pre class="line-numbers"><code class="language-bash"># Time windowed deserializer configuration
---property print.key=true \
---property key.deserializer=org.apache.kafka.streams.kstream.TimeWindowedDeserializer \
---property key.deserializer.windowed.inner.deserializer.class=org.apache.kafka.common.serialization.StringDeserializer \
---property key.deserializer.window.size.ms=500
+--formatter-property print.key=true \
+--formatter-property key.deserializer=org.apache.kafka.streams.kstream.TimeWindowedDeserializer \
+--formatter-property key.deserializer.windowed.inner.deserializer.class=org.apache.kafka.common.serialization.StringDeserializer \
+--formatter-property key.deserializer.window.size.ms=500
 
 # Session windowed deserializer configuration  
---property print.key=true \
---property key.deserializer=org.apache.kafka.streams.kstream.SessionWindowedDeserializer \
---property key.deserializer.windowed.inner.deserializer.class=org.apache.kafka.common.serialization.StringDeserializer</code></pre>
+--formatter-property print.key=true \
+--formatter-property key.deserializer=org.apache.kafka.streams.kstream.SessionWindowedDeserializer \
+--formatter-property key.deserializer.windowed.inner.deserializer.class=org.apache.kafka.common.serialization.StringDeserializer</code></pre>
 
         <h4>Deprecated Configs</h4>
        <p>The following <code>StreamsConfig</code> parameters are deprecated in favor of passing parameters directly to serializer/deserializer constructors:</p>
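
Put together, a full invocation using the time-windowed snippet above might look like the following sketch; the topic name windowed-counts and the bootstrap server are illustrative assumptions, not part of the documented snippet:

# consume records whose keys were written by a time-windowed serde;
# the topic name here is hypothetical
$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
    --topic windowed-counts --from-beginning \
    --formatter-property print.key=true \
    --formatter-property key.deserializer=org.apache.kafka.streams.kstream.TimeWindowedDeserializer \
    --formatter-property key.deserializer.windowed.inner.deserializer.class=org.apache.kafka.common.serialization.StringDeserializer \
    --formatter-property key.deserializer.window.size.ms=500
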
diff --git a/docs/streams/quickstart.html b/docs/streams/quickstart.html
index 56ab128e2a5..b6bd6a51e63 100644
--- a/docs/streams/quickstart.html
+++ b/docs/streams/quickstart.html
@@ -175,10 +175,10 @@ and inspect the output of the WordCount demo application by reading from its out
 <pre><code class="language-bash">$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
     --topic streams-wordcount-output \
     --from-beginning \
-    --property print.key=true \
-    --property print.value=true \
-    --property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
-    --property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer</code></pre>
+    --formatter-property print.key=true \
+    --formatter-property print.value=true \
+    --formatter-property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
+    --formatter-property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer</code></pre>
 
 
 <h4><a id="quickstart_streams_process" href="#quickstart_streams_process">Step 5: Process some data</a></h4>
@@ -197,10 +197,10 @@ This message will be processed by the Wordcount application and the following ou
 <pre><code class="language-bash">$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
     --topic streams-wordcount-output \
     --from-beginning \
-    --property print.key=true \
-    --property print.value=true \
-    --property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
-    --property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
+    --formatter-property print.key=true \
+    --formatter-property print.value=true \
+    --formatter-property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
+    --formatter-property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
 
 all        1
 streams        1
@@ -225,10 +225,10 @@ In your other terminal in which the console consumer is running, you will observ
 <pre><code class="language-bash">$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
     --topic streams-wordcount-output \
     --from-beginning \
-    --property print.key=true \
-    --property print.value=true \
-    --property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
-    --property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
+    --formatter-property print.key=true \
+    --formatter-property print.value=true \
+    --formatter-property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
+    --formatter-property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
 
 all        1
 streams        1
@@ -255,10 +255,10 @@ The <b>streams-wordcount-output</b> topic will subsequently show the correspondi
 <pre><code class="language-bash">$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
     --topic streams-wordcount-output \
     --from-beginning \
-    --property print.key=true \
-    --property print.value=true \
-    --property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
-    --property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
+    --formatter-property print.key=true \
+    --formatter-property print.value=true \
+    --formatter-property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
+    --formatter-property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
 
 all        1
 streams        1
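
Because older console tools understand only the deprecated --property spelling, scripts that must run against mixed versions can probe for the renamed flag and fall back; a minimal sketch, assuming the tool's --help output lists its available options:

# pick whichever formatter flag the installed tools support
if bin/kafka-console-consumer.sh --help 2>&1 | grep -q -- '--formatter-property'; then
  FMT_FLAG=--formatter-property
else
  FMT_FLAG=--property
fi

bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
    --topic streams-wordcount-output --from-beginning \
    "$FMT_FLAG" print.key=true \
    "$FMT_FLAG" print.value=true \
    "$FMT_FLAG" key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
    "$FMT_FLAG" value.deserializer=org.apache.kafka.common.serialization.LongDeserializer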
