mjsax commented on code in PR #18488:
URL: https://github.com/apache/kafka/pull/18488#discussion_r1917261628
##########
docs/streams/developer-guide/processor-api.html:
##########
@@ -67,42 +67,40 @@ <h2><a class="toc-backref" href="#id1">Overview</a><a class="headerlink" href="#
<p><b>Tip</b></p>
<p class="last"><strong>Combining the DSL and the Processor API:</strong>
You can combine the convenience of the DSL with the power and flexibility of the Processor API as described in the
- section <a class="reference internal" href="dsl-api.html#streams-developer-guide-dsl-process"><span class="std std-ref">Applying processors and transformers (Processor API integration)</span></a>.</p>
+ section <a class="reference internal" href="dsl-api.html#streams-developer-guide-dsl-process"><span class="std std-ref">Applying processors (Processor API integration)</span></a>.</p>
</div>
<p>For a complete list of available API functionality, see the <a href="/{{version}}/javadoc/org/apache/kafka/streams/package-summary.html">Streams</a> API docs.</p>
</div>
<div class="section" id="defining-a-stream-processor">
<span id="streams-developer-guide-stream-processor"></span><h2><a class="toc-backref" href="#id2">Defining a Stream Processor</a><a class="headerlink" href="#defining-a-stream-processor" title="Permalink to this headline"></a></h2>
- <p>A <a class="reference internal" href="../core-concepts.html#streams_processor_node"><span class="std std-ref">stream processor</span></a> is a node in the processor topology that represents a single processing step.
- With the Processor API, you can define arbitrary stream processors that processes one received record at a time, and connect
- these processors with their associated state stores to compose the processor topology.</p>
- <p>You can define a customized stream processor by implementing the <code class="docutils literal"><span class="pre">Processor</span></code> interface, which provides the <code class="docutils literal"><span class="pre">process()</span></code> API method.
- The <code class="docutils literal"><span class="pre">process()</span></code> method is called on each of the received records.</p>
- <p>The <code class="docutils literal"><span class="pre">Processor</span></code> interface also has an <code class="docutils literal"><span class="pre">init()</span></code> method, which is called by the Kafka Streams library during task construction
- phase. Processor instances should perform any required initialization in this method. The <code class="docutils literal"><span class="pre">init()</span></code> method passes in a <code class="docutils literal"><span class="pre">ProcessorContext</span></code>
- instance, which provides access to the metadata of the currently processed record, including its source Kafka topic and partition,
- its corresponding message offset, and further such information. You can also use this context instance to schedule a punctuation
- function (via <code class="docutils literal"><span class="pre">ProcessorContext#schedule()</span></code>), to forward a new record as a key-value pair to the downstream processors (via <code class="docutils literal"><span class="pre">ProcessorContext#forward()</span></code>),
- and to commit the current processing progress (via <code class="docutils literal"><span class="pre">ProcessorContext#commit()</span></code>).
- Any resources you set up in <code class="docutils literal"><span class="pre">init()</span></code> can be cleaned up in the
- <code class="docutils literal"><span class="pre">close()</span></code> method. Note that Kafka Streams may re-use a single
- <code class="docutils literal"><span class="pre">Processor</span></code> object by calling
- <code class="docutils literal"><span class="pre">init()</span></code> on it again after <code class="docutils literal"><span class="pre">close()</span></code>.</p>
- <p>
- The <code class="docutils literal"><span class="pre">Processor</span></code> interface takes two sets of generic parameters:
+ <p>A <a class="reference internal" href="../core-concepts.html#streams_processor_node"><span class="std std-ref">stream processor</span></a> is a node in the processor topology that represents a single processing step.
+ With the Processor API, you can define arbitrary stream processors that processes one received record at a time, and connect
+ these processors with their associated state stores to compose the processor topology.</p>
+ <p>You can define a customized stream processor by implementing the <code class="docutils literal"><span class="pre">Processor</span></code> interface, which provides the <code class="docutils literal"><span class="pre">process()</span></code> API method.
+ The <code class="docutils literal"><span class="pre">process()</span></code> method is called on each of the received records.</p>
+ <p>The <code class="docutils literal"><span class="pre">Processor</span></code> interface also has an <code class="docutils literal"><span class="pre">init()</span></code> method, which is called by the Kafka Streams library during task construction
+ phase. Processor instances should perform any required initialization in this method. The <code class="docutils literal"><span class="pre">init()</span></code> method passes in a <code class="docutils literal"><span class="pre">ProcessorContext</span></code>
+ instance, which provides access to the metadata of the currently processed record, including its source Kafka topic and partition,
+ its corresponding message offset, and further such information. You can also use this context instance to schedule a punctuation
+ function (via <code class="docutils literal"><span class="pre">ProcessorContext#schedule()</span></code>), to forward a new record to the downstream processors (via <code class="docutils literal"><span class="pre">ProcessorContext#forward()</span></code>),
Review Comment:
It was not a link before. Think it's ok as-is?
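
For readers following the doc text quoted above, here is a minimal sketch of the lifecycle those paragraphs describe (init(), process(), close(), and the ProcessorContext calls). It assumes the current `org.apache.kafka.streams.processor.api.Processor` interface; the class name `UppercaseProcessor` and the 30-second commit interval are illustrative only and are not part of this PR:

```java
import java.time.Duration;

import org.apache.kafka.streams.processor.PunctuationType;
import org.apache.kafka.streams.processor.api.Processor;
import org.apache.kafka.streams.processor.api.ProcessorContext;
import org.apache.kafka.streams.processor.api.Record;

// Hypothetical processor that upper-cases record values.
public class UppercaseProcessor implements Processor<String, String, String, String> {

    private ProcessorContext<String, String> context;

    @Override
    public void init(final ProcessorContext<String, String> context) {
        // Called during task construction; keep the context for later use and
        // register a wall-clock punctuation that commits progress periodically.
        this.context = context;
        context.schedule(Duration.ofSeconds(30), PunctuationType.WALL_CLOCK_TIME,
            timestamp -> context.commit());
    }

    @Override
    public void process(final Record<String, String> record) {
        // Called for each received record; forward a transformed record downstream.
        final String value = record.value();
        if (value != null) {
            context.forward(record.withValue(value.toUpperCase()));
        }
    }

    @Override
    public void close() {
        // Release any resources acquired in init(); Kafka Streams may call
        // init() on this instance again after close().
    }
}
```

Such a processor would then be wired into the processor topology (and connected to its parent sources, state stores, and downstream sinks) via `Topology#addProcessor()`.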