Hi

Kafka Streams sounds like a good fit for this.
The first step is to partition your event topic by the session key, so
that all events for the same session go to the same partition.
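For example, here is a minimal producer sketch (the topic name
"session-events", the broker address, and the String payloads are just
assumptions for illustration). With the default partitioner, keying each
record by the session ID is enough, since the partitioner hashes the key:

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class SessionEventProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // The default partitioner hashes the record key, so every
                // event carrying the same session ID lands on the same
                // partition.
                String sessionId = "session-42";
                producer.send(new ProducerRecord<>("session-events", sessionId, "login"));
                producer.send(new ProducerRecord<>("session-events", sessionId, "page-view"));
            }
        }
    }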
Then you can build a Kafka Streams application that maintains state
(managed manually, or via the DSL aggregation functions) over a window.
Each event is consumed immediately, but only to update that state; you
then decide when to produce an output event based on the state. Since
your sessions end after an inactivity gap, session windows are the
natural fit.
You can read about windowing in the official documentation here:
https://kafka.apache.org/11/documentation/streams/developer-guide/dsl-api.html#windowing
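Here is a rough sketch against the 1.0 DSL. The topic names, the
30-minute gap, and the toy String-concatenation aggregation are
assumptions; a real application would aggregate into a proper stats
class with its own Serde. Note that this still emits an updated result
per arriving event; to publish only the final result once the gap has
passed you would need the Processor API with a punctuator on 1.0.1 (the
suppress() operator only arrived in Kafka 2.1):

    import java.util.Properties;
    import java.util.concurrent.TimeUnit;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.KeyValue;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.SessionWindows;

    public class SessionReportApp {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "session-report-app");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> events = builder.stream("session-events");

            events.groupByKey()
                  // A session closes after 30 minutes without events.
                  .windowedBy(SessionWindows.with(TimeUnit.MINUTES.toMillis(30)))
                  // Toy aggregation: concatenate events into one String.
                  .aggregate(
                      () -> "",
                      (key, event, agg) -> agg + event + ";",
                      (key, leftAgg, rightAgg) -> leftAgg + rightAgg)
                  .toStream()
                  // Merging two sessions emits null tombstones for the
                  // old windows; drop them.
                  .filter((windowedKey, agg) -> agg != null)
                  // Unwrap the windowed key before writing the report.
                  .map((windowedKey, agg) -> KeyValue.pair(windowedKey.key(), agg))
                  .to("session-reports");

            new KafkaStreams(builder.build(), props).start();
        }
    }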

Best,
Vincent

On Tue, Jul 17, 2018 at 10:09 AM <vb...@hushmail.com> wrote:

> Hi,
>
> My use case includes consuming events for sessions and once an
> inactivity gap is over creating a detailed report on them. From the
> docs (using 1.0.1 currently) it is not clear what is the best way to
> achieve this, it seems actions like reduce and aggregate create
> results with the same type as their inputs and they produce a new
> update for each arriving event.
>
> Is this a suitable use case for a kafka streams application? Any
> pointers how to create and publish statistics to a topic only once
> after all related events grouped by some key arrived? Possibly only
> reading the messages from the topic when a session is to be fully
> processed?
>
> Thanks
>
