Hi,

My use case involves consuming events for sessions and, once an
inactivity gap has elapsed, producing a detailed report on them. From
the docs (I am on 1.0.1 currently) it is not clear what the best way
to achieve this is: it seems that reduce creates results with the same
type as its input, and both reduce and aggregate produce a new update
for each arriving event.
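
For concreteness, here is a minimal sketch of what I have so far. The
topic names, the 30-minute gap, and the plain Long count (standing in
for my real report type) are all placeholders:

import java.util.concurrent.TimeUnit;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.Serialized;
import org.apache.kafka.streams.kstream.SessionWindows;
import org.apache.kafka.streams.kstream.Windowed;

public class SessionReportSketch {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Events keyed by session id (placeholder topic name).
        KStream<String, String> events = builder.stream(
            "session-events", Consumed.with(Serdes.String(), Serdes.String()));

        KTable<Windowed<String>, Long> perSession = events
            .groupByKey(Serialized.with(Serdes.String(), Serdes.String()))
            // Close a session after 30 minutes of inactivity
            // (SessionWindows.with takes milliseconds in 1.0.x).
            .windowedBy(SessionWindows.with(TimeUnit.MINUTES.toMillis(30)))
            .aggregate(
                () -> 0L,                         // initializer
                (key, event, agg) -> agg + 1,     // called for every event
                (key, agg1, agg2) -> agg1 + agg2, // merges overlapping sessions
                Materialized.with(Serdes.String(), Serdes.Long()));

        // This is where I am stuck: every arriving event pushes an
        // updated aggregate downstream, instead of a single final
        // report once the session's inactivity gap has elapsed.
        perSession.toStream((windowedKey, count) -> windowedKey.key())
            .to("session-reports", Produced.with(Serdes.String(), Serdes.Long()));
    }
}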

Is this a suitable use case for a Kafka Streams application? Any
pointers on how to create and publish statistics to a topic only once,
after all related events grouped by some key have arrived? Is it
possible to only read the messages from the input topic once a session
is ready to be fully processed?

Thanks
