Hello!

I think I know the answer to this, but I thought I would go ahead and ask.

We have a process that emits messages to our stream. These messages can
include duplicates based on a certain key (we'll call it TheKey). Our
Flink job reads the messages, keys by TheKey, and enters a window function.
Right now we are using an EventTime Session Window with a custom
aggregator, which only keeps the most recent message for each TheKey. It
then emits those messages to a map function that does work based on the
message.
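
For context, a rough sketch of what we have today looks something like
this (the Message class, the getTheKey()/getTimestamp() accessors, and the
5-minute session gap are just placeholders, not our real code):

    import org.apache.flink.api.common.functions.AggregateFunction;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.windowing.assigners.EventTimeSessionWindows;
    import org.apache.flink.streaming.api.windowing.time.Time;

    // Source with event-time timestamps/watermarks already assigned.
    DataStream<Message> messages = ...;

    messages
        .keyBy(Message::getTheKey)
        .window(EventTimeSessionWindows.withGap(Time.minutes(5)))
        .aggregate(new LatestMessageAggregator()) // keep only the newest message per TheKey
        .map(new WorkMapFunction());              // downstream work per emitted message

    // Custom aggregator: the accumulator is simply the most recent message seen so far.
    public static class LatestMessageAggregator
            implements AggregateFunction<Message, Message, Message> {
        @Override
        public Message createAccumulator() { return null; }

        @Override
        public Message add(Message value, Message acc) {
            // Keep whichever message has the later event timestamp.
            return (acc == null || value.getTimestamp() >= acc.getTimestamp()) ? value : acc;
        }

        @Override
        public Message getResult(Message acc) { return acc; }

        @Override
        public Message merge(Message a, Message b) {
            if (a == null) return b;
            if (b == null) return a;
            return (b.getTimestamp() >= a.getTimestamp()) ? b : a;
        }
    }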

What we would like is a window function that keeps building up until the
downstream map function has completed for the window's key, and only then
emits.

Is this a pattern that Flink supports?

-Steve
