Re: Is there a way to prevent duplicate messages to downstream

2019-12-10 Thread Alex Brekken
I've never used that dedup transformer before, but what you've got looks right. (Though if there's a way to hash your message value, or somehow get a GUID out of it, that might be preferable.) As you probably noticed, its state is windowed - so if your use-case depends on being able to remove duplicates …
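For reference, a minimal sketch of what such a windowed de-duplication transformer might look like, assuming the record key (or a hash/GUID derived from the value) is used as the dedup id; the class name, store name "dedup-store" and the window size are placeholders, not from the original thread:

    import org.apache.kafka.streams.KeyValue;
    import org.apache.kafka.streams.kstream.Transformer;
    import org.apache.kafka.streams.processor.ProcessorContext;
    import org.apache.kafka.streams.state.WindowStore;
    import org.apache.kafka.streams.state.WindowStoreIterator;

    // Drops a record if the same key was already seen within the maintain window.
    public class DedupTransformer<K, V> implements Transformer<K, V, KeyValue<K, V>> {

        private final String storeName;
        private final long leftDurationMs;
        private final long rightDurationMs;
        private ProcessorContext context;
        private WindowStore<K, Long> seenStore;

        public DedupTransformer(String storeName, long maintainDurationMs) {
            this.storeName = storeName;
            this.leftDurationMs = maintainDurationMs / 2;
            this.rightDurationMs = maintainDurationMs - leftDurationMs;
        }

        @Override
        @SuppressWarnings("unchecked")
        public void init(ProcessorContext context) {
            this.context = context;
            this.seenStore = (WindowStore<K, Long>) context.getStateStore(storeName);
        }

        @Override
        public KeyValue<K, V> transform(K key, V value) {
            long now = context.timestamp();
            if (seenWithinWindow(key, now)) {
                return null;                      // duplicate within the window: drop it
            }
            seenStore.put(key, now, now);         // remember this key for the window
            return KeyValue.pair(key, value);     // first occurrence: forward it
        }

        private boolean seenWithinWindow(K key, long now) {
            try (WindowStoreIterator<Long> iter =
                     seenStore.fetch(key, now - leftDurationMs, now + rightDurationMs)) {
                return iter.hasNext();
            }
        }

        @Override
        public void close() { }
    }

The window store itself would be added to the topology with a retention roughly matching the dedup window (e.g. via Stores.windowStoreBuilder), and the transformer wired in with stream.transform(() -> new DedupTransformer<>("dedup-store", windowMs), "dedup-store").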

Re: Is there a way to prevent duplicate messages to downstream

2019-12-10 Thread Sachin Mittal
Hi Alex, thanks for the quick response. What I have is around 8 streams branched from a single stream, which down the line get joined back into one. Now each branched stream can have duplicates, and when joining all this data I just end up with endless tuples of data. So what I was thinking is: what if …
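For context, a rough sketch of the topology shape being described, showing just two of the ~8 branches and assuming String keys/values, default serdes, and illustrative topic names and predicates (all placeholders, not from the thread):

    import java.time.Duration;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.Topology;
    import org.apache.kafka.streams.kstream.JoinWindows;
    import org.apache.kafka.streams.kstream.KStream;

    public class BranchJoinSketch {

        @SuppressWarnings("unchecked")
        static Topology build() {
            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> source = builder.stream("events");   // hypothetical topic

            KStream<String, String>[] branches = source.branch(
                (key, value) -> value.startsWith("A"),                    // hypothetical predicate
                (key, value) -> value.startsWith("B")                     // hypothetical predicate
            );

            // A de-duplication step (e.g. the windowed transformer sketched above)
            // would sit on each branch here, before the join.
            KStream<String, String> joined = branches[0].join(
                branches[1],
                (left, right) -> left + "|" + right,
                JoinWindows.of(Duration.ofMinutes(5))
            );
            joined.to("joined-output");                                    // hypothetical topic

            return builder.build();
        }
    }

Without a de-duplication step on each branch, every duplicate on either side of the join produces an extra joined record inside the join window, which is where the endless tuples come from.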

Re: Is there a way to prevent duplicate messages to downstream

2019-12-10 Thread Alex Brekken
Hi Sachin, is your goal to prevent any records with a duplicate key from ever getting sent downstream? The KTable you have in your example will of course have the most recent record for a given key, but it will still emit updates. So if key "A" arrives a second time (with no change to the value), …
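One way to get that behaviour is to compare each incoming value against the last value stored for that key and only forward it when it actually changed. A minimal sketch, assuming a persistent KeyValueStore named "last-value-store" has been added to the topology (the class and store names are hypothetical):

    import java.util.Objects;
    import org.apache.kafka.streams.kstream.ValueTransformerWithKey;
    import org.apache.kafka.streams.processor.ProcessorContext;
    import org.apache.kafka.streams.state.KeyValueStore;

    // Emits a value only when it differs from the last value seen for the same key.
    public class ChangedOnlyTransformer<K, V> implements ValueTransformerWithKey<K, V, V> {

        private KeyValueStore<K, V> lastValues;

        @Override
        @SuppressWarnings("unchecked")
        public void init(ProcessorContext context) {
            this.lastValues = (KeyValueStore<K, V>) context.getStateStore("last-value-store");
        }

        @Override
        public V transform(K readOnlyKey, V value) {
            V previous = lastValues.get(readOnlyKey);
            if (Objects.equals(previous, value)) {
                return null;                    // unchanged: a downstream filter drops the null
            }
            lastValues.put(readOnlyKey, value); // remember the latest value for this key
            return value;                       // changed (or first time seen): forward it
        }

        @Override
        public void close() { }
    }

Wired in as stream.transformValues(ChangedOnlyTransformer::new, "last-value-store").filter((k, v) -> v != null), the repeat of key "A" with an unchanged value never makes it downstream.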