That should work.
Thanks
Eno
> On 12 Jan 2017, at 21:21, Nicolas Fouché wrote:
Thanks Eno !
My intention is to reprocess all the data from the beginning. And we'll
reset the application as documented in the Confluent blog.
We don't want to keep the previous results; in fact, we want to overwrite
them. Kafka Connect will happily replace all records in our sink database.
So
Hi Nicolas,
I've seen your previous message thread too. I think your best bet for now is to
increase the window duration time, to 6 months.
If you change your application logic, e.g., by changing the window duration,
the semantics of the change wouldn't immediately be clear, and it's worth
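The retention change suggested above can be sketched as below. Only the arithmetic here is runnable on its own; the actual Streams call is shown in a comment because it needs the kafka-streams dependency on the classpath. The constant names and the 183-day figure are illustrative assumptions, not values from the thread:

```java
public class RetentionSketch {
    // Window size stays one hour; only the retention (until) grows.
    static final long ONE_HOUR_MILLIS = 60L * 60 * 1000;
    // Roughly six months of retention; 183 days is an assumption,
    // pick whatever horizon covers the data you want to reprocess.
    static final long SIX_MONTHS_MILLIS = 183L * 24 * 60 * 60 * 1000;

    public static void main(String[] args) {
        // With kafka-streams on the classpath, the window spec becomes:
        //   TimeWindows.of(ONE_HOUR_MILLIS).until(SIX_MONTHS_MILLIS);
        // i.e., the same hourly windows, but kept around long enough that
        // six-month-old records still land in a maintained window.
        System.out.println(ONE_HOUR_MILLIS + " " + SIX_MONTHS_MILLIS);
    }
}
```

Note that `until()` controls how long old windows are maintained, so raising it also grows the state stores; that is the trade-off for being able to replay old data into them.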
Hi.
I'd like to re-consume 6 months old data with Kafka Streams.
My current topology can't, because it defines aggregations whose windows have
a maintain duration of 3 days:
TimeWindows.of(ONE_HOUR_MILLIS).until(THREE_DAYS_MILLIS)
As discovered (and shared [1]) a few months ago, consuming a record