Hi Guozhang,

Sorry, by "app" I mean the stream processor app, the one shown in
pipeline.kt.

The app reads a topic of data sent by a sensor each second and generates a
20-second window output to another topic.
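For context, the windowed part of the topology looks roughly like the
following sketch (topic names, value types, and the count() aggregation are
placeholders for illustration, not the actual pipeline.kt code):

```kotlin
import java.time.Duration
import org.apache.kafka.streams.KeyValue
import org.apache.kafka.streams.StreamsBuilder
import org.apache.kafka.streams.kstream.TimeWindows

// Hypothetical sketch: "sensor-readings" / "sensor-windows" are placeholder
// topic names, and count() stands in for whatever aggregation the app does.
val builder = StreamsBuilder()
builder.stream<String, Double>("sensor-readings")        // one reading per second
    .groupByKey()
    .windowedBy(TimeWindows.of(Duration.ofSeconds(20)))  // 20-second tumbling windows
    .count()                                             // some windowed aggregation
    .toStream()
    .map { windowedKey, value -> KeyValue(windowedKey.key(), value) }
    .to("sensor-windows")                                // windowed output topic
```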
My "problem" is that when running locally against my local Kafka setup, if
I stop the app and start it again, it continues processing from the last
window. When the app is deployed in a Docker container using Confluent
Cloud as the broker, every time I restart it, it starts processing from the
beginning of the input topic again and regenerates old windows it has
already produced.
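For what it's worth, where a Streams app resumes from depends on the
committed consumer-group offsets (the group.id is derived from
application.id) and, for state stores, on state.dir. A sketch of the
relevant settings, with placeholder values:

```
# application.id doubles as the consumer group.id; if it differs between
# deployments, the app finds no committed offsets and falls back to
# auto.offset.reset ("earliest", the Streams default, reprocesses the
# whole input topic).
application.id=sensor-windows-app
bootstrap.servers=<confluent-cloud-bootstrap>
auto.offset.reset=earliest
# local state store / changelog-restored state; inside a container this is
# lost on restart unless it sits on a persistent volume
state.dir=/var/lib/kafka-streams
```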

In the meantime I'm trying to upgrade to Kafka 2.2.1 to see if that
improves anything.

--
Alessandro Tagliapietra


On Wed, Jun 5, 2019 at 4:45 PM Guozhang Wang <wangg...@gmail.com> wrote:

> Hello Alessandro,
>
> What did you do for `restarting the app online`? I'm not sure I follow the
> difference between "restart the streams app" and "restart the app online"
> from your description.
>
>
> Guozhang
>
>
> On Wed, Jun 5, 2019 at 10:42 AM Alessandro Tagliapietra <
> tagliapietra.alessan...@gmail.com> wrote:
> >
> > Hello everyone,
> >
> > I've a small streams app, the configuration and part of the code I'm
> using
> > can be found here
> > https://gist.github.com/alex88/6b7b31c2b008817a24f63246557099bc
> > There's also the log when the app is started locally and when the app is
> > started on our servers connecting to the confluent cloud kafka broker.
> >
> > The problem is that locally everything is working properly, if I restart
> > the streams app it just continues where it left, if I restart the app
> > online it reprocesses the whole topic.
> >
> > That shouldn't happen right?
> >
> > Thanks in advance
> >
> > --
> > Alessandro Tagliapietra
>
>
>
> --
> -- Guozhang
>
