I believe there are two main methods:

1. Stop serving for a couple of seconds while deploying the newly trained 
model; this is supported by pio as-is.
2. Build a more flexible solution that can route traffic differently or cache 
results. We have a reverse proxy (openresty / nginx + lua) in front, so we can 
do both if the business requires it. A rough sketch of the switch follows this 
list.
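To illustrate option 2: one common way to do the switch is to deploy the new 
model on a second engine instance and then flip the proxy's upstream over to 
it. This is only a sketch of that idea, not our actual setup; the file path, 
port and upstream name are assumptions.

# Rough sketch (not production code): flip an nginx/openresty upstream from
# the engine instance serving the old model to a freshly deployed one, then
# reload the proxy gracefully. Path, port and upstream name are assumptions.
import subprocess

UPSTREAM_FILE = "/etc/nginx/conf.d/pio_upstream.conf"  # included from nginx.conf

def switch_backend(port):
    # Point the upstream at the instance serving the new model,
    # e.g. one started with `pio deploy --port 8001`.
    with open(UPSTREAM_FILE, "w") as f:
        f.write("upstream pio_backend {\n")
        f.write("    server 127.0.0.1:%d;\n" % port)
        f.write("}\n")
    # Graceful reload: nginx finishes in-flight requests on the old workers,
    # so clients never see an outage.
    subprocess.check_call(["nginx", "-s", "reload"])

if __name__ == "__main__":
    switch_backend(8001)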

When working with the UR, another option is to use ES aliases.
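The idea is to train into a fresh index and then repoint the alias that 
queries read from; a single _aliases call with both actions is applied 
atomically by Elasticsearch, so there is no window without a model behind the 
alias. A rough sketch (host, alias and index names are made up):

# Rough sketch: atomically repoint an ES alias from the index holding the old
# model to the one holding the newly trained model. Host, alias and index
# names are assumptions for illustration only.
import requests

ES = "http://localhost:9200"

def swap_alias(alias, old_index, new_index):
    # Both actions in one _aliases request are applied atomically, so queries
    # never hit a moment where the alias points at nothing.
    body = {
        "actions": [
            {"remove": {"index": old_index, "alias": alias}},
            {"add": {"index": new_index, "alias": alias}},
        ]
    }
    requests.post(ES + "/_aliases", json=body).raise_for_status()

if __name__ == "__main__":
    swap_alias("ur_index", "ur_index_20170905", "ur_index_20170906")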

I'm pretty sure other people have thought of other solutions, but it mostly 
depends on the exact use case.
I hope that helps.

Paul

> On 6 Sep 2017, at 03:34, Saarthak Chandra <chandra.saart...@gmail.com> wrote:
> 
> Hi,
> 
> Is there a way we can train a model without having to stop serving.
> I mean, if I have an app deployed, can I add/post new data to the event 
> server and train the same app without stopping it?
> 
> 
> Thanks!
> -- 
> Saarthak Chandra,
> Masters in Computer Science,
> Cornell University.
