Re: [DISCUSS] FLIP-23 Model Serving

2019-05-06 Thread Boris Lublinsky
Thanks Rong, I really appreciate the review. See some answers below. Boris Lublinsky, FDP Architect, boris.lublin...@lightbend.com, https://www.lightbend.com/ > On May 5, 2019, at 7:52 PM, Rong Rong wrote: > Hi Robert, Boris, > Sorry for the late reply. I took some time to look at the implementation…

Re: [DISCUSS] FLIP-23 Model Serving

2019-05-05 Thread Rong Rong
Hi Robert, Boris, Sorry for the late reply. I took some time to look at the implementation. I think it looks good overall. Since this is a pretty big PR, I would like to raise a bit of discussion beforehand: 1. In order to simplify the PR complexity, can we do the following? - the exam…

Re: [DISCUSS] FLIP-23 Model Serving

2019-04-30 Thread Robert Metzger
Hey all, I'm wondering if somebody on the list can take a look at the PR from FLIP-23: https://github.com/apache/flink/pull/7446 On Mon, Oct 1, 2018 at 6:13 PM Rong Rong wrote: > Thanks for the contribution Boris!! I've been playing around with the basic model for a while and loved it. …

Re: [DISCUSS] FLIP-23 Model Serving

2018-10-01 Thread Rong Rong
Thanks for the contribution Boris!! I've been playing around with the basic model for a while and loved it. +1 and really looking forward to having the feature merged back into Flink ML. -- Rong On Mon, Oct 1, 2018 at 7:55 AM Fabian Hueske wrote: > Hi everybody, > The question of how to serve…

Re: [DISCUSS] FLIP-23 Model Serving

2018-10-01 Thread Fabian Hueske
Hi everybody, The question of how to serve ML models in Flink applications came up in several conversations I had with Flink users over the last months. Recently, Boris approached me and told me that he'd like to revive the efforts around FLIP-23 [1]. Over the last few days, Boris extended the proposal…

Re: [DISCUSS] FLIP-23 Model Serving

2018-02-05 Thread Stavros Kontopoulos
Thanx @Fabian. I will update the document accordingly wrt metrics. I agree there are pros and cons. Best, Stavros On Wed, Jan 31, 2018 at 1:07 AM, Fabian Hueske wrote: > OK, I think there was plenty of time to comment on this FLIP. > I'll move it to the ACCEPTED status. > @Stavros, please consider…

Re: [DISCUSS] FLIP-23 Model Serving

2018-01-30 Thread Fabian Hueske
OK, I think there was plenty of time to comment on this FLIP. I'll move it to the ACCEPTED status. @Stavros, please consider the feedback regarding the metrics. I agree with Chesnay that metrics should be primarily exposed via the metrics system. Storing them in state makes them fault-tolerant and…

Re: [DISCUSS] FLIP-23 Model Serving

2018-01-22 Thread Chesnay Schepler
I'm currently looking over it, but one thing that stood out was that the FLIP proposes to use queryable state as a monitoring solution. Given that we have a metric system that integrates with plenty of commonly used metric backends, this doesn't really make sense to me. Storing them in state still…
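Chesnay's point is about reporting serving statistics through Flink's metric system instead of keeping them in (queryable) state. As an illustration only (this is not code from the FLIP or the PR; the ScoringFunction and Input classes and the metric names are made up for the sketch), exposing such statistics from a scoring operator could look roughly like this in Java:

    import org.apache.flink.api.common.functions.RichMapFunction;
    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.metrics.Counter;
    import org.apache.flink.metrics.Gauge;

    // Hypothetical input type, used only to make the sketch self-contained.
    class Input { public double[] features; }

    // Scoring operator that reports serving statistics through the Flink metric
    // system rather than storing them in queryable state.
    class ScoringFunction extends RichMapFunction<Input, Double> {

        private transient Counter servedRecords;          // number of scored records
        private transient volatile long lastScoreTimeMs;  // time of the last scoring

        @Override
        public void open(Configuration parameters) {
            servedRecords = getRuntimeContext().getMetricGroup().counter("modelServedRecords");
            getRuntimeContext().getMetricGroup()
                    .gauge("lastScoreTimeMs", (Gauge<Long>) () -> lastScoreTimeMs);
        }

        @Override
        public Double map(Input input) {
            double score = score(input);   // stand-in for real model scoring
            servedRecords.inc();
            lastScoreTimeMs = System.currentTimeMillis();
            return score;
        }

        private double score(Input input) { return 0.0; }  // placeholder model
    }

The metric reporters configured for the cluster then forward these values to whatever monitoring backend is in use, which is the integration Chesnay refers to.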

Re: [DISCUSS] FLIP-23 Model Serving

2018-01-18 Thread Fabian Hueske
Are there any more comments on the FLIP? Otherwise, I'd suggest moving the FLIP to the accepted FLIPs [1] and continuing with the implementation. Also, is there a committer who'd like to shepherd the FLIP and review the corresponding PRs? Of course, everybody is welcome to review the code, but we need…

Re: [DISCUSS] FLIP-23 Model Serving

2017-12-04 Thread Fabian Hueske
Hi, Sorry for the late follow-up. I think I understand the motivation for choosing ProtoBuf as the representation and serialization format, and this makes sense to me. However, it might be a good idea to provide tooling to convert Flink types (described as TypeInformation) to ProtoBuf. Otherwise, …
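One possible shape for the tooling Fabian suggests, sketched purely as an assumption (nothing like this exists in the FLIP or the PR): a small helper that derives a .proto message definition from a Flink RowTypeInfo, so the wire schema stays in sync with the Flink type. Only a few basic types are mapped in this sketch; everything else falls back to bytes.

    import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
    import org.apache.flink.api.common.typeinfo.TypeInformation;
    import org.apache.flink.api.java.typeutils.RowTypeInfo;

    // Illustrative helper: generate a proto3 message definition from a RowTypeInfo.
    public final class RowTypeToProto {

        public static String toProtoSchema(String messageName, RowTypeInfo rowType) {
            StringBuilder proto = new StringBuilder("syntax = \"proto3\";\n\n");
            proto.append("message ").append(messageName).append(" {\n");
            String[] fieldNames = rowType.getFieldNames();
            for (int i = 0; i < fieldNames.length; i++) {
                proto.append("  ")
                     .append(protoType(rowType.getTypeAt(i)))
                     .append(' ')
                     .append(fieldNames[i])
                     .append(" = ").append(i + 1).append(";\n");
            }
            return proto.append("}\n").toString();
        }

        private static String protoType(TypeInformation<?> type) {
            if (type.equals(BasicTypeInfo.STRING_TYPE_INFO)) return "string";
            if (type.equals(BasicTypeInfo.INT_TYPE_INFO)) return "int32";
            if (type.equals(BasicTypeInfo.LONG_TYPE_INFO)) return "int64";
            if (type.equals(BasicTypeInfo.DOUBLE_TYPE_INFO)) return "double";
            if (type.equals(BasicTypeInfo.BOOLEAN_TYPE_INFO)) return "bool";
            return "bytes"; // fallback for types not covered by this sketch
        }
    }

For example, toProtoSchema("ModelInput", new RowTypeInfo(Types.STRING, Types.DOUBLE)) would emit a two-field message using the default field names f0 and f1.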

Re: [DISCUSS] FLIP-23 Model Serving

2017-11-28 Thread Fabian Hueske
Hi Boris and Stavros, Thanks for the responses. Ad 1) Thanks for the clarification. I think I misunderstood this part of the proposal. I interpreted the argument for choosing ProtoBuf for network encoding ("ability to represent different data types") to mean that a model pipeline should work with different…

Re: [DISCUSS] FLIP-23 Model Serving

2017-11-28 Thread Stavros Kontopoulos
Hi Fabian, thanx! > 1) Is it a strict requirement that an ML pipeline must be able to handle different input types? > I understand that it makes sense to have different models for different instances of the same type, i.e., same data type but different keys. Hence, the key-based joins make sense…
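The key-based join discussed here can be sketched as a keyed two-input operator: the data stream and the model stream are connected and keyed by the same model identifier, the latest model per key is kept in keyed state, and incoming records are scored against it. The ModelUpdate, DataRecord, and ModelServingFunction names below are hypothetical stand-ins for this sketch, not the types from the FLIP-23 code; buffering of records that arrive before any model is omitted.

    import org.apache.flink.api.common.state.ValueState;
    import org.apache.flink.api.common.state.ValueStateDescriptor;
    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.streaming.api.functions.co.CoProcessFunction;
    import org.apache.flink.util.Collector;

    // Hypothetical types: a model update and a data record, both carrying the
    // identifier of the model they belong to.
    class ModelUpdate { public String modelId; /* serialized model, metadata, ... */ }
    class DataRecord  { public String modelId; /* features */ }

    // Keyed low-level join: the latest model per key lives in keyed state and
    // every incoming record is scored against it.
    class ModelServingFunction extends CoProcessFunction<DataRecord, ModelUpdate, Double> {

        private transient ValueState<ModelUpdate> currentModel;

        @Override
        public void open(Configuration parameters) {
            currentModel = getRuntimeContext().getState(
                    new ValueStateDescriptor<>("current-model", ModelUpdate.class));
        }

        @Override
        public void processElement1(DataRecord record, Context ctx, Collector<Double> out) throws Exception {
            ModelUpdate model = currentModel.value();
            if (model != null) {
                out.collect(score(model, record));   // score only if a model is present
            }
        }

        @Override
        public void processElement2(ModelUpdate model, Context ctx, Collector<Double> out) throws Exception {
            currentModel.update(model);              // replace the model for this key
        }

        private double score(ModelUpdate model, DataRecord record) { return 0.0; }  // stand-in
    }

Wiring it up would then look roughly like dataStream.connect(modelStream).keyBy(r -> r.modelId, m -> m.modelId).process(new ModelServingFunction()), again with hypothetical stream variable names.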

Re: [DISCUSS] FLIP-23 Model Serving

2017-11-27 Thread Fabian Hueske
Hi Stavros, thanks for the detailed FLIP! Model serving is an important use case and it's great to see efforts to add a library for this to Flink! I've read the FLIP and would like to ask a few questions and make some suggestions. 1) Is it a strict requirement that an ML pipeline must be able to handle different input types? …

[DISCUSS] FLIP-23 Model Serving

2017-11-23 Thread Stavros Kontopoulos
Hi guys, Let's discuss the new FLIP proposal for model serving over Flink. The idea is to combine the previous efforts in this area and provide a library on top of Flink for serving models. https://cwiki.apache.org/confluence/display/FLINK/FLIP-23+-+Model+Serving Code from previous efforts can be found here…