How large are your models?

Spark job server does allow synchronous job execution, and with a "warm"
long-lived context it will be quite fast - but still on the order of a second
or a few seconds usually (depending on model size; for very large models it
could be quite a lot more than that).
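
For reference, a synchronous call against a warm context would run a job
roughly like the sketch below (this assumes the classic spark-jobserver
SparkJob API; the config keys and the job body are just illustrative):

import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{SparkJob, SparkJobInvalid, SparkJobValid, SparkJobValidation}

// Sketch of a scoring job meant to run against a pre-created ("warm") context,
// invoked synchronously, e.g. POST /jobs?appName=...&classPath=...&context=...&sync=true
// so no context startup cost is paid per request.
object ScoreJob extends SparkJob {

  override def validate(sc: SparkContext, config: Config): SparkJobValidation =
    if (config.hasPath("userId")) SparkJobValid
    else SparkJobInvalid("userId missing")

  override def runJob(sc: SparkContext, config: Config): Any = {
    val userId = config.getInt("userId")
    // In a real job the model (e.g. broadcast factor matrices) would be cached
    // in the context between requests; here we just echo the request back.
    Map("userId" -> userId, "recs" -> Seq.empty[Int])
  }
}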

What are your use cases for SQL during recommendation? Filtering?
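
(If it is mainly narrowing the candidate items before scoring, that part on
its own is straightforward; a made-up example with illustrative table and
column names, using the 1.3/1.4-era SQLContext:)

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Illustrative only: use Spark SQL to narrow the candidate items before
// handing them to the model's prediction step.
val conf = new SparkConf().setAppName("candidate-filter").setMaster("local[*]")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._

Seq((1, true), (2, false), (3, true))
  .toDF("item_id", "in_stock")
  .registerTempTable("items")

val candidateIds = sqlContext
  .sql("SELECT item_id FROM items WHERE in_stock = true")
  .map(_.getInt(0))
  .collect()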

If your recommendation needs are real-time (<1s), I am not sure job server and
computing the recs with Spark will do the trick (though those new BLAS-based
methods may have given a sufficient speed-up).
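
The usual way to get under that budget is to keep the factor matrices in
memory on the serving side and rank items by dot product per request, outside
of a Spark job entirely. A rough sketch in plain Scala (toy factor values, not
a real model):

// In-memory ALS scoring sketch: user and item factors are held in memory and
// items are ranked by dot product against the requested user's factor vector.
object InMemoryScorer {
  val rank = 2
  val userFactors: Map[Int, Array[Double]] = Map(1 -> Array(0.3, 0.7))
  val itemFactors: Map[Int, Array[Double]] = Map(
    10 -> Array(0.9, 0.1), 11 -> Array(0.2, 0.8), 12 -> Array(0.5, 0.5))

  private def dot(a: Array[Double], b: Array[Double]): Double =
    (0 until rank).map(i => a(i) * b(i)).sum

  // Top-n items for a user, highest score first.
  def topN(userId: Int, n: Int): Seq[(Int, Double)] =
    userFactors.get(userId) match {
      case Some(u) =>
        itemFactors.toSeq
          .map { case (id, v) => (id, dot(u, v)) }
          .sortBy(-_._2)
          .take(n)
      case None => Seq.empty
    }
}

// e.g. InMemoryScorer.topN(1, 2) returns the two highest-scoring item ids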

—
Sent from Mailbox

On Mon, Jun 22, 2015 at 11:17 PM, Debasish Das <debasish.da...@gmail.com>
wrote:

> The models that I am looking for are mostly factorization-based models (which
> include both recommendation and topic modeling use cases).
> For recommendation models, I need a combination of Spark SQL and the ml model
> prediction API...I think Spark job server is what I am looking for, and it has
> a fast HTTP REST backend through spray which will scale fine through akka.
> Out of curiosity, why netty?
> What model are you serving?
> Velox doesn't look like it is optimized for cases like ALS recs, if that's
> what you mean. I think scoring ALS at scale in real time takes a fairly
> different approach.
> The servlet engine probably doesn't matter at all in comparison.
> On Sat, Jun 20, 2015, 9:40 PM Debasish Das <debasish.da...@gmail.com> wrote:
>> After getting used to Scala, writing Java is too much work :-)
>>
>> I am looking for a Scala-based project that's using netty at its core (spray
>> is one example).
>>
>> prediction.io is an option, but that also looks quite complicated and is not
>> using all the ML features that were added in 1.3/1.4.
>>
>> Velox is built on top of the ML / Keystone ML pipeline API, and that's
>> useful, but it is still using javax servlets, which is not netty based.
>>
>> On Sat, Jun 20, 2015 at 10:25 AM, Sandy Ryza <sandy.r...@cloudera.com>
>> wrote:
>>
>>> Oops, that link was for Oryx 1. Here's the repo for Oryx 2:
>>> https://github.com/OryxProject/oryx
>>>
>>> On Sat, Jun 20, 2015 at 10:20 AM, Sandy Ryza <sandy.r...@cloudera.com>
>>> wrote:
>>>
>>>> Hi Debasish,
>>>>
>>>> The Oryx project (https://github.com/cloudera/oryx), which is Apache 2
>>>> licensed, contains a model server that can serve models built with MLlib.
>>>>
>>>> -Sandy
>>>>
>>>> On Sat, Jun 20, 2015 at 8:00 AM, Charles Earl <charles.ce...@gmail.com>
>>>> wrote:
>>>>
>>>>> Is velox NOT open source?
>>>>>
>>>>>
>>>>> On Saturday, June 20, 2015, Debasish Das <debasish.da...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> The demo of the end-to-end ML pipeline, including the model server
>>>>>> component, at Spark Summit was really cool.
>>>>>>
>>>>>> I was wondering if the Model Server component is based upon Velox or
>>>>>> uses a completely different architecture.
>>>>>>
>>>>>> https://github.com/amplab/velox-modelserver
>>>>>>
>>>>>> We are looking for an open source version of the model server to build
>>>>>> upon.
>>>>>>
>>>>>> Thanks.
>>>>>> Deb
>>>>>>
>>>>>>
>>>>>
>>>>> --
>>>>> - Charles
>>>>>
>>>>
>>>>
>>>
>>
