Hehe, Sean. I knew that (and I knew the answer), but meant to ask a
follow-up question to help us find the answer *together* :)

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Fri, Jul 22, 2016 at 12:52 PM, Sean Owen <so...@cloudera.com> wrote:
> Machine Learning
>
> If you mean, how do you distribute a new model in your application,
> then there's no magic to it. Just reference the new model in the
> functions you're executing in your driver.
>
> If you implemented some other manual way of deploying model info, just
> do that again. There's no special thing to know.
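Sean's point about simply referencing the new model from the driver can be sketched like this (a pure-Python stand-in for a Spark job, not actual Spark API; `load_model`, the data, and the scoring logic are all illustrative):

```python
# Driver-side pattern: load the upgraded model once in the driver,
# then let the functions you execute reference it. In real Spark the
# closure is serialized and shipped to each executor (or the model is
# wrapped in sc.broadcast() to ship it once per node); here plain
# map() plays the role of rdd.map().

def load_model(version):
    # Illustrative stand-in: pretend a "model" is just a scoring threshold.
    return {"version": version, "threshold": 0.5}

model = load_model("v2")          # upgrading the model = loading the new one here

def score(x, model=model):        # the task function references the model
    return 1 if x >= model["threshold"] else 0

data = [0.1, 0.6, 0.4, 0.9]
predictions = list(map(score, data))   # in Spark: rdd.map(score).collect()
print(predictions)                      # -> [0, 1, 0, 1]
```

In real code you would swap in `sc.broadcast(model)` and read `model_bc.value` inside `score` so the model is shipped to each node once rather than with every task.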
>
> On Fri, Jul 22, 2016 at 11:39 AM, Jacek Laskowski <ja...@japila.pl> wrote:
>> Hi,
>>
>> What's a ML model?
>>
>> (I'm sure once we find out the answer, you'll know the answer to
>> your question :))
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> ----
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Fri, Jul 22, 2016 at 11:49 AM, Sergio Fernández <wik...@apache.org> wrote:
>>> Hi,
>>>
>>> I have one question:
>>>
>>> How are ML models distributed across all the nodes of a Spark cluster?
>>>
>>> I'm thinking about scenarios where the pipeline implementation does not
>>> necessarily need to change, but the models have been upgraded.
>>>
>>> Thanks in advance.
>>>
>>> Best regards,
>>>
>>> --
>>> Sergio Fernández
>>> Partner Technology Manager
>>> Redlink GmbH
>>> m: +43 6602747925
>>> e: sergio.fernan...@redlink.co
>>> w: http://redlink.co
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>>
