You can also serialize the model and use it in other places.
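
A minimal sketch of that with plain Java serialization (the model type, file
path, and feature values here are just assumptions for illustration, not the
only way to do it):

  import java.io.{FileInputStream, FileOutputStream, ObjectInputStream, ObjectOutputStream}
  import org.apache.spark.mllib.classification.LogisticRegressionModel
  import org.apache.spark.mllib.linalg.Vectors

  // After training: MLlib models are Serializable, so write the object out.
  val out = new ObjectOutputStream(new FileOutputStream("/tmp/lr-model.bin"))
  out.writeObject(model)
  out.close()

  // In the servlet (with the MLlib jars on the classpath): read it back.
  val in = new ObjectInputStream(new FileInputStream("/tmp/lr-model.bin"))
  val restored = in.readObject().asInstanceOf[LogisticRegressionModel]
  in.close()

  // predict() on a single Vector runs locally, no SparkContext needed.
  val label = restored.predict(Vectors.dense(0.5, 1.2, -0.3))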

Best Regards,
Sonal
Founder, Nube Technologies <http://www.nubetech.co>

<http://in.linkedin.com/in/sonalgoyal>



On Wed, Dec 10, 2014 at 5:32 PM, Yanbo Liang <yanboha...@gmail.com> wrote:

> Hi Klaus,
>
> There is no ideal method, but there is a workaround.
> Train the model on a Spark or YARN cluster, then use RDD.saveAsTextFile
> to store the model, i.e. its weights and intercept, to HDFS.
> Load the weights file and intercept file from HDFS, construct a GLM model,
> and then call model.predict() to get what you want. A sketch follows below.
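>
> A rough sketch of that workaround (LogisticRegressionWithSGD, the HDFS
> paths, and the feature values here are just assumptions for illustration;
> sc and trainingData are taken as already defined):
>
>   import org.apache.spark.mllib.classification.{LogisticRegressionModel, LogisticRegressionWithSGD}
>   import org.apache.spark.mllib.linalg.Vectors
>
>   // On the cluster: train, then persist the parameters as text on HDFS.
>   val model = LogisticRegressionWithSGD.train(trainingData, 100)
>   sc.parallelize(Seq(model.weights.toArray.mkString(",")))
>     .saveAsTextFile("hdfs:///models/lr/weights")
>   sc.parallelize(Seq(model.intercept.toString))
>     .saveAsTextFile("hdfs:///models/lr/intercept")
>
>   // Elsewhere: read the files back, rebuild the GLM, and predict.
>   val weights = Vectors.dense(
>     sc.textFile("hdfs:///models/lr/weights").first().split(",").map(_.toDouble))
>   val intercept = sc.textFile("hdfs:///models/lr/intercept").first().toDouble
>   val restored = new LogisticRegressionModel(weights, intercept)
>   val label = restored.predict(Vectors.dense(0.5, 1.2, -0.3))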
>
> The Spark community also has some ongoing work on exporting models with
> PMML.
>
> 2014-12-10 18:32 GMT+08:00 Simon Chan <simonc...@gmail.com>:
>
>> Hi Klaus,
>>
>> PredictionIO is an open source product based on Spark MLlib for exactly
>> this purpose.
>> This is the tutorial for classification in particular:
>> http://docs.prediction.io/classification/quickstart/
>>
>> You can add custom serving logic and retrieve prediction results through
>> a REST API/SDKs from other places.
>>
>> Simon
>>
>>
>> On Wed, Dec 10, 2014 at 2:25 AM, Klausen Schaefersinho <
>> klaus.schaef...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>>
>>> I would like to use Spark to train a model, but use the model in some
>>> other place, e.g. a servlet, to do some classification in real time.
>>>
>>> What is the best way to do this? Can I just copy a model file or
>>> something and load it in the servlet? Can anybody point me to a good
>>> tutorial?
>>>
>>>
>>> Cheers,
>>>
>>>
>>> Klaus
>>>
>>>
>>>
>>> --
>>> “Overfitting” is not about an excessive amount of physical exercise...
>>>
>>
>>
>
