Hi Hoai-Thu, the private default constructor is unlikely to be the cause
here, since Lance was already able to load/deserialize the model object.
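The "serialize the loaded model manually" check I suggested in the quoted message below can be sketched like this. This is a generic Java-serialization round trip, not a Spark API; `roundTrip` and `Point` are illustrative names I made up, with `Point` standing in for any `Serializable` object you want to test:

```scala
import java.io._

// Generic helper: serialize then deserialize an object, so a
// NotSerializableException surfaces eagerly at the call site
// instead of deep inside a Spark job.
def roundTrip[T <: Serializable](obj: T): T = {
  val bytes = new ByteArrayOutputStream()
  val out = new ObjectOutputStream(bytes)
  out.writeObject(obj)
  out.close()
  val in = new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray))
  in.readObject().asInstanceOf[T]
}

// Case classes are Serializable by default, so this round trip succeeds;
// running the same helper on the loaded model would reveal whether
// serialization itself is what fails.
case class Point(label: Double, features: Array[Double])
val copy = roundTrip(Point(1.0, Array(0.5, 0.5)))
```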

And on that side topic, I wish all serdes libraries would just use
constructor.setAccessible(true) by default :-) Most of the time,
constructor privacy isn't meant to restrict serde reflection.
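For concreteness, here is what that reflection trick looks like. `Locked` is a made-up stand-in for any class with a private constructor (such as MatrixFactorizationModel); the reflection calls themselves are standard java.lang.reflect API:

```scala
// Minimal stand-in for a model class with a private constructor.
class Locked private (val value: Int)

// What many serde libraries do under the hood: look up the declared
// constructor and bypass the private modifier with setAccessible(true).
val ctor = classOf[Locked].getDeclaredConstructor(classOf[Int])
ctor.setAccessible(true)
val instance = ctor.newInstance(Int.box(42))
```

Without the setAccessible(true) call, newInstance would throw IllegalAccessException.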

Sent while mobile. Pls excuse typos etc.
On Aug 14, 2014 1:58 AM, "Hoai-Thu Vuong" <thuv...@gmail.com> wrote:

> Someone in this community gave me a video:
> https://www.youtube.com/watch?v=sPhyePwo7FA. I had the same question in
> this community, and others helped me solve it. I was trying to load a
> MatrixFactorizationModel from an object file, but the compiler said I
> could not create the object because the constructor is private. To work
> around this, I put my new object in the same package as
> MatrixFactorizationModel. Luckily, it works.
>
>
> On Wed, Aug 13, 2014 at 9:20 PM, Christopher Nguyen <c...@adatao.com>
> wrote:
>
>> Lance, some debugging ideas: you might try model.predict(RDD[Vector]) to
>> isolate the cause to serialization of the loaded model. And also try to
>> serialize the deserialized (loaded) model "manually" to see if that throws
>> any visible exceptions.
>>
>> Sent while mobile. Pls excuse typos etc.
>> On Aug 13, 2014 7:03 AM, "lancezhange" <lancezha...@gmail.com> wrote:
>>
>>> my prediction code is simple enough, as follows:
>>>
>>>   val labelsAndPredsOnGoodData = goodDataPoints.map { point =>
>>>     val prediction = model.predict(point.features)
>>>     (point.label, prediction)
>>>   }
>>>
>>> When model is the loaded one, the above code just doesn't work. Can you
>>> catch the error?
>>> Thanks.
>>>
>>> PS: I use spark-shell in standalone mode, version 1.0.0.
>>>
>>>
>>>
>>>
>>> --
>>> View this message in context:
>>> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-save-mllib-model-to-hdfs-and-reload-it-tp11953p12035.html
>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>
>>>
>>>
>
>
> --
> Thu.
>
