Hi,

Just use "objectFile" instead of "objectFile[PipelineModel]" for callJMethod.
You can take the objectFile() in context.R as an example.

Since the SparkContext created in SparkR is actually a JavaSparkContext, there 
is no need to pass the implicit ClassTag.
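A minimal sketch of the adjusted call, assuming the same loadModel wrapper from the original message (note that callJMethod and getMinPartitions are internal SparkR APIs, so this is untested and may change between releases):

```r
# Sketch: call objectFile by its plain Java name. Since sc is a
# JavaSparkContext on the JVM side, the byte-code signature is
# objectFile(String, int) and no ClassTag argument is needed.
loadModel <- function(sc, modelPath) {
  modelRDD <- SparkR:::callJMethod(sc,
                                   "objectFile",
                                   modelPath,
                                   SparkR:::getMinPartitions(sc, NULL))
  return(modelRDD)
}
```

The returned object is a reference to a JavaRDD on the JVM side; the element type is only known at deserialization time, which is why no type parameter needs to be (or can be) specified from R.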

-----Original Message-----
From: Shivaram Venkataraman [mailto:shiva...@eecs.berkeley.edu] 
Sent: Thursday, December 10, 2015 8:21 AM
To: Chris Freeman
Cc: dev@spark.apache.org
Subject: Re: Specifying Scala types when calling methods from SparkR

The SparkR callJMethod can only invoke methods as they show up in the Java byte 
code. So in this case you'll need to check the SparkContext byte code (with 
javap or something like that) to see how that method looks. My guess is the 
type is passed in as a class tag argument, so you'll need to do something like 
create a class tag for the LinearRegressionModel and pass that in as the first 
or last argument etc.
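One way to do that inspection is with javap against the Spark assembly (a sketch; the jar path and version below are illustrative, adjust to your installation):

```shell
# Dump the erased Java signatures of objectFile on SparkContext.
# On the Scala SparkContext, the implicit ClassTag shows up as an
# explicit trailing parameter in the byte code, e.g.
#   objectFile(String, int, scala.reflect.ClassTag)
javap -classpath spark-core_2.10-1.5.2.jar \
  org.apache.spark.SparkContext | grep objectFile
```

Whatever signature javap prints is exactly what callJMethod has to match, argument for argument.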

Thanks
Shivaram

On Wed, Dec 9, 2015 at 10:11 AM, Chris Freeman <cfree...@alteryx.com> wrote:
> Hey everyone,
>
> I’m currently looking at ways to save out SparkML model objects from 
> SparkR and I’ve had some luck putting the model into an RDD and then 
> saving the RDD as an Object File. Once it’s saved, I’m able to load it 
> back in with something like:
>
> sc.objectFile[LinearRegressionModel]("path/to/model")
>
> I’d like to try and replicate this same process from SparkR using the 
> JVM backend APIs (e.g. “callJMethod”), but so far I haven’t been able 
> to replicate my success and I’m guessing that it’s (at least in part) 
> due to the necessity of specifying the type when calling the objectFile 
> method.
>
> Does anyone know if this is actually possible? For example, here’s 
> what I’ve come up with so far:
>
> loadModel <- function(sc, modelPath) {
>   modelRDD <- SparkR:::callJMethod(sc,
>                                    "objectFile[PipelineModel]",
>                                    modelPath,
>                                    SparkR:::getMinPartitions(sc, NULL))
>   return(modelRDD)
> }
>
> Any help is appreciated!
>
> --
> Chris Freeman
>

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
