Hello,
In MLlib with Spark 1.4, I was able to eval a model by loading it and calling
`predict` on a vector of features. I would train on Spark but then use the
model in my own workflow, outside of Spark.
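
For concreteness, here is roughly the old flow I mean. This is just a sketch;
I'm using a `LogisticRegressionModel` and a dummy path purely to illustrate,
not my actual pipeline:

```scala
// Hypothetical sketch of the spark.mllib (RDD-based) flow: load once, then
// score plain vectors. Model type and path are placeholders for illustration.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.classification.LogisticRegressionModel
import org.apache.spark.mllib.linalg.Vectors

object MllibPredictSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("mllib-predict").setMaster("local[*]"))

    // Loading the saved model still needs a SparkContext, but once the model
    // object is in hand, predict() takes a plain local Vector, no DataFrame.
    val model = LogisticRegressionModel.load(sc, "/path/to/saved/mllib/model")
    val score = model.predict(Vectors.dense(0.1, 2.3, 4.5))
    println(s"prediction: $score")

    sc.stop()
  }
}
```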

In `spark.ml` it seems like the only way to eval is to use `transform`, which
only takes a DataFrame. To build a DataFrame I need a SparkContext or
SQLContext, so it doesn't seem to be possible to eval outside of Spark.
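
In other words, the only path I can see looks roughly like the sketch below.
I'm assuming a `spark.ml` `LogisticRegressionModel` saved with the ML
persistence API (1.6+), and the paths are again placeholders:

```scala
// Rough sketch of the spark.ml path. The point is that transform() only
// accepts a DataFrame, and building one requires an SQLContext (and hence a
// SparkContext). Model type and path are placeholders for illustration.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.ml.classification.LogisticRegressionModel
import org.apache.spark.mllib.linalg.Vectors

object SparkMlTransformSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("ml-transform").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)

    val model = LogisticRegressionModel.load("/path/to/saved/ml/model")

    // Even a single feature vector has to be wrapped in a one-row DataFrame
    // before it can be scored.
    val df = sqlContext
      .createDataFrame(Seq(Tuple1(Vectors.dense(0.1, 2.3, 4.5))))
      .toDF("features")

    model.transform(df).select("prediction").show()

    sc.stop()
  }
}
```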

Is there either a way to build a DataFrame without a SparkContext, or a way to
predict with a vector or list of features without a DataFrame?
Thanks
