Hey,
Thanks, Xiangrui Meng and Cheng Lian, for your valuable suggestions.
It works!
Divyansh Jain.
On Tue, January 20, 2015 2:49 pm, Xiangrui Meng wrote:
> You can save the cluster centers as a SchemaRDD of two columns (id:
> Int, center: Array[Double]). When you load it back, you can con
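
A minimal sketch of the suggestion above, assuming Spark 1.2-era SchemaRDD/Parquet
APIs and the public KMeansModel constructor (available since 1.1); the Center case
class, the helper names, the positional row access, and the cast of the array column
to Seq[Double] are illustrative assumptions, not code from the thread:

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.clustering.KMeansModel

case class Center(id: Int, center: Array[Double])

def saveCenters(sc: SparkContext, model: KMeansModel, path: String): Unit = {
  val sqlContext = new SQLContext(sc)
  import sqlContext.createSchemaRDD
  // One row per cluster center: (id, coordinates as Array[Double]).
  val centers = sc.parallelize(
    model.clusterCenters.zipWithIndex.map { case (v, i) => Center(i, v.toArray) })
  centers.saveAsParquetFile(path)
}

def loadCenters(sc: SparkContext, path: String): KMeansModel = {
  val sqlContext = new SQLContext(sc)
  // Read the (id, center) rows back, restore the original order, and rebuild
  // the model from its cluster centers. The array column is assumed to come
  // back as a Seq[Double] here.
  val centers = sqlContext.parquetFile(path)
    .map(row => (row(0).asInstanceOf[Int], row(1).asInstanceOf[Seq[Double]].toArray))
    .collect()
    .sortBy(_._1)
    .map { case (_, coords) => Vectors.dense(coords) }
  new KMeansModel(centers)
}

Since a k-means model is fully determined by its cluster centers, the reloaded
model should predict exactly as the original did.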
Hey people,
I have run into some issues saving a k-means MLlib model with Spark SQL by
converting it to a SchemaRDD. This is what I am doing:
case class Model(id: String, model: org.apache.spark.mllib.clustering.KMeansModel)
import sqlContext.createSchemaRDD
val rowRdd = sc.makeRD