Spark MLlib's KMeansModel provides a "computeCost" method, which returns the
sum of squared distances from each point to its nearest center, i.e. the
k-means cost on the given dataset.
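For reference, that cost is just the sum over all points of the squared
Euclidean distance to the closest center. A minimal pure-Python sketch of the
same quantity (kmeans_cost and squared_distance are illustrative names, not
Spark's API):

```python
# Sketch of the k-means cost that computeCost returns: for each point,
# take the squared Euclidean distance to its nearest center, then sum.
# (Names here are illustrative, not part of Spark MLlib.)

def squared_distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans_cost(points, centers):
    # Each point contributes the distance to whichever center is closest.
    return sum(min(squared_distance(p, c) for c in centers) for p in points)

points = [(0.0, 0.0), (1.0, 1.0), (9.0, 8.0), (8.0, 9.0)]
centers = [(0.5, 0.5), (8.5, 8.5)]
print(kmeans_cost(points, centers))  # each point is 0.5 away squared -> 2.0
```

A lower cost means the centers fit the data more tightly, which is why it is
commonly used to compare runs with different k.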
Thanks
Yanbo
2016-07-24 17:30 GMT-07:00 janardhan shetty :
> Hi,
>
> I was trying to evaluate
Hi,
I was trying to evaluate k-means clustering predictions, since the exact
cluster numbers were provided beforehand for each data point.
As a brute-force method, I just tried Error = predicted cluster number -
given number.
What evaluation metrics are available in Spark for k-means?