Github user mgaido91 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20629#discussion_r181547107

    --- Diff: python/pyspark/ml/clustering.py ---
    @@ -322,7 +323,11 @@ def computeCost(self, dataset):
             """
             Return the K-means cost (sum of squared distances of points to their
             nearest center) for this model on the given data.
    +
    +        ..note:: Deprecated in 2.4.0. It will be removed in 3.0.0.
    +            Use ClusteringEvaluator instead.
             """
    +        warnings.warn("Deprecated in 2.4.0. It will be removed in 3.0.0. Use ClusteringEvaluator"
    --- End diff --

    yes, or I can also update it here once we establish for sure what the new API has to look like, as you prefer.
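For context, the diff above deprecates `computeCost` by emitting a runtime warning alongside the docstring note. A minimal standalone sketch of that deprecation pattern follows; `compute_cost` and its placeholder body are hypothetical stand-ins here, not the actual pyspark implementation, which delegates to the JVM model:

```python
import warnings


def compute_cost(dataset):
    """Return the K-means cost for this model on the given data.

    .. note:: Deprecated in 2.4.0. It will be removed in 3.0.0.
        Use ClusteringEvaluator instead.
    """
    # Emit a DeprecationWarning before doing the real work, mirroring
    # the warnings.warn(...) call added in the diff.
    warnings.warn("Deprecated in 2.4.0. It will be removed in 3.0.0. "
                  "Use ClusteringEvaluator instead.", DeprecationWarning)
    # Placeholder computation standing in for the actual cost.
    return sum((x - 1) ** 2 for x in dataset)


# Callers still get the result, but a DeprecationWarning is raised.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    cost = compute_cost([0, 1, 2])
print(cost)                          # 2
print(caught[0].category.__name__)   # DeprecationWarning
```

Note that `DeprecationWarning` is hidden by default outside of `__main__`; the `simplefilter("always")` above is only there to make the warning observable in the example.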