[GitHub] spark pull request #14747: [SPARK-17086][ML] Fix InvalidArgumentException is...

2016-08-24 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/14747





[GitHub] spark pull request #14747: [SPARK-17086][ML] Fix InvalidArgumentException is...

2016-08-23 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/14747#discussion_r75846911
  
--- Diff: mllib/src/main/scala/org/apache/spark/ml/feature/QuantileDiscretizer.scala ---
@@ -114,7 +114,12 @@ final class QuantileDiscretizer @Since("1.6.0") (@Since("1.6.0") override val uid
     splits(0) = Double.NegativeInfinity
     splits(splits.length - 1) = Double.PositiveInfinity

-    val bucketizer = new Bucketizer(uid).setSplits(splits)
+    val cutpoints = splits.distinct
+    if (splits.length != cutpoints.length) {
+      log.warn("Some quantiles were identical. Bucketing to " + (cutpoints.length - 1) +
--- End diff --

Nit: you can use string interpolation here. That part was fine: `s"... ${cutpoints.length - 1} ..."`. Super nit while we're still here: `cutpoints` is a term not used elsewhere. Just `distinctSplits`?
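
For context, a minimal sketch of how the patched block might read with both nits applied (string interpolation and the proposed `distinctSplits` name). This is an illustration assuming the surrounding QuantileDiscretizer context, not the merged patch; the tail of the warning message and the final Bucketizer line are inferred from the line the diff replaces.

    val distinctSplits = splits.distinct
    if (splits.length != distinctSplits.length) {
      // Interpolated form of the warning from the diff above; the message
      // tail ("buckets as a result") is an assumption, since the quoted
      // diff is cut off at this line.
      log.warn(s"Some quantiles were identical. Bucketing to" +
        s" ${distinctSplits.length - 1} buckets as a result.")
    }
    // Assumed continuation: pass the deduplicated splits to the Bucketizer,
    // mirroring the line the patch removed.
    val bucketizer = new Bucketizer(uid).setSplits(distinctSplits)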

