[ https://issues.apache.org/jira/browse/SPARK-19714?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15893821#comment-15893821 ]
Nick Pentreath commented on SPARK-19714:
----------------------------------------

If you feel it is reasonable to handle values outside the bucket ranges as "invalid", specifically by including them in the special "invalid" bucket, then we can discuss if and how that could be implemented. I agree it's quite a large departure, but we could support it with a further param value such as "keepAll", which would keep both {{NaN}} and out-of-range values in the special bucket. I don't see a compelling reason to treat this as a bug, so if you want to motivate a change, please propose an approach.

I do think we should update the doc for {{handleInvalid}}; [~wojtek-szymanski], feel free to open a PR for that. For reference, a sketch of the usual workaround (unbounded outer splits) follows the quoted issue below.

> Bucketizer Bug Regarding Handling Unbucketed Inputs
> ---------------------------------------------------
>
>                 Key: SPARK-19714
>                 URL: https://issues.apache.org/jira/browse/SPARK-19714
>             Project: Spark
>          Issue Type: Bug
>          Components: ML, MLlib
>    Affects Versions: 2.1.0
>            Reporter: Bill Chambers
>
> {code}
> import org.apache.spark.ml.feature.Bucketizer
>
> val contDF = spark.range(500).selectExpr("cast(id as double) as id")
>
> val splits = Array(5.0, 10.0, 250.0, 500.0)
> val bucketer = new Bucketizer()
>   .setSplits(splits)
>   .setInputCol("id")
>   .setHandleInvalid("skip")
> bucketer.transform(contDF).show()
> {code}
> You would expect this to handle the invalid (out-of-range) values. However, it fails:
> {code}
> Caused by: org.apache.spark.SparkException: Feature value 0.0 out of
> Bucketizer bounds [5.0, 500.0]. Check your features, or loosen the
> lower/upper bound constraints.
> {code}
> It seems strange that {{handleInvalid}} doesn't actually handle invalid inputs.
> Thoughts, anyone?
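A minimal sketch of the workaround mentioned above, assuming a spark-shell session (so {{spark}} is in scope) on Spark 2.1+. The output column name "bucket" is an illustrative choice, and the "keepAll" value discussed in the comment is only a proposal, not an existing option. Making the outer splits unbounded means every finite value falls into some bucket, so {{handleInvalid}} only has to deal with {{NaN}}:

{code}
import org.apache.spark.ml.feature.Bucketizer

val contDF = spark.range(500).selectExpr("cast(id as double) as id")

// Unbounded outer splits: every finite value now lands in some bucket,
// so no "Feature value ... out of Bucketizer bounds" error is thrown.
val splits = Array(Double.NegativeInfinity, 5.0, 10.0, 250.0, 500.0,
  Double.PositiveInfinity)

val bucketer = new Bucketizer()
  .setSplits(splits)
  .setInputCol("id")
  .setOutputCol("bucket")          // illustrative output column name
  .setHandleInvalid("keep")        // "keep" routes NaN rows into one extra
                                   // bucket; "skip" would drop them instead
bucketer.transform(contDF).show()
{code}

With this, the "skip"/"keep" semantics of {{handleInvalid}} apply only to {{NaN}} values, which matches what "invalid" means today; the hypothetical "keepAll" discussed above would extend the special bucket to out-of-range values as well.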