[ https://issues.apache.org/jira/browse/SPARK-18036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15765032#comment-15765032 ]
Ilya Matiach commented on SPARK-18036:
--------------------------------------

Weichen Xu, are you working on this issue, or have you resolved it? I am interested in investigating this bug.

> Decision Trees do not handle edge cases
> ---------------------------------------
>
>                 Key: SPARK-18036
>                 URL: https://issues.apache.org/jira/browse/SPARK-18036
>             Project: Spark
>          Issue Type: Bug
>          Components: ML, MLlib
>            Reporter: Seth Hendrickson
>            Priority: Minor
>
> Decision trees/GBT/RF do not handle edge cases such as constant features or empty features. For example:
> {code}
> val dt = new DecisionTreeRegressor()
> val data = Seq(LabeledPoint(1.0, Vectors.dense(Array.empty[Double]))).toDF()
> dt.fit(data)
> java.lang.UnsupportedOperationException: empty.max
>   at scala.collection.TraversableOnce$class.max(TraversableOnce.scala:229)
>   at scala.collection.mutable.ArrayOps$ofInt.max(ArrayOps.scala:234)
>   at org.apache.spark.ml.tree.impl.DecisionTreeMetadata$.buildMetadata(DecisionTreeMetadata.scala:207)
>   at org.apache.spark.ml.tree.impl.RandomForest$.run(RandomForest.scala:105)
>   at org.apache.spark.ml.regression.DecisionTreeRegressor.train(DecisionTreeRegressor.scala:93)
>   at org.apache.spark.ml.regression.DecisionTreeRegressor.train(DecisionTreeRegressor.scala:46)
>   at org.apache.spark.ml.Predictor.fit(Predictor.scala:90)
>   ... 52 elided
> {code}
> as well as
> {code}
> val dt = new DecisionTreeRegressor()
> val data = Seq(LabeledPoint(1.0, Vectors.dense(0.0, 0.0, 0.0))).toDF()
> dt.fit(data)
> java.lang.UnsupportedOperationException: empty.maxBy
>   at scala.collection.TraversableOnce$class.maxBy(TraversableOnce.scala:236)
>   at scala.collection.SeqViewLike$AbstractTransformed.maxBy(SeqViewLike.scala:37)
>   at org.apache.spark.ml.tree.impl.RandomForest$.binsToBestSplit(RandomForest.scala:846)
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
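Both stack traces bottom out in the same Scala collections behavior: calling `max` or `maxBy` on an empty collection throws `UnsupportedOperationException` rather than returning a sentinel. A minimal, self-contained sketch of the failure mode and a defensive pattern (plain Scala, no Spark dependency; the names here are illustrative, not from the Spark codebase):

```scala
object EmptyMaxSketch {
  def main(args: Array[String]): Unit = {
    val xs = Array.empty[Int]

    // xs.max on an empty array throws
    // java.lang.UnsupportedOperationException: empty.max,
    // the same failure surfaced in DecisionTreeMetadata.buildMetadata.
    val threw =
      try { xs.max; false }
      catch { case _: UnsupportedOperationException => true }
    assert(threw)

    // One defensive pattern: reduceOption returns None for empty input
    // instead of throwing, letting the caller report a clear error.
    val safeMax: Option[Int] = xs.reduceOption(_ max _)
    assert(safeMax.isEmpty)

    println("ok")
  }
}
```

A fix along these lines would validate the input (e.g. require a non-empty feature vector, and handle the "no valid splits" case in `binsToBestSplit`) and fail with a descriptive message instead of a bare `empty.max`.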