Hi Joseph,

I am using Spark 1.1.0, the latest release. I will try to update to the
current master and check.

The example I am running is JavaDecisionTree; the dataset is in libsvm
format (illustrated below) and contains:

1. 45 training instances.
2. 5 features.
3. I am not sure of the feature types, but no categorical features are
being passed in the example.
4. Three labels; I am not sure of the label type either.
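
For context, a libsvm-format file looks like the lines below, one
"label index:value" record per row. These values are illustrative only,
not taken from my actual dataset:

    0 1:0.5 2:1.2 3:0.0 4:3.1 5:0.7
    1 1:1.0 2:0.3 3:2.2 4:0.9 5:1.5
    2 1:0.1 2:2.0 3:1.1 4:0.4 5:0.2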

The example runs fine with maxBins set to 100, but when I change it to,
say, 50 or 30, I get the exception. A minimal sketch of my call is
below. Also, could you please let me know what the default value for
maxBins should be? (The API docs say the default is 100, but it did not
work in this case.)
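
Here is roughly what I am running, following the JavaDecisionTree
example. The class name, file path, impurity, and maxDepth below are
placeholders rather than the exact values from the example:

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.mllib.regression.LabeledPoint;
    import org.apache.spark.mllib.tree.DecisionTree;
    import org.apache.spark.mllib.tree.model.DecisionTreeModel;
    import org.apache.spark.mllib.util.MLUtils;

    public class MaxBinsRepro {
      public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("MaxBinsRepro");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // "sample.libsvm" stands in for the 45-instance, 5-feature dataset
        JavaRDD<LabeledPoint> data =
            MLUtils.loadLibSVMFile(sc.sc(), "sample.libsvm").toJavaRDD();

        int numClasses = 3;  // three labels in the dataset
        // empty map: no categorical features declared, all treated as continuous
        Map<Integer, Integer> categoricalFeaturesInfo =
            new HashMap<Integer, Integer>();
        String impurity = "gini";  // placeholder setting
        int maxDepth = 5;          // placeholder setting
        int maxBins = 50;          // 100 works; 50 or 30 throws the exception

        DecisionTreeModel model = DecisionTree.trainClassifier(
            data, numClasses, categoricalFeaturesInfo, impurity, maxDepth,
            maxBins);

        System.out.println("Learned model:\n" + model);
        sc.stop();
      }
    }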


