If you are interested in multilabel (not multiclass), you might want to take a look at SPARK-7015 https://github.com/apache/spark/pull/5830/files. It is supposed to perform a one-versus-all transformation over the classes, which is the usual way multilabel classifiers are built.
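The one-versus-all reduction described above (often called "binary relevance") can be sketched in a few lines of plain Python: train one independent binary classifier per label, then take the union of the positive predictions. The toy perceptron and the data below are invented purely for illustration; on Spark you would substitute a real binary learner from MLlib in place of `train_perceptron`.

```python
def train_perceptron(X, y, epochs=20, lr=0.1):
    # Toy binary perceptron, standing in for a real binary classifier.
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else 0
            err = yi - pred
            w = [wj + lr * err * xj for wj, xj in zip(w, xi)]
            b += lr * err
    return w, b

def fit_binary_relevance(X, Y, num_labels):
    # Y[i] is the set of labels for example i; build a 0/1 target per label
    # and train one independent binary model for each label.
    return [train_perceptron(X, [1 if k in Yi else 0 for Yi in Y])
            for k in range(num_labels)]

def predict(models, x):
    # A label is predicted whenever its binary model fires.
    return {k for k, (w, b) in enumerate(models)
            if sum(wj * xj for wj, xj in zip(w, x)) + b > 0}

# Illustrative data: label 0 correlates with the first feature, label 1 with the second.
X = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9], [1.0, 1.0]]
Y = [{0}, {0}, {1}, {1}, {0, 1}]
models = fit_binary_relevance(X, Y, num_labels=2)
print(predict(models, [1.0, 0.05]))  # -> {0}
```

Note that binary relevance ignores correlations between labels; each model is trained in isolation, which is exactly the "predict each label separately" approach mentioned later in this thread.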
Alexander

From: Joseph Bradley [mailto:jos...@databricks.com]
Sent: Tuesday, May 05, 2015 3:44 PM
To: DB Tsai
Cc: peterg; user@spark.apache.org
Subject: Re: Multilabel Classification in spark

If you mean "multilabel" (predicting multiple label values), then MLlib does not yet support that. You would need to predict each label separately.

If you mean "multiclass" (1 label taking >2 categorical values), then MLlib supports it via LogisticRegression (as DB said), as well as DecisionTree and RandomForest.

Joseph

On Tue, May 5, 2015 at 1:27 PM, DB Tsai <dbt...@dbtsai.com> wrote:

LogisticRegression in the MLlib package supports multilabel classification.

Sincerely,

DB Tsai
-------------------------------------------------------
Blog: https://www.dbtsai.com

On Tue, May 5, 2015 at 1:13 PM, peterg <pe...@garbers.me> wrote:

> Hi all,
>
> I'm looking to implement a multilabel classification algorithm, but I am
> surprised to find that there aren't any in the spark-mllib core library. Am
> I missing something? Would someone point me in the right direction?
>
> Thanks!
>
> Peter

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
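To make the multilabel/multiclass distinction from this thread concrete, here is a small sketch of how the two target shapes differ, and how a multilabel target can be split into independent per-label binary targets (the "predict each label separately" approach). The data is invented for illustration.

```python
# Multiclass: exactly one label per example, drawn from >2 categories.
multiclass_y = [0, 2, 1, 2]          # e.g. one category id per example

# Multilabel: several binary labels may be "on" at once for one example.
multilabel_Y = [[1, 0, 1],           # example 0 carries labels {0, 2}
                [0, 1, 0],           # example 1 carries label {1}
                [1, 1, 1]]           # example 2 carries all three

# "Predict each label separately": transpose the multilabel matrix into one
# binary target vector per label column, then train an independent binary
# classifier (e.g. MLlib LogisticRegression) on each column.
per_label_targets = [list(col) for col in zip(*multilabel_Y)]
print(per_label_targets)  # -> [[1, 0, 1], [0, 1, 1], [1, 0, 1]]
```

Each column can then be fed to any binary classifier MLlib already provides, which is what the one-versus-all transformation in SPARK-7015 automates.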