[ https://issues.apache.org/jira/browse/SPARK-21306?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yanbo Liang resolved SPARK-21306.
---------------------------------
    Resolution: Fixed
Fix Version/s: 2.3.0
               2.2.1
               2.1.2
               2.0.3

> OneVsRest Conceals Columns That May Be Relevant To Underlying Classifier
> ------------------------------------------------------------------------
>
>                 Key: SPARK-21306
>                 URL: https://issues.apache.org/jira/browse/SPARK-21306
>             Project: Spark
>          Issue Type: Bug
>          Components: ML
>    Affects Versions: 2.1.1
>            Reporter: Cathal Garvey
>            Assignee: Yan Facai (颜发才)
>            Priority: Critical
>              Labels: classification, ml
>             Fix For: 2.0.3, 2.1.2, 2.2.1, 2.3.0
>
> Hi folks, thanks for Spark! :)
> I've been learning to use `ml` and `mllib`, and I've encountered a blocker
> while trying to use `ml.classification.OneVsRest` with
> `ml.classification.LogisticRegression`. Basically, [here in the
> code|https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/ml/classification/OneVsRest.scala#L320],
> only two columns are extracted and fed to the underlying classifiers;
> however, some configurations require more than two columns.
> Specifically: I want to do multiclass learning with Logistic Regression on a
> very imbalanced dataset. Because of the imbalance, I was planning to use
> weights. I set a column, `"weight"`, to the inverse frequency of each field,
> configured my `LogisticRegression` instance to use this column, and then put
> it in a `OneVsRest` wrapper.
> However, `OneVsRest` strips all but two columns out of the dataset before
> training, so I get an error from within `LogisticRegression` that it can't
> find the `"weight"` column.
> It would be nice to have this fixed! I can see a few ways, but a very
> conservative fix would be to include a parameter in `OneVsRest.fit` for
> additional columns to `select` before passing them to the underlying model.
> Thanks!
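A minimal sketch of the reported scenario, assuming a DataFrame `train` with hypothetical `"features"`, `"label"`, and `"weight"` columns (the column names and data are illustrative, not taken from the report):

```scala
import org.apache.spark.ml.classification.{LogisticRegression, OneVsRest}

// "weight" is assumed to hold inverse class frequencies computed beforehand.
val lr = new LogisticRegression()
  .setWeightCol("weight")   // tells LogisticRegression to read the "weight" column

val ovr = new OneVsRest()
  .setClassifier(lr)

// Before the fix, OneVsRest.fit selected only the label and features columns
// from `train`, so the underlying LogisticRegression could not resolve the
// "weight" column and training failed.
val model = ovr.fit(train)
```

With the fix in the listed versions, `OneVsRest` passes the weight column through to the underlying classifier instead of dropping it.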
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org