[ https://issues.apache.org/jira/browse/SPARK-17001?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15422572#comment-15422572 ]

Apache Spark commented on SPARK-17001:
--------------------------------------

User 'srowen' has created a pull request for this issue:
https://github.com/apache/spark/pull/14663

> Enable standardScaler to standardize sparse vectors when withMean=True
> ----------------------------------------------------------------------
>
>                 Key: SPARK-17001
>                 URL: https://issues.apache.org/jira/browse/SPARK-17001
>             Project: Spark
>          Issue Type: Improvement
>    Affects Versions: 2.0.0
>            Reporter: Tobi Bosede
>            Priority: Minor
>
> When withMean = true, StandardScaler does not handle sparse vectors and 
> instead throws an exception. This is presumably because subtracting the mean 
> makes a sparse vector dense, which can be undesirable. 
> However, VectorAssembler generates vectors that may be a mix of sparse and 
> dense, even when the vectors are smallish, depending on their values. It is 
> common to feed its output into StandardScaler, which then fails 
> intermittently, depending on the input, whenever withMean = true. This is 
> surprising. 
> StandardScaler should go ahead and operate on sparse vectors, subtracting the 
> mean when explicitly asked to with withMean, on the theory that the user 
> knows what he or she is doing and there is otherwise no way to make this work.
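
For context, a minimal sketch of how the surprise can surface in the DataFrame
API. The column names, toy data, and session setup are illustrative assumptions
for the example, not taken from the report:

    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler, StandardScaler

    spark = SparkSession.builder.getOrCreate()

    df = spark.createDataFrame(
        [(0.0, 0.0, 3.0), (1.0, 5.0, 0.0), (2.0, 0.0, 0.0)],
        ["a", "b", "c"])

    # VectorAssembler chooses a sparse or dense representation per row based
    # on the values, so the assembled column can be a mix of both.
    assembled = VectorAssembler(inputCols=["a", "b", "c"],
                                outputCol="features").transform(df)

    scaler = StandardScaler(inputCol="features", outputCol="scaled",
                            withMean=True, withStd=True)

    # Prior to this fix, the transform could fail on the sparse rows,
    # because centering a sparse vector makes it dense.
    scaler.fit(assembled).transform(assembled).show()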



