[ https://issues.apache.org/jira/browse/SPARK-10356?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14721502#comment-14721502 ]

Carsten Schnober commented on SPARK-10356:
------------------------------------------

According to 
[Wikipedia|https://en.wikipedia.org/wiki/Norm_%28mathematics%29#p-norm], each 
value's absolute value should be used to compute the norm:

{{||x||_p := (sum(|x|^p))^(1/p)}}

For p = 1, this results in:

{{||x||_1 := sum(|x|)}}

I suppose the issue is thus actually located in the {{norm()}} method.
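As a minimal, Spark-free sketch of the definition above (plain Scala; the values are illustrative): dividing by the L1 norm makes the *absolute* values sum to 1, not the raw values.

{code}
// Plain-Scala sketch, no Spark dependency; example values are illustrative.
val v = Array(1.0, -2.0, 3.0)
val l1Norm = v.map(math.abs).sum            // |1| + |-2| + |3| = 6.0
val normalized = v.map(_ / l1Norm)
val absSum = normalized.map(math.abs).sum   // 1.0
// normalized.sum is only 1.0 when every component of v is non-negative:
// here it is (1.0 - 2.0 + 3.0) / 6.0 = 0.333...
{code}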


> MLlib: Normalization should use absolute values
> -----------------------------------------------
>
>                 Key: SPARK-10356
>                 URL: https://issues.apache.org/jira/browse/SPARK-10356
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib
>    Affects Versions: 1.4.1
>            Reporter: Carsten Schnober
>              Labels: easyfix
>   Original Estimate: 2h
>  Remaining Estimate: 2h
>
> The normalizer does not handle vectors with negative values properly. It can 
> be tested with the following code
> {code}
> val normalized = new Normalizer(1.0).transform(v: Vector)
> normalized.toArray.sum == 1.0
> {code}
> This yields true if all values in Vector v are positive, but false when v 
> contains one or more negative values. This is because the values in v are 
> taken as-is, without applying {{abs()}}.
> This (probably) does not occur for {{p=2.0}} because the values are squared 
> and hence positive anyway.
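The last point of the quoted description can be checked in plain Scala (no Spark dependency; values are illustrative): squaring discards the sign, so p = 2 gives the same norm with or without {{abs()}}, while p = 1 does not.

{code}
// Plain-Scala sketch, no Spark dependency; example values are illustrative.
val v = Array(1.0, -2.0, 3.0)
val l2NoAbs = math.sqrt(v.map(x => x * x).sum)
val l2Abs   = math.sqrt(v.map(x => math.abs(x) * math.abs(x)).sum)
// l2NoAbs == l2Abs, so p = 2 is unaffected by the missing abs()
val l1NoAbs = v.sum                 // 2.0 -- wrong for p = 1
val l1Abs   = v.map(math.abs).sum   // 6.0 -- correct
{code}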


