[ https://issues.apache.org/jira/browse/SPARK-22059?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-22059.
-------------------------------
    Resolution: Invalid

Questions should go to the mailing list.
No, it's a limit on how large arrays can be on the JVM.
At that scale you'll probably struggle to compute an SVD this way.
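
For context: MLlib's SVD goes through an ARPACK-based eigendecomposition whose work buffers are flat JVM arrays, and a JVM array can hold at most Int.MaxValue (2^31 - 1) elements. The sketch below paraphrases the size check behind the quoted exception (the exact internals are an assumption based on the error text, and the names here are illustrative):

    // Paraphrase of the size check behind the quoted exception; an
    // assumption about MLlib's internals, reconstructed from the message.
    object SvdLimitCheck {
      def main(args: Array[String]): Unit = {
        val n = 191077                // columns of the RowMatrix
        val k = 49865                 // singular values requested
        val ncv = math.min(2 * k, n)  // Lanczos basis size used by ARPACK

        // The dense work buffer needs roughly n * ncv entries, and it must
        // fit in a single JVM array (at most Int.MaxValue elements).
        val fits = n.toLong * ncv <= Int.MaxValue &&
          ncv.toLong * (ncv + 8) <= Int.MaxValue

        println(s"n * ncv = ${n.toLong * ncv} vs Int.MaxValue = ${Int.MaxValue}")
        println(s"eigendecomposition feasible: $fits") // false for these sizes
      }
    }

With n = 191077 and k = 49865, the buffer would need roughly 1.9e10 entries, close to nine times the JVM array limit, which is why the requirement fails.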

> SVD computation limit
> ---------------------
>
>                 Key: SPARK-22059
>                 URL: https://issues.apache.org/jira/browse/SPARK-22059
>             Project: Spark
>          Issue Type: Question
>          Components: MLlib
>    Affects Versions: 2.2.0
>            Reporter: Aleksandr Ovcharenko
>
> Hello guys,
> While trying to compute the SVD using the computeSVD() function, I am 
> getting the following warning, followed by an exception:
> 17/09/14 12:29:02 WARN RowMatrix: computing svd with k=49865 and n=191077, please check necessity
> IllegalArgumentException: u'requirement failed: k = 49865 and/or n = 191077 are too large to compute an eigendecomposition'
> When I try to compute the first 3000 singular values, I get several of 
> the following warnings every second:
> 17/09/14 13:43:38 WARN TaskSetManager: Stage 4802 contains a task of very large size (135 KB). The maximum recommended task size is 100 KB.
> The matrix size is 49865 x 191077 and all the singular values are needed.
> Is there a way to lift that limit and compute any number of singular 
> values?
> Thank you.
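
For reference, a call of this shape is what hits the limit. Below is a minimal, self-contained sketch; the 3 x 3 matrix, app name, and k are made up for illustration, so this small case succeeds where the reporter's k = 49865 cannot:

    import org.apache.spark.mllib.linalg.Vectors
    import org.apache.spark.mllib.linalg.distributed.RowMatrix
    import org.apache.spark.sql.SparkSession

    // Minimal computeSVD example on a tiny, made-up matrix.
    object SvdExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("svd-example").master("local[*]").getOrCreate()
        val rows = spark.sparkContext.parallelize(Seq(
          Vectors.dense(1.0, 2.0, 3.0),
          Vectors.dense(4.0, 5.0, 6.0),
          Vectors.dense(7.0, 8.0, 9.0)))
        val mat = new RowMatrix(rows)
        // k must satisfy the array-size constraint shown earlier;
        // requesting k near n = 191077 is what triggers the exception.
        val svd = mat.computeSVD(k = 2, computeU = true)
        println(svd.s) // singular values, in descending order
        spark.stop()
      }
    }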


