Our lab needs to run simulations on online social networks. We need to
handle a 5000*5000 adjacency matrix, namely, to compute its largest
eigenvalue and the corresponding eigenvector. Matlab can be used, but it is
time-consuming. Is Spark effective for linear algebra calculations and
transformations? Later we will need to process a 5000000*5000000 matrix, so
it seems urgent that we find a distributed computation platform.

I see that SVD has been implemented, and I can get the eigenvalues of a
matrix through this API. But when I want to get both eigenvalues and
eigenvectors, or at least the largest eigenvalue and the corresponding
eigenvector, it seems that the current Spark doesn't have such an API. Is
it possible for me to write eigenvalue decomposition from scratch? What
should I do? Thanks a lot!


Miles Yao



