Hello, I am new to Spark. I am looking for a matrix inverse and multiplication
solution. I did a quick search and found a couple of solutions, but my
requirements are:
- large matrices (up to 2 million x 2 million)
- support for the complex double data type
- preferably in Java
 There is one post 
http://databasefaq.com/index.php/answer/145090/matrix-apache-spark-distributed-computing-spark-distributed-matrix-multiply-and-pseudo-inverse-calculating
but it is in Scala, which I am not familiar with. I am able to convert it to
Java, but I am not able to understand this statement:
val uArray = svd.U.rows.collect.toList.map(_.toArray.toList).flatten.toArray

Could someone shed some light on it?
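
My best guess at a Java equivalent is below, assuming svd is the
SingularValueDecomposition<RowMatrix, Matrix> returned by RowMatrix.computeSVD()
in MLlib (the class and method names here are just placeholders). Please correct
me if I have misread the Scala:

import java.util.List;

import org.apache.spark.mllib.linalg.Matrix;
import org.apache.spark.mllib.linalg.SingularValueDecomposition;
import org.apache.spark.mllib.linalg.Vector;
import org.apache.spark.mllib.linalg.distributed.RowMatrix;

public class FlattenU {
    // Collect the distributed rows of U to the driver and flatten them into
    // a single double[] in row-major order, which is what the Scala
    // one-liner appears to do.
    static double[] flattenU(SingularValueDecomposition<RowMatrix, Matrix> svd) {
        List<Vector> rows = svd.U().rows().toJavaRDD().collect();
        int numCols = rows.isEmpty() ? 0 : rows.get(0).size();
        double[] uArray = new double[rows.size() * numCols];
        for (int i = 0; i < rows.size(); i++) {
            // Vector.toArray() gives the dense values of one row of U
            System.arraycopy(rows.get(i).toArray(), 0, uArray, i * numCols, numCols);
        }
        return uArray;
    }
}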
Also, I assume Spark uses JBLAS internally for matrix calculations, so it
should be doable to support complex doubles, right?

Thanks,
canal
