[ 
http://issues.apache.org/jira/browse/MATH-157?page=comments#action_12446155 ] 
            
Tyler Ward commented on MATH-157:
---------------------------------


A few notes. 

You don't have to compute the eigenvectors of both M^T M and M M^T; just do whichever 
one is smaller. You know you can invert that matrix (it's orthogonal, so just 
take the transpose), and you can also invert the S matrix (it's diagonal, so 
invert each value on the diagonal). Invert those two and multiply through the 
original matrix to get the other orthogonal matrix, either V or U, whichever 
you didn't solve for first. 
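
For that last step, here's a minimal sketch in plain Java arrays (the name 
recoverLeftFactor is made up here, nothing from the attached patch; a real 
version would also have to guard against zero singular values):

    // Sketch: once V (the eigenvectors of M^T M) and the singular values are
    // known, the other orthogonal factor is U = M * V * S^-1.
    // m is rows x cols, v is cols x cols, sigma holds the diagonal of S.
    static double[][] recoverLeftFactor(double[][] m, double[][] v, double[] sigma) {
        int rows = m.length;
        int cols = v.length;
        double[][] u = new double[rows][cols];
        for (int i = 0; i < rows; i++) {
            for (int j = 0; j < cols; j++) {
                double sum = 0.0;
                for (int k = 0; k < cols; k++) {
                    sum += m[i][k] * v[k][j];   // (M * V)[i][j]
                }
                u[i][j] = sum / sigma[j];       // the "invert S" step; assumes sigma[j] != 0
            }
        }
        return u;
    }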

Also, eigenvector computations are very expensive. It is a little more 
efficient to compute the eigenvalues at the same time that you compute the 
eigenvectors, as I believe that computation is always an inherent part of 
an eigenvector solution anyway. If not, just take the eigenvector matrix D and 
multiply through the original matrix A, like so: D^T A D. That produces the 
diagonal eigenvalue matrix. 
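
To make that concrete, here is a bare-bones cyclic Jacobi sweep for a symmetric 
matrix, again just a sketch in plain arrays (jacobiEigen is a made-up name, and 
a real implementation needs a relative convergence test and more care with 
over/underflow). The rotations are accumulated into D as they are applied, so 
the eigenvectors and the eigenvalues (the diagonal that D^T A D converges to) 
come out of the same loop:

    // Sketch: cyclic Jacobi iteration for a symmetric matrix a (n x n).
    // On exit the off-diagonal of a has been annihilated, so its diagonal holds
    // the eigenvalues (this is D^T A D), and d holds the accumulated rotations,
    // i.e. the eigenvectors, one per column.
    static void jacobiEigen(double[][] a, double[][] d) {
        int n = a.length;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                d[i][j] = (i == j) ? 1.0 : 0.0;        // start D at the identity
            }
        }
        for (int sweep = 0; sweep < 100; sweep++) {
            double off = 0.0;                          // size of the off-diagonal part
            for (int p = 0; p < n; p++) {
                for (int q = p + 1; q < n; q++) {
                    off += a[p][q] * a[p][q];
                }
            }
            if (off < 1.0e-30) {
                return;                                // good enough for a sketch
            }
            for (int p = 0; p < n; p++) {
                for (int q = p + 1; q < n; q++) {
                    if (a[p][q] == 0.0) {
                        continue;
                    }
                    // rotation angle chosen to zero out a[p][q]
                    double theta = (a[q][q] - a[p][p]) / (2.0 * a[p][q]);
                    double t = (theta >= 0.0 ? 1.0 : -1.0)
                            / (Math.abs(theta) + Math.sqrt(theta * theta + 1.0));
                    double c = 1.0 / Math.sqrt(t * t + 1.0);
                    double s = t * c;
                    for (int k = 0; k < n; k++) {      // rows p and q:  J^T * A
                        double apk = a[p][k], aqk = a[q][k];
                        a[p][k] = c * apk - s * aqk;
                        a[q][k] = s * apk + c * aqk;
                    }
                    for (int k = 0; k < n; k++) {      // columns p and q:  (J^T A) * J
                        double akp = a[k][p], akq = a[k][q];
                        a[k][p] = c * akp - s * akq;
                        a[k][q] = s * akp + c * akq;
                    }
                    for (int k = 0; k < n; k++) {      // accumulate the rotation into D
                        double dkp = d[k][p], dkq = d[k][q];
                        d[k][p] = c * dkp - s * dkq;
                        d[k][q] = s * dkp + c * dkq;
                    }
                }
            }
        }
    }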

It just seems like you're kind of going in the wrong direction. Solve for the 
eigenvectors of M^T M (assuming M has more rows than columns), then use those 
eigenvectors (call that matrix V) to reduce the symmetric matrix M^T M to the 
eigenvalue matrix S^2. Take the square root of each diagonal entry to get S. 
Now use those two matrices: invert them and multiply through M to get U, 
i.e. U = M V S^-1.  
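
Wiring the two sketches above together (still assuming rows >= columns so that 
M^T M is the smaller Gram matrix; otherwise work on the transpose), the whole 
thing is just the following. If you drop all three snippets into one class this 
runs as-is; it's a toy check, and the singular values come out unsorted:

    // Sketch: the whole pipeline on a toy 3 x 2 matrix, reusing the jacobiEigen
    // and recoverLeftFactor sketches above.
    public static void main(String[] args) {
        double[][] m = { { 2.0, 0.0 }, { 0.0, 1.0 }, { 1.0, 1.0 } };
        int rows = m.length, cols = m[0].length;

        double[][] a = new double[cols][cols];         // a = M^T M, symmetric
        for (int i = 0; i < cols; i++) {
            for (int j = 0; j < cols; j++) {
                for (int k = 0; k < rows; k++) {
                    a[i][j] += m[k][i] * m[k][j];
                }
            }
        }

        double[][] v = new double[cols][cols];
        jacobiEigen(a, v);                             // a is reduced to S^2, v = V

        double[] sigma = new double[cols];
        for (int i = 0; i < cols; i++) {
            sigma[i] = Math.sqrt(Math.max(a[i][i], 0.0)); // S = sqrt(S^2)
        }

        double[][] u = recoverLeftFactor(m, v, sigma); // U = M V S^-1

        // sanity check: U * S * V^T should reproduce M
        for (int i = 0; i < rows; i++) {
            for (int j = 0; j < cols; j++) {
                double rebuilt = 0.0;
                for (int k = 0; k < cols; k++) {
                    rebuilt += u[i][k] * sigma[k] * v[j][k];
                }
                System.out.printf("%8.4f  (expected %8.4f)%n", rebuilt, m[i][j]);
            }
        }
    }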



> Add support for SVD.
> --------------------
>
>                 Key: MATH-157
>                 URL: http://issues.apache.org/jira/browse/MATH-157
>             Project: Commons Math
>          Issue Type: New Feature
>            Reporter: Tyler Ward
>         Attachments: svd.tar.gz
>
>
> SVD is probably the most important feature in any linear algebra package, 
> though also one of the more difficult. 
> In general, SVD is needed because very often real systems end up being 
> singular (which can be handled by QR), or nearly singular (which can't). A 
> good example is a nonlinear root finder. Often the Jacobian will be nearly 
> singular, but it is VERY rare for it to be exactly singular. Consequently, LU 
> or QR produces really bad results, because they are dominated by rounding 
> error. What is needed is a way to throw out the insignificant parts of the 
> solution, and take what improvements we can get. That is what SVD provides. 
> The colt SVD algorithm has a serious infinite loop bug, caused primarily by 
> Double.NaN in the inputs, but also by underflow and overflow, which really 
> can't be prevented. 
> If worried about patents and such, SVD can be derived from first principles 
> very easily with the acceptance of two postulates.
> 1) That an SVD always exists.
> 2) That Jacobi reduction works. 
> Both are very basic results from linear algebra, available in nearly any 
> textbook. Once that's accepted, the rest of the algorithm falls into place 
> in a very simple manner. 

-- 
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators: 
http://issues.apache.org/jira/secure/Administrators.jspa
-
For more information on JIRA, see: http://www.atlassian.com/software/jira

        
