Eugene McDonnell wrote:
> Vector 16.1 (July 1997), which is available on line, has my article
> APWJ "Oh, No, Not Eigenvalues Again". It has a quite simple solution
> to get the eigenvalues of a matrix. I mean REALLY simple.
I was very pleased to read Eugene's article, since I had tried a
similar thing and failed. As he notes, the method has excellent
pedagogical value: it works directly from the basic definition,
finding the roots of the characteristic polynomial det(A-t*I), and
the J that does it is nifty.
Even if the determinant calculation were more efficient, this
approach is not recommended for large matrices. The reason is that
the roots of a polynomial can be extremely sensitive to tiny changes
in its coefficients. If you take a set of roots, form the polynomial
that has them, and then find its roots again, you may not get
anything like the ones you started with. Here is an example using
the Wilkinson polynomial.
]r=: |. (>:i.20)
20 19 18 17 16 15 14 13 12 11 10 9 8 7 6 5 4 3 2 1
>{: p. p. 1; r
20 19 18 17 16 15 14 13 12 11 10 9 8 7 6 5 4 3 2 1
]r=: |. (>:i.20)+ ((19#0),1e_9)
20 19 18 17 16 15 14 13 12 11 10 9 8 7 6 5 4 3 2 1
>{: p. p. 1; r
19.9889 19.0852 17.6449j0.413927 17.6449j_0.413927 15.5414j0.812653
15.5414j_0.812653 13.3806j0.667369 13.3806j_0.667369 11.6633 11.1483
9.9816 9.00015 8.00058 6.99988 6.00001 5 4 3 2 1
In the second example, we do not get the roots back: a perturbation
of 1e-9 in a single root has moved several of the computed roots
substantially, pushing some off into the complex plane. This is not
a deficiency in J: it is inherent in the process. For this reason,
computing eigenvalues by finding roots of the characteristic
polynomial is avoided in numerical practice.
LAPACK instead works on the matrix directly: it first reduces the
matrix to upper Hessenberg form (which makes each subsequent
iteration cheap) and then applies the shifted QR algorithm, which
converges to a triangular (Schur) form whose diagonal holds the
eigenvalues.
Best wishes,
John
----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm