A good place to start is Eugene's Vector article from 1997, *Oh, No, Not Eigenvalues Again!* <http://www.jsoftware.com/jwiki/Doc/Articles/Play141>.

The course in college which covered eigenvalues and eigenvectors was not one of my better courses. I do remember the prof explaining the importance of eigenvectors as follows: an eigenvector v is one such that (A +/ .* v) = lambda * v . This means that the action of A on v is just a stretching of v, a particularly simple transformation of v among possible weird and wonderful transformations, like using kilometers per hour instead of miles per hour.
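For example (a small made-up illustration, not from the article): 1 1 is an eigenvector of the matrix 2 1 ,: 1 2 with eigenvalue 3, so applying the matrix just scales it:

   (2 1 ,: 1 2) +/ .* 1 1
3 3
   3 * 1 1
3 3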
If you are serious about computing eigenvalues and eigenvectors, I suggest you use LAPACK. Such computation is tricky, involving deep and detailed knowledge of numerical analysis.

On Fri, Feb 3, 2012 at 3:03 PM, Don Guinn <dongu...@gmail.com> wrote:
>
> This is a long one that may be turned into a J essay if I ever figure it
> all out. I will not feel bad if you skip reading it.
>
> A couple of weeks ago I dug up a topic I studied only enough to get
> through a test in college: Eigenvalues. First, how is it spelled? Both
> Eigenvalue and Eiganvalue work on Google. Second, lots of pages described
> how to find Eigenvalues, but nobody tackled Eigenvectors. And then, what
> are Eigenvectors good for? Lots of pages talk about applications, but I
> thought it better to understand Eigenvalues and Eigenvectors before I
> tackled applications.
>
> I am not through figuring out Eigenvalues yet, but I am tossing out what
> I have so far, as Linda is looking for topics for the classroom and a
> couple of things here might interest her, if she has not already done
> them. Also, I would appreciate some suggestions as to where to go next.
>
> I searched the web on Eigenvalues and looked at the J essay "Eigenvalues"
> by John Randall, and I could use them to calculate Eigenvalues quite
> nicely. But nothing about finding Eigenvectors anywhere. Rather than
> getting involved in the issues of how best to calculate Eigenvalues, I
> wanted to start from scratch. I wanted to use the equation (Ax-λIx)=0 for
> finding Eigenvalues as nearly as I could in J. There are references in
> J's documentation to polynomial arithmetic. Given that, and the ability
> of J adverbs and conjunctions to take defined verbs as arguments, can
> this equation be translated into J?
>
> I chose to represent polynomials similarly to their description in the
> verb "p.", only with the coefficients boxed. For example:
>
>    (x^2 + 3x + 2)  <==>  <2 3 1
>
> I defined a few tools for doing polynomial arithmetic:
>
> Ptimes=:+//.@:(*/)&.>
> Pminus=:(-/@:,:)&.>
> Pdet=:Pminus/ . Ptimes
> Pconstant=:(<@,"0`]@.(*@L.))
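As a quick check of these verbs (my own example, interjected into the quote): multiplying 1+x by 2+x in the boxed-coefficient form gives 2+3x+x^2, and Pminus also copes with polynomials of different degrees, since laminate pads the shorter coefficient list with zeros:

   (<1 1) Ptimes <2 1      NB. (1+x)*(2+x)
+-----+
|2 3 1|
+-----+
   (<2 3 1) Pminus <0 1    NB. (2+3x+x^2) - x
+-----+
|2 2 1|
+-----+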
> Linda - Every math senior has had to do polynomial arithmetic, and if
> they are anything like I was, they found it easy to understand but
> tedious and boring to do on paper. J is unusual among programming
> languages in being able to do polynomial arithmetic easily. Also, after
> having to calculate the determinants of 4-by-4 matrices using pencil and
> paper, had my teacher been able to show me a machine that would easily
> calculate determinants and do polynomial arithmetic, he would surely have
> had my attention.
>
> (I am sending this in as Courier New and hopefully it will display OK.)
>
> Books give three steps to determine the Eigenvalues of a matrix.
>
> 1. Build a polynomial matrix from the matrix, with λ subtracted from the
> main diagonal. Given matrix A:
>
>    ]A=:Pconstant 1 2 1,6 _1 0,:_1 _2 _1
> +--+--+--+
> |1 |2 |1 |
> +--+--+--+
> |6 |_1|0 |
> +--+--+--+
> |_1|_2|_1|
> +--+--+--+
>    Lamda=:<0 1      NB. Lamda is the polynomial "0 + x"
>    ]AL=:A Pminus (Lamda Ptimes =i.3)
> +----+-----+-----+
> |1 _1|2 0  |1 0  |
> +----+-----+-----+
> |6 0 |_1 _1|0 0  |
> +----+-----+-----+
> |_1 0|_2 0 |_1 _1|
> +----+-----+-----+
>
> 2. Calculate the Characteristic Polynomial.
>
>    ]CP=:Pdet AL
> +----------+
> |0 12 _1 _1|
> +----------+
>
> 3. Find its factors. (Nice that J has a tool to factor polynomials.)
>
>    ;}.p.>CP
> _4 3 0
>
> This is just how my textbook described calculating Eigenvalues, only here
> it's done in J. I'm sure there are faster and more accurate methods for
> finding Eigenvalues; however, it was fun to replace the traditional
> mathematical notation used in the book so directly with J, and still have
> it read like the book.
>
> My next challenge was finding the Eigenvectors. Not too hard to do with
> pencil and paper, but getting J to do it is another story. One can easily
> build a matrix to find an Eigenvector from each Eigenvalue by
> substituting that Eigenvalue for λ in A-Iλ; however, a given Eigenvalue
> may have more than one Eigenvector associated with it. For a 3-by-3
> matrix it is essentially 3 equations in 3 unknowns, with no constant term
> and all equal to zero. And all unknowns equal to zero is not an
> acceptable solution. The matrix is singular, since λ is a root, so the
> rows are not independent. A row can be eliminated; then one can
> arbitrarily pick a value for one of the unknowns and solve for the other
> two.
>
> This is where I am not sure what to do. Picking n-1 adjacent rows works
> well when all Eigenvalues are unique, but when an Eigenvalue is repeated
> I am not sure that adjacent rows are sufficient for finding solutions. I
> am not sure about the code below for finding Eigenvectors when multiple
> orthogonal Eigenvectors belong to the same Eigenvalue, but it hasn't
> failed yet.
>
> The expression (M -: EV dot (E*=i.#E) dot %. EV) is supposed to hold, where
>   M  is the matrix,
>   E  is the list of Eigenvalues, and
>   EV is the matrix of corresponding Eigenvectors in columns.
>
> This test is performed by Eigen_test to verify that the calculation is
> correct.
>
> NB. Start of script __________________________
>
> NB. Polynomial arithmetic operations:
> Ptimes=:+//.@:(*/)&.>
> Pminus=:(-/@:,:)&.>
> Pdet=:Pminus/ . Ptimes
> Pconstant=:(<@,"0`]@.(*@L.))
>
> NB. Eigen names
> Lamda=:<0 1
> Characteristic_Polynomial=:13 : 'Pdet y Pminus Lamda Ptimes =i.#y'
> Eigenvalues=:13 : ';}.p.>Characteristic_Polynomial y'
>
> Eigenvectors=:4 : 0
>   inv=.%. :: 0:   NB. %. but returning 0 instead of signalling an error
>   for_i. ((#y)-x)<;._3 y do.
>     'v c'=.(-x)(([:|:}.);{.)|:;i
>     if. $im=.inv v do.
>       <(-=i.x),~|:c+/ .*|:im
>       return.
>     end.
>   end.
>   ('No Eigenvector found for Eigenvalue ',":x) (13!:8) 3
> )
>
> Eigen_test=:4 : 0
>   dot=.+/ .*
>   'e v'=.y
>   if. -.x-:v dot (e*=i.#e) dot %. v do.
>     ('Eigenvalues and Eigenvectors invalid',,LF,.":y) (13!:8) 3
>   end.
>   y
> )
>
> Eigens=:3 : 0
>   py=.Pconstant y   NB. Convert y to polynomial form if necessary.
>   'vals counts'=.(](];[:+/=/)~.)Eigenvalues py   NB. Unique Eigenvalues and their multiplicities.
>   m=.y-"2 vals*"0 2=i.#y                         NB. A-Iλ for each unique Eigenvalue.
>   vects=.,.&.>/counts Eigenvectors"0 2 m         NB. Eigenvectors for each, stitched together as columns.
>   y Eigen_test r=.(counts#vals);vects
> )
>
> NB. End of Script _____________________________
>
> The verb Eigens puts the calculations together; the Eigenvalues and
> Eigenvectors are returned as two boxed items.
>
>    ]A=:1 2 1,6 _1 0,:_1 _2 _1
>  1  2  1
>  6 _1  0
> _1 _2 _1
>    Eigens A
> +------+-----------+
> |_4 3 0| 1   1 1r13|
> |      |_2 3r2 6r13|
> |      |_1  _1   _1|
> +------+-----------+
>
> This is the same matrix covered earlier. Where did the rationals come
> from? I never used rationals.
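It does not answer the rationals question, but here is a quick sanity check of that result (again my own lines, interjected): A acting on the Eigenvector columns matches the columns scaled by their Eigenvalues, and the Eigen_test identity holds for A.

   dot=.+/ .*
   'e v'=.Eigens A
   (A dot v) -: v dot e*=i.#e      NB. A*V matches V*diag(e)
1
   A -: v dot (e*=i.#e) dot %. v   NB. the Eigen_test expression
1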
>
>    ]B=:3 2 4,2 0 2,:4 2 3
> 3 2 4
> 2 0 2
> 4 2 3
>    Eigens B
> +-------+-----------+
> |8 _1 _1|  _1 1r2  1|
> |       |_1r2  _1  0|
> |       |  _1   0 _1|
> +-------+-----------+
>
> B has the Eigenvalue _1 occurring twice. Therefore it has two
> Eigenvectors associated with it. It is extremely difficult to compare
> these Eigenvectors with those calculated in other ways, as linear
> combinations of them can look quite different.
>
>    Eigens 7#,:>:i.7
> +--------------+--------------------+
> |28 0 0 0 0 0 0|_1  2  3  4  5  6  7|
> |              |_1 _1  0  0  0  0  0|
> |              |_1  0 _1  0  0  0  0|
> |              |_1  0  0 _1  0  0  0|
> |              |_1  0  0  0 _1  0  0|
> |              |_1  0  0  0  0 _1  0|
> |              |_1  0  0  0  0  0 _1|
> +--------------+--------------------+
>
> Interesting.
>
> A matrix can be raised to a power, given its Eigenvalues and
> Eigenvectors, by applying the same formula that Eigen_test uses to verify
> the solution. Back to A again.
>
>    ]'e v'=.Eigens A
> +------+-----------+
> |_4 3 0| 1   1 1r13|
> |      |_2 3r2 6r13|
> |      |_1  _1   _1|
> +------+-----------+
>    dot=:+/ .*
>    v dot ((*:e)*=i.3) dot %. v
>  12 _2 0
>   0 13 6
> _12  2 0
>    A dot A
>  12 _2 0
>   0 13 6
> _12  2 0
>
> Raise e, the Eigenvalues, to other powers and this still works.
>
> For the fun of it I tried the square root.
>
>    ]rA=:v dot ((%:e)*=i.3) dot %. v
>   1.31966j0.642857 0.494872j_0.571429 0.329914j_0.214286
>   1.97949j_1.28571   0.742307j1.14286  0.494872j0.428571
> _1.31966j_0.642857 _0.494872j0.571429 _0.329914j0.214286
>    rA dot rA
> 1j_5.55112e_17   2j5.55112e_17            1j2.77556e_17
>              6              _1 _1.66533e_16j2.77556e_17
> _1j5.55112e_17 _2j_5.55112e_17          _1j_2.77556e_17
>    A-rA dot rA
>   8.88178e_16j5.55112e_17 4.44089e_16j_5.55112e_17 2.22045e_16j_2.77556e_17
>                         0              2.22045e_16 1.66533e_16j_2.77556e_17
> _8.88178e_16j_5.55112e_17 _4.44089e_16j5.55112e_17 _2.22045e_16j2.77556e_17
>
> OK, I got some round-off errors, but it's pretty close. I don't know if
> this is supposed to work or not.
>
> I am not at all comfortable with the verb Eigenvectors, but it seems to
> have worked OK so far.
>
> I would appreciate comments and suggestions on where to go to better
> understand the applications of Eigenvalues and Eigenvectors.

----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm