Re: [math] eigenvector doubts and issues

2013-11-13 Thread andrea antonello
Hi Thomas,

 are you sure that your eigenvectors are drawn correctly?

I assume yes. I used the eigenvectors with the highest and lowest
eigenvalues, since in the simpler 3D case that gave me a plane
cutting exactly through the dataset the way I wanted. But you are most
probably right that the plane should be spanned by the two eigenvectors
with the lowest eigenvalues; I guess I was using the wrong one for this purpose.

 I tried to reproduce your results with GeoGebra, and in my case the
 plane defined by the two eigenvectors with the smallest eigenvalues seems
 to split the values quite well; see this image:

 http://people.apache.org/~tn/pca.png

Actually it qualitatively splits the dataset more or less the same as
the plane from the first and last eigenvectors: only 2 points end up on
the wrong side of the plane.

I was hoping to get more precise results for such a simple dataset.

In fact, if I make the dataset more regular (i.e. equal intervals for
x and y), things work as needed:

double[] x = {1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3, 4, 4, 4, 4};
double[] y = {0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2, 3};
double[] z = {1, 1, 1, 2, 1, 1, 2, 2, 1, 1, 2, 2, 1, 2, 2, 2};


In that case the plane spanned by the two eigenvectors with the lowest
eigenvalues splits the dataset into the higher and the lower points.

http://en.zimagez.com/zimage/eigenvectors3dcase3.php
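
For reference, one way to check programmatically which side of that plane
each point falls on (a minimal sketch using the x, y, z arrays above, not my
actual code, assuming commons-math3's Covariance and EigenDecomposition):

import org.apache.commons.math3.linear.ArrayRealVector;
import org.apache.commons.math3.linear.EigenDecomposition;
import org.apache.commons.math3.linear.RealMatrix;
import org.apache.commons.math3.linear.RealVector;
import org.apache.commons.math3.stat.correlation.Covariance;

// covariance matrix of the x, y, z columns (one row per point)
double[][] data = new double[x.length][3];
double[] mean = new double[3];
for (int i = 0; i < x.length; i++) {
    data[i] = new double[] {x[i], y[i], z[i]};
    mean[0] += x[i] / x.length;
    mean[1] += y[i] / x.length;
    mean[2] += z[i] / x.length;
}
RealMatrix cov = new Covariance(data).getCovarianceMatrix();
EigenDecomposition ed = new EigenDecomposition(cov);

// CM sorts the eigenvalues in descending order for a symmetric matrix, so
// eigenvector 0 (largest eigenvalue) is the normal of the plane spanned
// by the two eigenvectors with the smallest eigenvalues.
RealVector normal = ed.getEigenvector(0);
for (int i = 0; i < x.length; i++) {
    RealVector centered = new ArrayRealVector(
            new double[] {x[i] - mean[0], y[i] - mean[1], z[i] - mean[2]});
    double s = normal.dotProduct(centered);
    // the sign of the projection onto the normal gives the side of the plane
    // (which side is "above" depends on the arbitrary sign of the eigenvector)
    System.out.println("(" + x[i] + ", " + y[i] + ", " + z[i] + ") -> "
            + (s >= 0 ? "side A" : "side B"));
}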

I will do some more tests with complex data.

Thanks,
Andrea


 The results from the computations:

 covariance:
   {{0.7272727273,0.0,0.1818181818},
   {0.0,0.3409090909,0.2272727273},
   {0.1818181818,0.2272727273,0.2727272727}}

 lambda1 =  0.8056498828134406
 v1 =  {0.9015723558; 0.1900593782; 0.3886447222}

 lambda2 = 0.4874287594020183
 v2 = {-0.3799516758; 0.7774478203; 0.5012101464}


 lambda3 = 0.04783044869363171
 v3 = {-0.2068913033; -0.5995434259; 0.7731388421}

 The plane is constructed with v2 and v3.

 Thomas




Re: [math] eigenvector doubts and issues

2013-11-12 Thread andrea antonello
Hi Thomas,
you are definitely right. I was fooled by the nice-looking result of the
JAMA calculation and did not notice that it was there that I was
extracting the values in a strange order.

I have now run some 3x3 tests against known results, also checking the
A * v = lambda * v constraint, and everything is fine.

I still have to figure out how to properly use the results to split the
higher-elevation parts from the lower-z parts of an x,y,z dataset, but the
eigenvectors and eigenvalues are in the right place now.

Thanks,
Andrea







Re: [math] eigenvector doubts and issues

2013-11-12 Thread Thomas Neidhart
Hi Andrea,

good to hear.
Sounds like an interesting problem you are working on.
If you have found a good solution, feel free to come back to us; we might
want to include in our user guide an example of how an EigenDecomposition
can be used.

Thanks,

Thomas







Re: [math] eigenvector doubts and issues

2013-11-12 Thread andrea antonello
Hi Thomas,

 good to hear.
 Sounds like an interesting problem you are working on.
 If you have found a good solution, feel free to come back to us; we might
 want to include in our user guide an example of how an EigenDecomposition
 can be used.

thanks for pointing this out, I was a bit afraid of going off topic.
But I would be happy to share a small real-world eigenvector example
if there is interest. I will try to put something together on an
accessible wiki page, but I might need some help to finish it for
proper use in the documentation :)

I'll keep you posted.
Thanks,
Andrea





Re: [math] eigenvector doubts and issues

2013-11-12 Thread Thomas Neidhart
Hi Andrea,

that's exactly what we are looking for, so do not hesitate to ask questions.

Thomas






Re: [math] eigenvector doubts and issues

2013-11-12 Thread andrea antonello
Hi Thomas,

 that's exactly what we are looking for, so do not hesitate to ask questions.

ok, I have put together a small set of examples, which for now live on the
wiki of the project I am doing this for (since I have access to that
one): http://code.google.com/p/jgrasstools/wiki/Eigenvectors

Since in the 3D case I do not get the results I was hoping for, it
would be great if some expert could give me a hint.

Let me know what you think.

Andrea







Re: [math] eigenvector doubts and issues

2013-11-12 Thread Thomas Neidhart
Hi Andrea,

are you sure that your eigenvectors are drawn correctly?

I tried to reproduce your results with GeoGebra, and in my case the
plane defined by the two eigenvectors with the smallest eigenvalues seems
to split the values quite well; see this image:

http://people.apache.org/~tn/pca.png

The results from the computations:

covariance:
  {{0.7272727273,0.0,0.1818181818},
  {0.0,0.3409090909,0.2272727273},
  {0.1818181818,0.2272727273,0.2727272727}}

lambda1 =  0.8056498828134406
v1 =  {0.9015723558; 0.1900593782; 0.3886447222}

lambda2 = 0.4874287594020183
v2 = {-0.3799516758; 0.7774478203; 0.5012101464}


lambda3 = 0.04783044869363171
v3 = {-0.2068913033; -0.5995434259; 0.7731388421}

The plane is constructed with v2 and v3.
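
In case it helps: the normal of that plane is v1, since the eigenvectors of a
symmetric matrix are mutually orthogonal. A minimal check of this, assuming
commons-math3's Vector3D (not part of the original computation):

import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;

Vector3D v2 = new Vector3D(-0.3799516758, 0.7774478203, 0.5012101464);
Vector3D v3 = new Vector3D(-0.2068913033, -0.5995434259, 0.7731388421);
// the cross product of the two spanning vectors is the plane normal;
// this prints approximately {0.9015723558; 0.1900593782; 0.3886447222}, i.e. v1
System.out.println(Vector3D.crossProduct(v2, v3));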

Thomas




Re: [math] eigenvector doubts and issues

2013-11-11 Thread andrea antonello
Hi Thomas,
thanks for your reply.

 the results of CM and JAMA are identical; the difference is just in the
 way the data is stored.

 Afaik in JAMA calling getV() returns the vectors in row format, whereas in
 CM they are stored in column format.

 If you transpose the matrix (or call getVT()) you will see that the
 vectors are identical, except for the signs and the order. The reason for
 this is that for CM the eigenvalues/eigenvectors are sorted in
 descending order in the case of a symmetric matrix, which is the case for
 your matrix.

the problem is that the result is not just transposed. In fact, if I
use getVT(), the results are still not the same.
The result seems to be reflected across the secondary diagonal, not the
primary diagonal.

Furthermore, if I use the API, I do not expect there to be any row/column
issues, so if I use:

double eigenValue = eigenDecomposition.getRealEigenvalue(i);
RealVector eigenVector = eigenDecomposition.getEigenvector(i);

I expect the eigenvector and eigenvalue to be the right ones for the
given index, no matter how the results are laid out in the matrices.

But the results I get are, for the same eigenvalue:
CM: eigenVal: 0.8056498828134406, eigenVect: [0.9015723557614027,
0.19005937823202243, 0.38864472217295326]
JAMA: eigenVal: 0.8056498828134406, eigenVect: [-0.7731388420716028,
0.5012101463530931, -0.38864472217295326]

I am still quite puzzled about what I am missing.
Thanks,
Andrea








Re: [math] eigenvector doubts and issues

2013-11-11 Thread Thomas Neidhart

I do not know how you get the eigenvector from JAMA, as there is no
getEigenVector method.

The class javadoc of their EigenvalueDecomposition states:

columns of V represent the eigenvectors in the sense that A*V = V*D,
i.e. A.times(V) equals V.times(D).  The matrix V may be badly
conditioned, or even singular, so the validity of the equation
A = V*D*inverse(V) depends upon V.cond().

Looking at your example it looks like you took the rows of V to extract
your eigenvector.

You can also easily verify if your eigenvector is correct:

 A * v = lambda * v
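
For example, with Commons Math and your covariance matrix, this check could
look roughly like this (a minimal sketch, assuming commons-math3):

import org.apache.commons.math3.linear.Array2DRowRealMatrix;
import org.apache.commons.math3.linear.EigenDecomposition;
import org.apache.commons.math3.linear.RealMatrix;
import org.apache.commons.math3.linear.RealVector;

RealMatrix a = new Array2DRowRealMatrix(new double[][] {
        {0.7272727272727273, 0.0, 0.18181818181818182},
        {0.0, 0.3409090909090909, 0.22727272727272727},
        {0.18181818181818182, 0.22727272727272727, 0.2727272727272727}});
EigenDecomposition ed = new EigenDecomposition(a);
for (int i = 0; i < 3; i++) {
    double lambda = ed.getRealEigenvalue(i);
    RealVector v = ed.getEigenvector(i);
    // for a correct eigenpair the residual ||A*v - lambda*v|| is ~0
    System.out.println(a.operate(v).subtract(v.mapMultiply(lambda)).getNorm());
}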

Thomas




[math] eigenvector doubts and issues

2013-11-09 Thread andrea antonello
Dear all,
I have a doubt about using the eigenvector part of the library.

I created a small dataset to represent 3D coordinates in a Cartesian plane:

double[] x = {1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3};
double[] y = {0.5, 1, 1.5, 2, 0.5, 1, 1.5, 2, 0.5, 1, 1.5, 2};
double[] z = {1, 1, 1, 2, 1, 1, 2, 2, 1, 2, 2, 2};

The dataset represents a step from z value 1 to 2 on a regular grid
(with a diagonal trend).

I would expect the eigenvector with the lowest eigenvalue to give me a
line that splits this particular set quite cleanly into the higher-z
points and the lower ones.

So I calculate the covariance matrix which results in:
0.7272727272727273 0.0 0.18181818181818182
0.0 0.3409090909090909 0.22727272727272727
0.18181818181818182 0.22727272727272727 0.2727272727272727
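
Roughly, this is how I obtain it, with one row per point and one column per
coordinate (a minimal sketch using the x, y, z arrays above, not my exact
code, assuming commons-math3's bias-corrected Covariance):

import org.apache.commons.math3.linear.RealMatrix;
import org.apache.commons.math3.stat.correlation.Covariance;

// assemble the points as rows of a data matrix
double[][] data = new double[x.length][3];
for (int i = 0; i < x.length; i++) {
    data[i] = new double[] {x[i], y[i], z[i]};
}
RealMatrix cov = new Covariance(data).getCovarianceMatrix();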

and then I simply calculate the eigenvalues/eigenvectors, which result in:

eigenVal: 0.8056498828134406, eigenVect: [0.9015723557614027,
0.19005937823202243, 0.38864472217295326]
eigenVal: 0.4874287594020183, eigenVect: [-0.37995167578226796,
0.7774478202831089, 0.5012101463530935]
eigenVal: 0.04783044869363171, eigenVect: [-0.20689130333844696,
-0.5995434258526233, 0.773138842071603]

doing exactly the same thing with Jama results in:

eigenVal: 0.8056498828134406, eigenVect: [-0.7731388420716028,
0.5012101463530931, -0.38864472217295326]
eigenVal: 0.48742875940201863, eigenVect: [0.5995434258526229,
0.7774478202831089, -0.1900593782320223]
eigenVal: 0.0478304486936319, eigenVect: [0.20689130333844694,
-0.37995167578226785, -0.9015723557614027]

In fact, if I use JAMA's eigenvector with the lowest eigenvalue, I can
construct a line of slope y =
(0.206891303338447 / -0.379951675782268) * x, which splits my dataset the
way I would like to have it.
The same doesn't apply to the result of the Apache Commons Math lib,
which seems to be reflected across the secondary diagonal.

Since I am no expert in this field, I might be doing something really
wrong. If someone could give me a hint, it would be greatly
appreciated.

Best regards,
Andrea




Re: [math] eigenvector doubts and issues

2013-11-09 Thread Thomas Neidhart
Hi Andrea,

the results of CM and JAMA are identical; the difference is just in the
way the data is stored.

Afaik in JAMA calling getV() returns the vectors in row format, whereas in
CM they are stored in column format.

If you transpose the matrix (or call getVT()) you will see that the
vectors are identical, except for the signs and the order. The reason for
this is that for CM the eigenvalues/eigenvectors are sorted in
descending order in the case of a symmetric matrix, which is the case for
your matrix.
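
To make the CM conventions concrete, a minimal sketch (assuming
commons-math3; the matrix is the covariance matrix from your mail):

import org.apache.commons.math3.linear.Array2DRowRealMatrix;
import org.apache.commons.math3.linear.EigenDecomposition;
import org.apache.commons.math3.linear.RealMatrix;

RealMatrix cov = new Array2DRowRealMatrix(new double[][] {
        {0.7272727272727273, 0.0, 0.18181818181818182},
        {0.0, 0.3409090909090909, 0.22727272727272727},
        {0.18181818181818182, 0.22727272727272727, 0.2727272727272727}});
EigenDecomposition ed = new EigenDecomposition(cov);
for (int i = 0; i < 3; i++) {
    // getEigenvector(i) corresponds to the i-th column of getV()
    // (equivalently, the i-th row of getVT()); for this symmetric matrix
    // the eigenvalues come out sorted in descending order.
    System.out.println(ed.getRealEigenvalue(i) + "  "
            + ed.getEigenvector(i) + "  "
            + ed.getV().getColumnVector(i));
}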

Thomas
