C R wrote:
> Hi
> 
> I have a problem where I am finding that I cannot invert covariance matrices
> because they are ill-conditioned. I have been told that I can 'condition'
> these matrices by adding to their diagonals. My questions are:
> i) what should I add (do I add the same to each element of the diagonal, or
> different quantities and in such a case how do I decide)?
> ii) when do I decide to perform such conditioning (e.g. is there a common
> rule of thumb regarding the condition number etc.)?

Why would you want to invert a covariance matrix? Are you trying to 
perform regression? Are you trying to perform some other analysis?

If the covariance matrix is ill-conditioned and you want to do 
regression, then ridge regression -- which basically adds a constant 
lambda to the diagonal; to be specific, (X'X + lambda*I)**(-1) is used 
in place of (X'X)**(-1) -- is one way to deal with this situation. The 
value of lambda is often found by a search process to see what works 
well and gives desirable regression properties. But this method won't 
give you the inverse of the covariance matrix needed for other analyses.
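To make the idea concrete, here is a minimal NumPy sketch of that diagonal conditioning. The data, the near-collinear column, and the lambda value are all made up for illustration; in practice lambda would come from the kind of search described above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a deliberately ill-conditioned X'X: 50 observations of
# 5 variables, where the last column is nearly a copy of the fourth.
X = rng.standard_normal((50, 5))
X[:, 4] = X[:, 3] + 1e-8 * rng.standard_normal(50)
S = X.T @ X  # X'X, nearly singular

lam = 1e-3  # illustrative only; normally chosen by a search over a grid
S_ridge = S + lam * np.eye(S.shape[0])  # X'X + lambda*I

print("cond(X'X)           =", np.linalg.cond(S))        # enormous
print("cond(X'X + lam*I)   =", np.linalg.cond(S_ridge))  # far smaller
inv_ridge = np.linalg.inv(S_ridge)  # now safe to invert
```

Note that the same constant is added to every diagonal element; adding lambda*I shifts every eigenvalue up by lambda, which bounds the condition number without changing the eigenvectors.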

You can also use the generalized inverse in such situations; this is 
equivalent to finding the inverse of a covariance matrix that has been 
reconstructed from only the eigenvectors whose eigenvalues are greater 
than zero. Often this works well.
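That eigenvector construction can be sketched directly in NumPy. The rank-deficient covariance matrix below is fabricated for the example, and the tolerance for "greater than zero" is an assumption (exact zeros rarely survive floating point); the hand-built result matches NumPy's built-in Moore-Penrose pseudoinverse.

```python
import numpy as np

rng = np.random.default_rng(1)

# A rank-deficient "covariance" matrix: 5 variables, rank only 3,
# so the ordinary inverse does not exist.
A = rng.standard_normal((5, 3))
S = A @ A.T  # symmetric positive semidefinite, rank 3

# Generalized inverse via the eigendecomposition: keep only the
# eigenpairs whose eigenvalue exceeds a small tolerance.
w, V = np.linalg.eigh(S)
tol = 1e-10 * w.max()       # assumed cutoff for "effectively zero"
keep = w > tol
S_pinv = (V[:, keep] / w[keep]) @ V[:, keep].T

# NumPy's Moore-Penrose pseudoinverse gives the same matrix.
assert np.allclose(S_pinv, np.linalg.pinv(S))
```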

But the answer depends on why you need to invert this matrix.

-- 
Paige Miller
[EMAIL PROTECTED]
http://www.kodak.com

"It's nothing until I call it!" -- Bill Klem, NL Umpire
"When you get the choice to sit it out or dance, I hope you dance" -- 
Lee Ann Womack

=================================================================
Instructions for joining and leaving this list, remarks about the
problem of INAPPROPRIATE MESSAGES, and archives are available at:
.                  http://jse.stat.ncsu.edu/                    .
=================================================================
