Hello, I have the following questions about lm.ridge. I take the
Longley example.
First: I think the coefficients from lm(Employed ~ ., data = longley) should
equal the coefficients from lm.ridge(Employed ~ ., data = longley, lambda = 0).
Why does this not happen?
Second: if I have for example
Ridge<-lm.ridge(Em
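On the first point, a quick check on the Longley data (a sketch based on my reading of ?lm.ridge: the $coef component of the fitted object is on the internal standardized scale, while the coef() method back-transforms to the original scale, so at lambda = 0 I would expect coef() to agree with lm up to rounding):

```r
library(MASS)   # lm.ridge

fit.lm    <- lm(Employed ~ ., data = longley)
fit.ridge <- lm.ridge(Employed ~ ., data = longley, lambda = 0)

# fit.ridge$coef is on the standardized scale and will NOT match lm();
# coef() back-transforms and includes the intercept, so it should match
cbind(lm = coef(fit.lm), ridge = drop(coef(fit.ridge)))
```

If you were comparing against the $coef component (or the printed object), that would explain the apparent discrepancy.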
hello all
some help required once again!
Does anyone recall the equations for the following ridge constants?
1. Hoerl and Kennard (1970)
2. Hoerl, Kennard and Baldwin (1975)
3. Lawless and Wang
Could you also specify whether or not one has to transform the X and Y
variables. If so, how and in w
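For what it's worth, the forms I have usually seen quoted (on standardized X and Y, with betahat the OLS coefficients and s^2 the residual variance) are: Hoerl and Kennard (1970) take k = s^2 / max(alphahat_j^2) with alphahat the coefficients in the principal-component rotation; Hoerl, Kennard and Baldwin (1975) take k = p*s^2 / (betahat'betahat); Lawless and Wang take k = p*s^2 / (betahat'X'Xbetahat). The exact constants (p versus p-2, etc.) vary between references, so treat the sketch below as illustrative; lm.ridge reports its own versions in the kHKB and kLW components of the fit.

```r
# sketch: closed-form ridge constants computed on standardized data;
# exact constants differ between references
X <- scale(as.matrix(longley[, names(longley) != "Employed"]))
y <- drop(scale(longley$Employed))
n <- nrow(X); p <- ncol(X)

fit <- lm.fit(X, y)                       # OLS on the standardized scale
b   <- fit$coefficients
s2  <- sum(fit$residuals^2) / (n - p - 1) # one df lost to centering

k.hkb <- p * s2 / sum(b^2)                # Hoerl, Kennard & Baldwin (1975)
k.lw  <- p * s2 / sum((X %*% b)^2)        # Lawless & Wang
```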
hi Andy and other r users
i never gave the full picture.
beta(j) = std(y) * betaridge(j) / std(x(j)) for j = 1, 2, ..., p
but beta(0) = ybar - sum( j = 1 to p, beta(j) * xbar(j) )
note that ybar and the xbars are estimated parameters.
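A quick numerical check of that back-transformation (a sketch using plain OLS on standardized data, so the result can be compared against lm; scale() uses the n-1 divisor, matching std above):

```r
y <- longley$Employed
X <- as.matrix(longley[, names(longley) != "Employed"])

# coefficients on the standardized scale (no intercept: both sides centered)
b.std <- unname(coef(lm(drop(scale(y)) ~ scale(X) - 1)))

b  <- sd(y) * b.std / apply(X, 2, sd)  # beta(j) = std(y)*betaridge(j)/std(x(j))
b0 <- mean(y) - sum(b * colMeans(X))   # beta(0) = ybar - sum beta(j)*xbar(j)

cbind(c(b0, b), coef(lm(Employed ~ ., data = longley)))  # columns should match
```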
we can split the covariance matrix into three sections namely:
1.
If I'm not mistaken, you only need to know that if V is the covariance
matrix of a random vector X, then the covariance of the linear
transformation AX + b is AVA'. Substitute betahat for X, and figure out
what A is and you're set. (b is 0 in your case.)
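In R that identity is one line of matrix algebra; here is a small sketch with made-up A, V and b (all illustrative), plus an empirical check by simulation:

```r
library(MASS)                        # mvrnorm, for the simulation check
set.seed(1)

V <- crossprod(matrix(rnorm(9), 3))  # an arbitrary 3x3 covariance matrix
A <- matrix(rnorm(6), 2, 3)          # a 2x3 linear map
b <- c(1, -1)                        # shifts the mean, not the covariance

V.exact <- A %*% V %*% t(A)          # cov(AX + b) = A V A'

# simulation check: transform draws of X and compare sample covariance
X.sim <- mvrnorm(2e5, mu = rep(0, 3), Sigma = V)
V.emp <- cov(t(A %*% t(X.sim) + b))
```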
Andy
> From: Clark Allan
>
> hi all
>
hi all
a technical question for those bright statisticians.
my question involves ridge regression.
definition:
n = sample size of the data set
X is the data matrix, with, say, p variables
Y is the response vector, i.e. the y values
Z(i,j) = ( X(i,j) - xbar(j) ) / [ (n-1)^0.5 * std(x(j)) ]
Y_new(i
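That standardization can be written in one line of R (a sketch: dividing the usual scale() output, which has sd-1 columns, by sqrt(n-1) gives each column of Z mean 0 and unit length, the form ridge derivations usually assume):

```r
X <- as.matrix(longley[, names(longley) != "Employed"])
n <- nrow(X)

# Z(i,j) = (X(i,j) - xbar(j)) / (sqrt(n-1) * std(x(j)))
Z <- scale(X) / sqrt(n - 1)

colMeans(Z)   # approximately 0
colSums(Z^2)  # all 1: columns have unit length
```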
Hi Frank,
> From: Frank E Harrell Jr [mailto:[EMAIL PROTECTED]
[snip]
> The anova method for ols fits 'works' when you penalize the
> model but there is some controversy over whether we should be
> testing biased coefficients. Some believe that hypothesis
> tests should be done using the unp
On Thu, 05 Jun 2003 21:50:13 -0400
"Liaw, Andy" <[EMAIL PROTECTED]> wrote:
> Hi Frank,
>
> > From: Frank E Harrell Jr [mailto:[EMAIL PROTECTED]
>
> [snip]
>
> > The anova method for ols fits 'works' when you penalize the
> > model but there is some controversy over whether we should be
> > te
On Thu, 5 Jun 2003 15:40:52 +
"Wegmann (LIST)" <[EMAIL PROTECTED]> wrote:
> Hello R-user
>
> I want to compute a multiple regression, but I would like to include a check
> for collinearity of the variables. Therefore I would like to use ridge
> regression.
> I tried lm.ridge() but I don't know
Hello R-user
I want to compute a multiple regression, but I would like to include a check for
collinearity of the variables. Therefore I would like to use ridge
regression.
I tried lm.ridge() but I don't know yet how to get p-values (a Pr() for each
coefficient and a p-value for the whole model) out of this fit. Can any