Hi,

On Jul 22, 2009, at 6:18 PM, lulu9797 wrote:

> The values returned by the lars function include R-squared values along
> the variable selection path.

Correct.

> However, such values are always slightly different from the R-squared
> values returned by the regression function lm using the same models.
> Does anyone know the reason?

How are you comparing the models from lars vs. lm?

Are you just using the non-zero coefs you get from lars in setting up your formula for lm (or something)?

How different is "slightly different"?

Speculation only: even if you refit lm with exactly the variables that are active at a given lars step, the coefficients won't match, because lars shrinks them under the l1 penalty while lm fits them by unpenalized least squares. Since the lm fit maximizes R^2 for that subset of variables, the R^2 reported by lars at the same step will generally come out a bit lower.
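
Something like this toy example (a quick, untested sketch with simulated
data; the sizes and the step number are made up for illustration) shows
what I mean:

library(lars)

set.seed(1)
n <- 100; p <- 5
x <- matrix(rnorm(n * p), n, p)
y <- drop(x %*% c(3, 1.5, 0, 0, 2)) + rnorm(n)

fit <- lars(x, y, type = "lasso")
fit$R2                      # R^2 along the lars path

## refit lm using only the variables active at, say, step 3
b <- coef(fit)[3, ]
ols <- lm(y ~ x[, b != 0])
summary(ols)$r.squared      # >= fit$R2[3], since lm doesn't shrink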

Perhaps a guru will respond with better insight.

> Very important, and needs quick answers.

As a point of etiquette, I reckon most people don't respond too well to requests like this, since posting to this mailing list is essentially asking for free help. If it's that urgent and important to you, you can hire a professional to drop whatever s/he is doing at the moment and work it all out for you.

By the by, you might want to look at the glmnet package if you find yourself using lars often. Setting the glmnet alpha parameter to 1 essentially gives you the lasso regularization path, and I've found it to be quite a bit faster than lars.
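
For example (a minimal sketch reusing x and y from above; note that
glmnet wants a predictor matrix rather than a formula):

library(glmnet)

gfit <- glmnet(x, y, alpha = 1)   # alpha = 1 selects the lasso penalty
plot(gfit)                        # coefficient paths
gfit$dev.ratio                    # fraction of deviance explained, the
                                  # R^2 analogue for gaussian models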

-steve

--
Steve Lianoglou
Graduate Student: Physiology, Biophysics and Systems Biology
Weill Medical College of Cornell University

Contact Info: http://cbio.mskcc.org/~lianos/contact
