Alex Roy <alexroy2...@gmail.com> [Sun, Jun 14, 2009 at 06:43:52AM CEST]:
> Hi Jim, thanks. I want to do the following:
> 
> 1. Each time, drop one column, say column 1, from matrix X.
> 2. Take out row 1 of the remaining matrix; that row becomes the
> response (y).
> 3. Do lasso regression of the remaining X on y.
> 4. Store the coefficients.
> 
> Similarly, in the next run:
> 
> 1. Drop the 2nd column from matrix X.
> 2. Take out row 2 of the remaining matrix; that row becomes the
> response (y).
> 3. Do lasso regression of the remaining X (in the example: X2) on y.
> 4. Store the coefficients.

You may have reasons to want to do this; to me it looks a bit peculiar,
but then I am not much of an expert on penalized regression.

Care to share some thoughts on the theory behind what you are doing?

The following may work (I did not bother to install chemometrics, so it is untested):

library(lars)          # lasso fitting, used internally by chemometrics
library(chemometrics)  # lassoCV() and lassocoef()

nr <- 50

# nr x nr matrix of standard-normal noise; note rnorm(nr * nr), not
# rnorm(nr**nr), which would try to generate 50^50 values
X <- matrix(rnorm(nr * nr), ncol = nr)

coef_list <- lapply(1:nr, function(i) {
    # drop column i, then take row i of what remains as the response
    y   <- X[i, -i]
    dat <- data.frame(y = y, X[-i, -i])
    # cross-validate to find the optimal fraction
    lasso_res <- lassoCV(y ~ ., data = dat, K = 10,
                         fraction = seq(0.1, 1, by = 0.1), use.Gram = FALSE)
    # coefficients at the CV-optimal value
    lassocoef(y ~ ., data = dat, sopt = lasso_res$sopt, use.Gram = FALSE)
})
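For step 4 ("store the coefficients"), one convenient pattern is to collect the per-run coefficient vectors in a list and stack them into a matrix with one row per run. A minimal sketch with stand-in data (the names `coef_list` and the random vectors below are illustrative only; with chemometrics you would pull the coefficient vector out of each lassocoef() result instead):

```r
## Sketch: one coefficient vector per run, collected in a list,
## then stacked into a runs-by-p matrix with do.call(rbind, ...).
runs <- 5
p <- 4
coef_list <- lapply(1:runs, function(i) rnorm(p))  # stand-in coefficients
coef_mat <- do.call(rbind, coef_list)
stopifnot(all(dim(coef_mat) == c(runs, p)))
```

Row i of coef_mat then holds the coefficients from the run that dropped column i.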


-- 
Johannes Hüsing               There is something fascinating about science. 
                              One gets such wholesale returns of conjecture 
mailto:johan...@huesing.name  from such a trifling investment of fact.          
      
http://derwisch.wikidot.com         (Mark Twain, "Life on the Mississippi")

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
