[R] PRESS and P2 statistics in R

2010-05-25 Thread Alex Roy
Hello all,
   Is there any function in R with which I can calculate the PRESS and
P2 statistics for a linear regression?

Thanks

Alex
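For reference (not part of the original thread): base R has no ready-made press() function, but PRESS can be computed from any lm fit via the hat values, since the leave-one-out residual is e_i / (1 - h_ii); the "predictive R^2" below is one common reading of a P2-type statistic. A minimal sketch, with mtcars used only as illustrative data:

```r
# PRESS from a fitted lm via leverage (hat) values:
# leave-one-out residual_i = e_i / (1 - h_ii)
fit <- lm(mpg ~ wt + hp, data = mtcars)
press <- sum((residuals(fit) / (1 - hatvalues(fit)))^2)

# Predictive R^2: 1 - PRESS / total sum of squares
tss <- sum((mtcars$mpg - mean(mtcars$mpg))^2)
p2 <- 1 - press / tss
```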


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Caret package and lasso

2010-04-08 Thread Alex Roy
Dear Max,
Thanks for the reply. I will wait for your further comment
on this.

Regards

Linda Garcia

On Wed, Apr 7, 2010 at 8:03 PM, Max Kuhn mxk...@gmail.com wrote:

 Linda,

 Thanks for the example.

 I did this to make it more reproducible:

  set.seed(1)
  X <- matrix(rnorm(50 * 100), nrow = 50)
  y <- rnorm(50)

  dimnames(X)

  colnames(X) <- paste("V", 1:ncol(X))  # ncol(X): X has 100 columns

  # Applying caret package

  set.seed(2)
  con <- trainControl(method = "cv", number = 10)

  data <- NULL
  data <- train(X, y, method = "lasso", metric = "RMSE", tuneLength = 10,
                trControl = con)

 I see your point here, but this code gives the same results:

  fit2 <- enet(X, y, lambda = 0)
  predict(fit2, mode = "fraction", s = data$bestTune$.fraction,
          type = "coefficients")$coef

 (at least train() names the predictors).

 To me, it looks like enet is doing some filtering:

dim(X)
   [1]  50 100
length(fit2$meanx)
   [1] 56

 This appears to be independent of caret. I would contact the package
 maintainer off-list and ask.

 Max





[R] Biclustering package

2010-02-11 Thread Alex Roy
Hello,
   I am looking for an R package that can perform biclustering, apart
from the biclust package.

thanks

Alex



Re: [R] Biclustering package

2010-02-11 Thread Alex Roy
Thank you very much Gabor.

Alex

On Thu, Feb 11, 2010 at 10:55 AM, Gábor Csárdi csa...@rmki.kfki.hu wrote:

 Alex, the isa2 package implements the biclustering algorithm discussed in
 Bergmann S, Ihmels J, and Barkai N. Iterative signature algorithm for
 the analysis of large-scale gene expression data. Phys Rev E Stat
 Nonlin Soft Matter Phys 2003 Mar; 67(3 Pt 1) 031902

 Best,
 Gabor

 On Thu, Feb 11, 2010 at 10:51 AM, Alex Roy alexroy2...@gmail.com wrote:
  Hello,
    I am looking for an R package that can perform biclustering,
  apart from the biclust package.
 
  thanks
 
  Alex
 
 



 --
 Gabor Csardi gabor.csa...@unil.ch UNIL DGM




[R] glmnet in caret packge

2010-01-25 Thread Alex Roy
Dear all,
  I want to train my model with the LASSO using the caret package
(method glmnet). glmnet has two tuning parameters, alpha and lambda. How can
I fix alpha = 1 to get a lasso model?



con <- trainControl(method = "cv", number = 10)

model <- train(X, y, method = "glmnet", metric = "RMSE", tuneLength = 10,
               trControl = con)
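One way to pin alpha at 1 (pure lasso) is to give train() an explicit tuneGrid, so that caret tunes lambda only. A hedged sketch with simulated data; the lambda grid values are illustrative, not recommendations:

```r
library(caret)

set.seed(1)
X <- matrix(rnorm(50 * 20), nrow = 50)
colnames(X) <- paste0("V", 1:20)
y <- rnorm(50)

# alpha fixed at 1 (lasso); caret then tunes lambda only
grid <- expand.grid(alpha = 1, lambda = 10^seq(-3, 0, length.out = 10))
con <- trainControl(method = "cv", number = 5)
model <- train(X, y, method = "glmnet", metric = "RMSE",
               tuneGrid = grid, trControl = con)
```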



Thanks

Alex Roy



[R] How can I store the results

2010-01-13 Thread Alex Roy
Dear R users,
I am running R code that produces 10 columns and
160 rows. I need to run the code 100 times, and each time I need to store
the results in a single file. How can I store them in one file without
overwriting the earlier results?

Thanks

Alex
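One approach (a sketch, with a placeholder computation and an illustrative file name) is to append each run's block to the same file with write.table(..., append = TRUE):

```r
outfile <- tempfile(fileext = ".txt")  # use a fixed file name in practice

for (i in 1:100) {
  res <- matrix(rnorm(160 * 10), nrow = 160)  # placeholder for the real result
  # append = (i > 1): the first run creates the file, later runs add to it
  write.table(res, file = outfile, append = (i > 1),
              col.names = FALSE, row.names = FALSE)
}
```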



[R] penalization regression

2010-01-07 Thread Alex Roy
Dear all,
   I am using a penalized regression method for my data. I am
wondering whether the following names are synonymous:

Complexity parameter
Penalty parameter
Shrinkage factor
Shrinkage parameter
Hyperparameter


Thanks

Alex



[R] Elastic net in R (enet package)

2009-08-25 Thread Alex Roy
Dear R users,
I am using the enet package in R to apply the elastic
net method. In the elastic net, two penalties are applied: lambda1 for the
LASSO penalty and lambda2 for the ridge penalty (Zou, 2005). But while
running the analysis, I realised that I optimised only one lambda (even the
example in R uses only one penalty). So I am wondering which penalty it
refers to: is it a combination of the penalties, or just one of them? I read
the paper by Zou and Hastie but am still in doubt.

Thanks in advance

Alex
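For context, a sketch (not an authoritative answer): in enet(), the lambda argument is the quadratic ridge penalty (lambda2 in Zou and Hastie's notation), while the lasso penalty is applied afterwards by choosing a point on the path via s in predict():

```r
library(elasticnet)

set.seed(1)
X <- matrix(rnorm(100 * 10), nrow = 100)
y <- rnorm(100)

fit <- enet(X, y, lambda = 0.5)  # lambda here is the ridge (lambda2) penalty
cf <- predict(fit, s = 0.5, mode = "fraction",
              type = "coefficients")$coefficients  # lasso amount chosen via s
```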



[R] Permutation test and R2 problem

2009-08-14 Thread Alex Roy
Hi,


I optimized the shrinkage parameter for ridge regression by GCV and got an
R2 value of 70%. To check the sensitivity of this result, I ran a
permutation test: I permuted the response vector, refit 1000 times, and drew
the distribution. But now I get R2 values as high as 98%, and some of them
are above 70%. Is that expected from this type of test?

I was under the impression that R2 with the real data set would always be
the maximum, i.e. the permuted R2 values would always be less than the real
one!


thanks a lot



Alex
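For reference, a small self-contained sketch of the permutation scheme described above (the simulated data and model are illustrative only):

```r
set.seed(1)
X <- matrix(rnorm(50 * 3), nrow = 50)
y <- drop(X %*% c(1, 1, 0)) + rnorm(50)

r2 <- function(y, X) summary(lm(y ~ X))$r.squared
obs <- r2(y, X)

# Null distribution of R^2 under permutation of the response
perm <- replicate(1000, r2(sample(y), X))
pval <- mean(perm >= obs)  # fraction of permuted R^2 at least as large
```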



[R] Rank of matrix

2009-08-04 Thread Alex Roy
Dear all,
 On which factors does the rank of a matrix depend: only the rows, only
the columns, or both? And if there is collinearity among the variables
(columns), does it affect the rank?



 X <- matrix(rnorm(10000), 50)
 dim(X)
[1]  50 200
 qr(X)$rank
[1] 50
 X[, 2] <- X[, 30]
 qr(X)$rank
[1] 50
 X[10, ] <- X[7, ]
 qr(X)$rank
[1] 49

Thanks

Alex



[R] Collinearity in Linear Multiple Regression

2009-07-21 Thread Alex Roy
Dear all,
  How can I test for collinearity in the predictor data set
for a multiple linear regression?

Thanks

Alex
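Two quick checks, sketched with simulated data: the condition number of the scaled design via kappa(), and a hand-computed variance inflation factor (car::vif() gives the same VIFs for a fitted model):

```r
set.seed(1)
x1 <- rnorm(100); x2 <- rnorm(100)
x3 <- x1 + rnorm(100, sd = 0.01)  # nearly collinear with x1
X <- cbind(x1, x2, x3)

kappa(scale(X), exact = TRUE)  # large values (say, > 30) signal collinearity

# VIF by hand: 1 / (1 - R^2) from regressing one predictor on the rest
vif_x3 <- 1 / (1 - summary(lm(x3 ~ x1 + x2))$r.squared)
```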



[R] Linear Regression Problem

2009-07-14 Thread Alex Roy
Dear All,
 I have a matrix, say X (100 x 40,000), and a vector, say y
(100 x 1). I want to perform linear regression. I have scaled the X matrix
using scale() to get mean zero and s.d. 1, but I still get very high values
for the regression coefficients. If I scale the X matrix, the regression
coefficients should behave like correlation coefficients and should not be
more than 1. Am I right? I do not know what is going wrong.
Thanks for your help.
Alex


Code:

UniBeta <- sapply(1:dim(X)[2], function(k)
  summary(lm(y ~ X[, k]))$coefficients[2, 1])

pval <- sapply(1:dim(X)[2], function(l)
  summary(lm(y ~ X[, l]))$coefficients[2, 4])



Re: [R] Linear Regression Problem

2009-07-14 Thread Alex Roy
Dear Vito,
Thanks for your comments. But I want to do Simple linear
regression not Multiple Linear regression. Multiple Linear regression is not
possible here as number of variables are much more than samples.( X is ill
condioned, inverse of X^TX does not exist! )
I just want to take one predictor variable and regress on y and store
regression coefficients, p values and R^2 values. And the loop go up to
40,000 predictors.

Alex
On Tue, Jul 14, 2009 at 5:18 PM, Vito Muggeo (UniPa)
vito.mug...@unipa.it wrote:

 dear Alex,
 I think your problem with a large number of predictors and a relatively
 small number of subjects may be faced via some regularization approach
 (ridge or lasso regression..)

 hope this helps you,
 vito

 Alex Roy ha scritto:

  Dear All,
            I have a matrix, say X (100 x 40,000), and a vector, say y
  (100 x 1). I want to perform linear regression. I have scaled the X
  matrix using scale() to get mean zero and s.d. 1, but I still get very
  high values for the regression coefficients. If I scale the X matrix, the
  regression coefficients should behave like correlation coefficients and
  should not be more than 1. Am I right? I do not know what is going wrong.
 Thanks for your help.
 Alex


 Code:

 UniBeta <- sapply(1:dim(X)[2], function(k)
   summary(lm(y ~ X[, k]))$coefficients[2, 1])

 pval <- sapply(1:dim(X)[2], function(l)
   summary(lm(y ~ X[, l]))$coefficients[2, 4])



 --
 
 Vito M.R. Muggeo
 Dip.to Sc Statist e Matem `Vianelli'
 Università di Palermo
 viale delle Scienze, edificio 13
 90128 Palermo - ITALY
 tel: 091 6626240
 fax: 091 485726/485612
 http://dssm.unipa.it/vmuggeo
 




Re: [R] Linear Regression Problem

2009-07-14 Thread Alex Roy
Dear Dr. Ravi Varadhan,
Thanks for your comments. Here, the variables (p) are in columns and the
samples (n) are in rows, and I want to find the variables significantly
associated with the response (y).
The reason I said multiple linear regression (MLR) is not possible:
classical MLR was developed under the assumption that the number of samples
(n) exceeds the number of variables (p), and that the predictors are
uncorrelated. So I do not consider penalization/shrinkage/regularization
methods to be traditional regression methods such as MLR.
I completely agree with the solutions you suggested; one could also add
other techniques such as the elastic net, partial least squares, principal
component regression, or machine learning methods like support vector
regression and random forest regression to get the job done.
But I want to use a univariate method (like simple linear regression) for a
particular purpose.

Regards

Alex
On Tue, Jul 14, 2009 at 5:48 PM, Ravi Varadhan rvarad...@jhmi.edu wrote:

 I am not sure that you really want to do separate regressions for each row
 of X, with the same y.  This does not make much sense.

 Why do you think multiple linear regression is not possible just because
 X'X
 is not invertible?  You have 2 main options here:

 1.  Obtain a minimum-norm solution using SVD (also known as Moore-Penrose
 inverse). This solution minimizes ||y - Xb|| subject to minimum ||b||
 2.  Obtain a regularized solution such as the ridge-regression, as Vito
 suggested.

 You can do (1) as follows:

 require(MASS)

 soln <- ginv(X) %*% y

 Here is an example:

 X <- matrix(rnorm(1000), 10, 100)  # matrix with rank = 10

 b <- rep(1, 100)

 y <- crossprod(t(X), b)

 soln <- c(ginv(X) %*% y)  # this will not be close to b

 Hope this helps,
 Ravi.



 
 ---

 Ravi Varadhan, Ph.D.

 Assistant Professor, The Center on Aging and Health

 Division of Geriatric Medicine and Gerontology

 Johns Hopkins University

 Ph: (410) 502-2619

 Fax: (410) 614-9625

 Email: rvarad...@jhmi.edu

 Webpage:

 http://www.jhsph.edu/agingandhealth/People/Faculty_personal_pages/Varadhan.html




 
 


 -Original Message-
 From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org]
 On
 Behalf Of Alex Roy
 Sent: Tuesday, July 14, 2009 11:29 AM
 To: Vito Muggeo (UniPa)
 Cc: r-help@r-project.org
 Subject: Re: [R] Linear Regression Problem

  Dear Vito,
        Thanks for your comments, but I want to do simple linear
 regression, not multiple linear regression. Multiple linear regression is
 not possible here, as the number of variables is much larger than the
 number of samples (X is ill-conditioned; the inverse of X^T X does not
 exist!). I just want to take one predictor variable at a time, regress it
 on y, and store the regression coefficients, p values, and R^2 values,
 with the loop running over all 40,000 predictors.

 Alex
 On Tue, Jul 14, 2009 at 5:18 PM, Vito Muggeo (UniPa)
 vito.mug...@unipa.itwrote:

  dear Alex,
  I think your problem with a large number of predictors and a
  relatively small number of subjects may be faced via some
  regularization approach (ridge or lasso regression..)
 
  hope this helps you,
  vito
 
  Alex Roy ha scritto:
 
   Dear All,
  I have a matrix, say X (100 x 40,000), and a vector,
  say y (100 x 1). I want to perform linear regression. I have scaled the
  X matrix using scale() to get mean zero and s.d. 1, but I still get very
  high values for the regression coefficients. If I scale the X matrix,
  the regression coefficients should behave like correlation coefficients
  and should not be more than 1. Am I right? I do not know what is going
  wrong.
  Thanks for your help.
  Alex
 
 
  Code:

  UniBeta <- sapply(1:dim(X)[2], function(k)
    summary(lm(y ~ X[, k]))$coefficients[2, 1])

  pval <- sapply(1:dim(X)[2], function(l)
    summary(lm(y ~ X[, l]))$coefficients[2, 4])
 
 
 
  --
  
  Vito M.R. Muggeo
  Dip.to Sc Statist e Matem `Vianelli'
  Università di Palermo
  viale delle Scienze, edificio 13
  90128 Palermo - ITALY
  tel: 091 6626240
  fax: 091 485726/485612
  http://dssm.unipa.it/vmuggeo
  
 






Re: [R] Random Forest Variable Importance Interpretation

2009-07-06 Thread Alex Roy
Hi,
  Are you looking for variable selection? If so, you can use the LASSO,
elastic net, or sparse PLS regression methods, which encourage variable
selection. PCA does not select variables, since all variables enter the
PCs; you could use sparse PCA instead.

Regards

Alex

On Wed, Jun 24, 2009 at 6:04 PM, lara harrup (IAH-P) 
lara.har...@bbsrc.ac.uk wrote:

 Hi



 I am trying to explore the use of random forests for regression, to
 identify the important environmental/microclimate variables involved in
 predicting the abundance of a species in different habitats; there are
 approx. 40 variables and between 200 and 500 data points, depending on
 the dataset. I have successfully used the randomForest package to conduct
 the analysis, looked at the %IncMSE and IncNodeImpurity values by calling
 and plotting them, and looked at the partial dependence plots for the
 effects of the different variables on the response. I have been looking
 through the literature to see how people have previously used this type
 of analysis, and I would like to plot the overall variable importance in
 some form of PCA scree graph, but I haven't got a clue how to even start,
 so any suggestions will be most appreciated.



 Many thanks in advance



 Lara






[R] How to save multiple images??

2009-06-19 Thread Alex Roy
Dear all,
  How can I save multiple images in my working directory? I
used save.image() but did not succeed.

Thanks in advance

Alex
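If "images" here means plots, note that save.image() saves the workspace, not graphics; each plot needs its own graphics device, e.g. png() followed by dev.off(). A sketch (tempdir() stands in for the working directory):

```r
out_dir <- tempdir()  # replace with getwd() to use the working directory

for (i in 1:3) {
  png(file.path(out_dir, sprintf("plot_%02d.png", i)))
  plot(rnorm(100), main = paste("Plot", i))
  dev.off()  # close the device so the file is written
}
```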



[R] How to write loop

2009-06-13 Thread Alex Roy
Dear all,
  I want to run the following process as a loop (to run
automatically over the dimension of X, here 50). How can I do that? Your
comments will be highly appreciated.

Alex


# Code:

library(lars)
library(chemometrics)

X <- matrix(rnorm(2500), ncol = 50)
dim(X)
# [1] 50 50
X1 <- X[, 2:dim(X)[2]]    # first column taken out
dim(X1)
# [1] 50 49
X2 <- X1[2:dim(X1)[1], ]  # new X2 is constructed
dim(X2)
# [1] 49 49
y <- as.matrix(X1[1, ])   # first row of X1 now acts as the response vector
dim(y)
# [1] 49  1

# application of LASSO regression, where y is the response and X2 is the
# design matrix

data1 <- data.frame(y, X2 = I(X2))
lasso_res <- lassoCV(y ~ X2, data = data1, K = 10,
                     fraction = seq(0.1, 1, by = 0.1),
                     use.Gram = FALSE)  # optimum value by cross-validation
lasso_coef <- lassocoef(y ~ X2, data = data1, sopt = lasso_res$sopt,
                        use.Gram = FALSE)  # the coefficients



Re: [R] How to write loop

2009-06-13 Thread Alex Roy
Hi Jim, thanks. I want to do the following:

1. Each time, drop one column (say column 1) from matrix X.
2. Take out row 1 of the remaining matrix; that row becomes the response (y).
3. Do a lasso regression of y on the remaining X.
4. Store the coefficients.

Similarly, in the next run:

1. Drop the 2nd column from matrix X.
2. Take out row 2 of the remaining matrix; that row becomes the response (y).
3. Do a lasso regression of y on the remaining X (in the example, X2).
4. Store the coefficients.

And so on.
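The four steps above can be wrapped in a loop directly; a sketch using lars for the lasso step (lars and the list storage are choices of this sketch, not the only option):

```r
library(lars)

set.seed(1)
X <- matrix(rnorm(2500), ncol = 50)
coef_list <- vector("list", ncol(X))

for (k in seq_len(ncol(X))) {
  X1 <- X[, -k, drop = FALSE]   # 1. drop column k
  y  <- as.numeric(X1[k, ])     # 2. row k of the remaining matrix as response
  X2 <- X1[-k, , drop = FALSE]  #    ...and drop that row from the design
  fit <- lars(X2, y, type = "lasso", use.Gram = FALSE)  # 3. lasso fit
  coef_list[[k]] <- coef(fit)   # 4. store the coefficient path
}
```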
On Sat, Jun 13, 2009 at 7:19 PM, jim holtman jholt...@gmail.com wrote:

 It is not exactly clear what you want to iterate on.  What is going to be
 changing each time through the loop?

   On Sat, Jun 13, 2009 at 10:09 AM, Alex Roy alexroy2...@gmail.com wrote:

  Dear all,
  I want to run the following process as a loop (to run
 automatically over the dimension of X, here 50). How can I do that? Your
 comments will be highly appreciated.

 Alex


 # Code:

 library(lars)
 library(chemometrics)

 X <- matrix(rnorm(2500), ncol = 50)
 dim(X)
 # [1] 50 50
 X1 <- X[, 2:dim(X)[2]]    # first column taken out
 dim(X1)
 # [1] 50 49
 X2 <- X1[2:dim(X1)[1], ]  # new X2 is constructed
 dim(X2)
 # [1] 49 49
 y <- as.matrix(X1[1, ])   # first row of X1 now acts as the response vector
 dim(y)
 # [1] 49  1

 # application of LASSO regression, where y is the response and X2 is the
 # design matrix

 data1 <- data.frame(y, X2 = I(X2))
 lasso_res <- lassoCV(y ~ X2, data = data1, K = 10,
                      fraction = seq(0.1, 1, by = 0.1),
                      use.Gram = FALSE)  # optimum value by cross-validation
 lasso_coef <- lassocoef(y ~ X2, data = data1, sopt = lasso_res$sopt,
                         use.Gram = FALSE)  # the coefficients





 --
 Jim Holtman
 Cincinnati, OH
 +1 513 646 9390

 What is the problem that you are trying to solve?




[R] Heatmap

2009-06-08 Thread Alex Roy
Hello Group,
How can I draw a heatmap with the variable names shown in the plot?

Thanks

Alex
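With base R's heatmap(), the row and column names of the matrix are drawn as labels (and can be set explicitly via labRow/labCol). A minimal sketch, written to a temporary PNG so it runs non-interactively:

```r
m <- matrix(rnorm(50), nrow = 10,
            dimnames = list(paste0("gene", 1:10), paste0("sample", 1:5)))

f <- tempfile(fileext = ".png")
png(f)
heatmap(m, labRow = rownames(m), labCol = colnames(m))  # names on the axes
dev.off()
```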



[R] Caret package: coeffcients for regression

2009-05-04 Thread Alex Roy
Dear All,
  I am using the caret package for SVM regression and elastic net
regression. I can get the final fitted vs. observed values. How can I get
the coefficients? Any ideas?

Thanks

Alex



[R] loop problem for extract coefficients

2009-04-05 Thread Alex Roy
Dear R users,
  I have a problem extracting coefficients from an
object. Here, X (predictors) and Y (responses) are two matrices. I am
regressing X (dimensions 10 x 20) on each column of Y (10 x 1 each) and
want to store the coefficient values. I performed an elastic net regression
and want to store the coefficients from each iteration, but I got an error
message and do not know where the problem is. Please help me.

Thanks



Code:

---
library(elasticnet)
X <- matrix(rnorm(200), ncol = 20)
Y <- matrix(rnorm(200), ncol = 20)
loop <- 20
size <- 20
enres <- matrix(nrow = size, ncol = loop)
fit <- matrix(nrow = size, ncol = loop)
store <- matrix(nrow = size, ncol = loop)
for (j in 1:10)
print(paste(j, "/200", sep = ""))
{
enres <- enet(x = X, y = Y[, j], lambda = 1, normalize = TRUE, intercept = TRUE)
fit <- predict.enet(enres, X, type = "coefficients")
store[, j] <- fit$coefficients
}


 library(elasticnet)
Loading required package: lars
 X <- matrix(rnorm(200), ncol = 20)
 Y <- matrix(rnorm(200), ncol = 20)

 loop <- 20
 size <- 20

 enres <- matrix(nrow = size, ncol = loop)
 fit <- matrix(nrow = size, ncol = loop)
 store <- matrix(nrow = size, ncol = loop)

 for (j in 1:10)
+ print(paste(j, "/200", sep = ""))
[1] "1/200"
[1] "2/200"
[1] "3/200"
[1] "4/200"
[1] "5/200"
[1] "6/200"
[1] "7/200"
[1] "8/200"
[1] "9/200"
[1] "10/200"
 {
+ enres <- enet(x = X, y = Y[, j], lambda = 1, normalize = TRUE, intercept = TRUE)
+ fit <- predict.enet(enres, X, type = "coefficients")
+ store[, j] <- fit$coefficients
+ }
Error in store[, j] <- fit$coefficients :
  number of items to replace is not a multiple of replacement length



[R] Sparse PCA problem

2009-04-02 Thread Alex Roy
Dear R user,
  I want to do sparse principal component analysis
(SPCA). I am using the elasticnet package's spca(), and the code below
follows the package example.

My question is: how can I decide K and para = c(7, 4, 4, 1, 1, 1)? Here
K = 6 is the number of principal components, and for each PC:

pc1: number of non-zero loadings is 7

pc2: number of non-zero loadings is 4

pc3: number of non-zero loadings is 4

pc4: number of non-zero loadings is 1

pc5: number of non-zero loadings is 1

pc6: number of non-zero loadings is 1

How can I know in advance how many non-zero loadings each PC should have?
Is there any code for this? One answer could be cross-validation, but I did
not find it in the package.

Thanks for your help

Code:

library(elasticnet)
 out2 <- spca(pitprops, K = 6, type = "Gram", sparse = "varnum",
              trace = TRUE, para = c(7, 4, 4, 1, 1, 1))
iterations 10
iterations 20
iterations 30
iterations 40
 out2
Call:
spca(x = pitprops, K = 6, para = c(7, 4, 4, 1, 1, 1), type = "Gram",
    sparse = "varnum", trace = TRUE)
6 sparse PCs
Pct. of exp. var. : 28.2 13.9 13.1  7.4  6.8  6.3
Num. of non-zero loadings :  7 4 4 1 1 1
Sparse loadings
           PC1    PC2    PC3 PC4 PC5 PC6
topdiam -0.477  0.003  0.000   0   0   0
length  -0.469  0.000  0.000   0   0   0
moist    0.000  0.785  0.000   0   0   0
testsg   0.000  0.619  0.000   0   0   0
ovensg   0.180  0.000  0.656   0   0   0
ringtop  0.000  0.000  0.589   0   0   0
ringbut -0.290  0.000  0.470   0   0   0
bowmax  -0.343 -0.029 -0.048   0   0   0
bowdist -0.414  0.000  0.000   0   0   0
whorls  -0.383  0.000  0.000   0   0   0
clear    0.000  0.000  0.000  -1   0   0
knots    0.000  0.000  0.000   0  -1   0
diaknot  0.000  0.000  0.000   0   0   1




[R] Double Cross validation for LASSO

2009-03-17 Thread Alex Roy
Dear R user,
 I am looking for code for double cross-validation with the
LASSO: one loop to optimize the tuning parameter and the other to estimate
the MSEP. If anyone has such code, please forward it to me. I am using
several packages, such as lars and chemometrics.

Thanks in advance

Alex
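A sketch of one possible double (nested) cross-validation using lars/cv.lars: the inner CV chooses the lasso fraction s per outer fold, and the outer folds estimate the MSEP. The fold counts and simulated data are illustrative:

```r
library(lars)

set.seed(1)
X <- matrix(rnorm(60 * 10), nrow = 60)
y <- drop(X %*% c(2, -2, rep(0, 8))) + rnorm(60)

folds <- sample(rep(1:5, length.out = nrow(X)))  # outer folds
msep <- sapply(1:5, function(k) {
  tr <- folds != k
  inner <- cv.lars(X[tr, ], y[tr], K = 5, plot.it = FALSE)  # inner CV
  s_opt <- inner$index[which.min(inner$cv)]                 # best fraction
  fit <- lars(X[tr, ], y[tr], type = "lasso")
  pred <- predict(fit, X[!tr, ], s = s_opt, mode = "fraction")$fit
  mean((y[!tr] - pred)^2)  # outer-fold MSEP
})
mean(msep)
```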



Re: [R] Singularity in a regression?

2009-02-27 Thread Alex Roy
If collinearity exists, one of the solutions is a regularized version of
regression. There are different types of regularization methods, such as
ridge, LASSO, and the elastic net. For example, the MASS package provides
ridge regression.

Alex


On Thu, Feb 26, 2009 at 1:58 PM, Bob Gotwals gotw...@ncssm.edu wrote:

 R friends,

 In a matrix of 1s and 0s, I'm getting a singularity error.  Any helpful
 ideas?

 lm(formula = activity ~ metaF + metaCl + metaBr + metaI + metaMe +
paraF + paraCl + paraBr + paraI + paraMe)

 Residuals:
   Min 1Q Median 3QMax
 -4.573e-01 -7.884e-02  3.469e-17  6.616e-02  2.427e-01

 Coefficients: (1 not defined because of singularities)
              Estimate Std. Error t value Pr(>|t|)
 (Intercept)   7.9173     0.1129  70.135  < 2e-16 ***
 metaF        -0.3973     0.2339  -1.698 0.115172
 metaCl            NA         NA      NA       NA
 metaBr        0.3454     0.1149   3.007 0.010929 *
 metaI         0.4827     0.2339   2.063 0.061404 .
 metaMe        0.3654     0.1149   3.181 0.007909 **
 paraF         0.7675     0.1449   5.298 0.000189 ***
 paraCl        0.3400     0.1449   2.347 0.036925 *
 paraBr        1.0200     0.1449   7.040 1.36e-05 ***
 paraI         1.3327     0.2339   5.697 9.96e-05 ***
 paraMe        1.2191     0.1573   7.751 5.19e-06 ***
 ---
 Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

 Residual standard error: 0.2049 on 12 degrees of freedom
 Multiple R-squared: 0.9257, Adjusted R-squared: 0.8699
 F-statistic: 16.61 on 9 and 12 DF,  p-value: 1.811e-05





[R] Leave one out Cross validation (LOO)

2009-02-24 Thread Alex Roy
Dear R user,
   I am working with leave-one-out cross-validation (LOO).
Could anyone who works with LOO send me their code?

Thanks in advance

Alex
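LOO CV is short enough to write directly; a sketch for an lm model on mtcars (the model formula is only an example):

```r
n <- nrow(mtcars)
errs <- sapply(1:n, function(i) {
  fit <- lm(mpg ~ wt + hp, data = mtcars[-i, ])            # fit without case i
  mtcars$mpg[i] - predict(fit, mtcars[i, , drop = FALSE])  # held-out error
})
loocv_mse <- mean(errs^2)
```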



[R] e1071 package for SVM

2009-02-20 Thread Alex Roy
Dear all,
            I have code using the e1071 package in R for SVM regression. I
used m$coefs to extract the coefficients, but I get only 72 values. How can
I extract the coefficients of the predictor set? Does "Number of Support
Vectors: 72" mean that I will get only 72?
Thanks in advance

Code:
--

library(e1071)
# create data
x <- seq(0.1, 5, by = 0.05)
y <- log(x) + rnorm(x, sd = 0.2)
# estimate model and predict input values
m <- svm(x, y)
new <- predict(m, x)
m
Call:
svm.default(x = x, y = y)

Parameters:
   SVM-Type:  eps-regression
 SVM-Kernel:  radial
       cost:  1
      gamma:  1
    epsilon:  0.1

Number of Support Vectors:  72
 m$coefs
 new
   12345
6789   10   11
12
-1.327786564 -1.277059853 -1.221424097 -1.161313628 -1.097200621
-1.029588549 -0.959005127 -0.885994883 -0.81473 -0.734909901
-0.657938792 -0.580732849
  13   14   15   16   17
18   19   20   21   22
23   24
-0.503805655 -0.427642943 -0.352696474 -0.279378612 -0.208057720
-0.139054438 -0.072638906 -0.009028968  0.051610615  0.109167970
0.163582945  0.214845022
  25   26   27   28   29
30   31   32   33   34
35   36
 0.262990384  0.308098215  0.350286330  0.389706259  0.426537887
0.460983809  0.493263495  0.523607428  0.552251329  0.579430594
0.605375062  0.630304214
  37   38   39   40   41
42   43   44   45   46
47   48
 0.654422894  0.677917633  0.700953619  0.723672382  0.746190188
0.768597174  0.790957190  0.813308346  0.835664193  0.858015504
0.880332573  0.902567958
  49   50   51   52   53
54   55   56   57   58
59   60
 0.924659570  0.946534037  0.968110220  0.989302798  1.010025830
1.030196181  1.049736754  1.068579418  1.086667584  1.103958361
1.120424239  1.136054277
  61   62   63   64   65
66   67   68   69   70
71   72
 1.150854762  1.164849331  1.178078576  1.190599126  1.202482259
1.213812069  1.224683251  1.235198555  1.245465988  1.255595825
1.265697515  1.275876563
  73   74   75   76   77
78   79   80   81   82
83   84
 1.286231447  1.296850678  1.307810045  1.319170135  1.330974176
1.343246256  1.355989966  1.369187496  1.382799201  1.396763660
1.410998202  1.425399923
  85   86   87   88   89
90   91   92   93   94
95   96
 1.439847127  1.454201198  1.468308834  1.482004603  1.495113765
1.507455297  1.518845050  1.529098984  1.538036397  1.545483098
1.551274451  1.555258224
  97   98   99
 1.557297215  1.557271572  1.555080800
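To clarify the 72 values: m$coefs in e1071 holds one alpha-style coefficient per support vector, not per predictor, so its length equals the number of support vectors (72 here). Per-predictor weights are only defined for a linear kernel, where they can be recovered from the support vectors. A hedged sketch (requires the e1071 package; the data below are made up for illustration):

```r
library(e1071)

set.seed(1)
x <- matrix(rnorm(100 * 3), ncol = 3)            # 3 predictors
y <- x %*% c(1, -2, 0.5) + rnorm(100, sd = 0.1)  # known generating weights

m <- svm(x, y, kernel = "linear", scale = FALSE)

# one coefficient per support vector, not per predictor:
dim(m$coefs)                  # n_SV x 1

# for the linear kernel only, the primal weight vector (one entry per
# predictor) is a linear combination of the support vectors:
w <- t(m$coefs) %*% m$SV      # 1 x 3, roughly recovers c(1, -2, 0.5)
b <- -m$rho                   # intercept
```

With the radial kernel used in the thread above, no such per-predictor weight vector exists; the model lives in the kernel's feature space.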

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] SVM regression code

2009-02-19 Thread Alex Roy
Dear R user,
I am looking for SVM regression in R. It will be
helpful if someone could send me SVM regression code.
Thanks

Alex
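A minimal sketch of SVM regression (requires the e1071 package; the toy data follow the example used elsewhere in this thread):

```r
library(e1071)

# toy data: noisy log curve
set.seed(1)
x <- seq(0.1, 5, by = 0.05)
y <- log(x) + rnorm(length(x), sd = 0.2)

# fit an epsilon-regression SVM with the default radial kernel
m   <- svm(x, y, type = "eps-regression", kernel = "radial")
fit <- predict(m, x)

# in-sample RMSE as a quick sanity check
sqrt(mean((y - fit)^2))
```

For real use, tune cost and gamma, e.g. via e1071's tune() with cross-validation, rather than relying on the defaults.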



Re: [R] Subset Regression Package

2009-02-18 Thread Alex Roy
Thank you very much for your help

Alex

On Wed, Feb 18, 2009 at 1:26 PM, Pedro Silva psi...@porto.ucp.pt wrote:

 --

 Message: 72
 Date: Tue, 17 Feb 2009 22:05:46 +0000 (UTC)
 From: Hans W. Borchers hwborch...@gmail.com
 Subject: Re: [R] Subset Regression Package
 To: r-h...@stat.math.ethz.ch
 Message-ID: loom.20090217t215556-...@post.gmane.org
 Content-Type: text/plain; charset=us-ascii

 Also take a look at the subselect package, which can perform subset
 selection in regression (and in several other statistical problems)
 using both exact (leaps-and-bounds algorithm) and heuristic
 (simulated annealing, genetic search, etc.) methods.

 Regards,

 A. Pedro Duarte Silva


 Alex Roy alexroy2008 at gmail.com writes:

 
  Dear all ,
Is there any subset regression (subset selection
  regression) package in R other than leaps?


 Lars and lasso are other 'subset selection' methods; see the
 corresponding packages 'lars' and 'lasso2' and their description in
 The Elements of Statistical Learning.
 Also, 'dr' (methods for dimension reduction in regression) or
 'relaimpo' (relative importance of regressors in linear models) can
 be considered.


  Thanks and regards
 
  Alex
 








[R] Subset Regression Package

2009-02-17 Thread Alex Roy
Dear all ,
  Is there any subset regression (subset selection
regression) package in R other than leaps?


Thanks and regards

Alex



[R] PRESS / RMSEP

2009-02-15 Thread Alex Roy
Dear all ,
 I want to compute PRESS (the prediction error sum of squares)
or the root mean squared error of prediction (RMSEP), which gives a
value that is valid for future predictions on independent data. I am
using different methods, for example multiple linear regression, LASSO
regression, ridge regression, elastic net regression, etc.
I am wondering if there are packages in R, or some
websites/materials, for this type of method.

Thanks and regards

Alex
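For ordinary least squares, PRESS needs no extra package: the leave-one-out residual has the closed form e_i / (1 - h_i), with h_i the hat values, so base R suffices. A sketch (simulated data for illustration; for LASSO/ridge/elastic net there is no such shortcut and explicit cross-validation is needed):

```r
# simulated data
set.seed(1)
n <- 50
x <- rnorm(n)
y <- 1 + 2 * x + rnorm(n)

fit <- lm(y ~ x)

# PRESS = sum of squared leave-one-out residuals, via the hat matrix
press <- sum((residuals(fit) / (1 - hatvalues(fit)))^2)

# RMSEP on the same leave-one-out basis
rmsep <- sqrt(press / n)

press
rmsep
```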
