Hi,

I have read several threads about memory issues in R and I can't seem to
find a solution to my problem.

I am running a LASSO regression (cv.glmnet from the glmnet package) on several
subsets of a big dataset. For some subsets it works well, but for some bigger
subsets it fails with an error of the form "cannot allocate vector of size
1.6Gb". The error occurs at this line of code:

   example <- cv.glmnet(x=bigmatrix, y=price, nfolds=3)


Whether it fails also depends on the number of variables included in
"bigmatrix".


I tried R and R64 on the Mac and R on a PC, and recently moved to a faster
Linux virtual machine, thinking that would avoid any memory issues. It was
better, but I still ran into limits, even though memory.limit() reports "Inf".


Is there any way to make this work, or do I have to drop a few variables from
the matrix or take a smaller subset of the data?

I have read that R needs contiguous blocks of memory; should I maybe
pre-allocate the matrix? Any ideas?
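
If pre-allocation is the direction to take, is it something along these
lines, or would a sparse matrix be the better route? (A sketch only; "n",
"p" and "mydata" are placeholders, and I am going from the glmnet docs
saying that sparse "dgCMatrix" input is accepted.)

   library(glmnet)
   library(Matrix)

   # pre-allocate the full design matrix once instead of growing it with cbind()
   n <- nrow(mydata)          # mydata: placeholder for the raw data frame
   p <- 5000                  # placeholder number of predictors
   bigmatrix <- matrix(0, nrow = n, ncol = p)

   # or store it sparsely if it contains many zeros (e.g. dummy variables)
   bigsparse <- Matrix(bigmatrix, sparse = TRUE)
   example   <- cv.glmnet(x = bigsparse, y = price, nfolds = 3)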


Many thanks

Emmanuel

