prem_R <mtechp...@gmail.com> writes:

> I'm running predictive analytics in R, and to calibrate my model I
> adjust the variables used in the model. That is where the problem
> happens: R just runs out of memory. I have also tried garbage
> collection.

I'm analyzing an 8 GB data set using R, so it can certainly handle large
data sets.  It tends to copy data very often, however, so you have to be
very careful with it.
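
You can watch the copying happen with 'tracemem()' (a minimal sketch;
tracemem() needs an R build with memory profiling enabled, which the
standard CRAN binaries normally have):

    df <- data.frame(x = rnorm(1e6), y = rnorm(1e6))
    tracemem(df)       # report whenever R duplicates this object
    df$x <- df$x * 2   # this assignment triggers a duplication
    untracemem(df)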

For example, if you modify a single column in a data frame, R will copy
the entire data frame, rather than just replace the modified column.  If
you are running a regression that saves the input data in the model
result object, and you are modifying the data frame between runs, then
it would be very easy to have many copies of your data in memory at
once.
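
You can see the second part with a plain 'lm' fit (a sketch, reusing
the 'df' from above; other modelling functions store the data under
different component names):

    fit <- lm(y ~ x, data = df)
    names(fit)                            # note the "model" component:
                                          # a copy of the input data
    print(object.size(fit), units = "Mb") # large: it embeds the data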

One solution would be not to keep the model result objects around.
Another would be to manually modify them to strip out the data object.
This can be tricky, however, since copies of the data may live on in the
environments of saved functions; I had this problem with 'mgcv::gam'
fits.
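
For a plain 'lm' fit, something along these lines works; the pieces to
strip differ by model class, so treat it as a sketch and check that
whatever you still need (coef(), predict() on new data, and so on)
still works on the stripped object:

    fit$model <- NULL                     # drop the embedded model frame
    environment(fit$terms) <- baseenv()   # detach the formula environment,
                                          # which can keep the data alive
                                          # when the fit is saved
    print(object.size(fit), units = "Mb") # smaller: the model frame is gone
    coef(fit)                             # coefficients remain intact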

I hope that helps.

Regards,
Johann
