davew0000 wrote:
Hi all,

I'm running an analysis with the random forest tool. It's being applied to a
data matrix of ~60,000 rows and between about 40 and 200 columns. I get the
same error with all of the data files ("cannot allocate vector of size
428.5 Mb").
I found dozens of threads about this problem, but they never seem to reach a
conclusion. Usually the OP is directed to the memory allocation help file
(from which I haven't been able to work out the solution for Linux), and the
last post is the OP saying they still haven't sorted out their problem.
I'm running on a Linux machine with 64 GB of RAM, so it's not a problem with
lack of system resources. Can anyone tell me how I can get R to allocate
larger vectors on Linux?
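
For context, the failing call was presumably something along these lines (a
minimal sketch, assuming the randomForest package; the data shapes and
parameter values here are invented for illustration, not Dave's actual code):

  library(randomForest)

  # Hypothetical data of roughly the dimensions described in the post
  x <- matrix(rnorm(60000 * 200), nrow = 60000)
  y <- factor(sample(c("case", "control"), 60000, replace = TRUE))

  # A call like this can fail with "cannot allocate vector of size ...",
  # since randomForest() copies its input and keeps per-tree state in memory
  rf <- randomForest(x, y, ntree = 500)

  # Two standard randomForest arguments that shrink the footprint:
  # grow fewer trees, and subsample the rows used to build each tree
  rf_small <- randomForest(x, y, ntree = 100, sampsize = 10000)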


Check how much memory R was using at the point the error message appeared.
If it is around 60 GB, you know that it is a lack of resources - for the
given problem. If it is much less (around 2 GB), you may be running a 32-bit
R binary, or there may be a memory quota on your process.
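
Those checks might look like this (a minimal sketch using base R only; the
last line assumes a Linux shell is available for the quota check):

  gc()                     # reports how much memory R has allocated so far
  .Machine$sizeof.pointer  # 8 means a 64-bit R binary, 4 means 32-bit
  sessionInfo()            # shows the platform R was built for

  # A per-process address-space quota on Linux can be checked from a shell:
  system("ulimit -v")      # "unlimited" means no quota is set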

Uwe Ligges



Many thanks,

Dave
