Hi all, I'm running an analysis with the random forest tool. It's being applied to data matrices of ~60,000 rows and between about 40 and 200 columns, and I get the same error with every one of the data files: cannot allocate vector of size 428.5 Mb.
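For context, a minimal sketch of the kind of call that triggers the error (I'm assuming the randomForest package here; the file, object, and column references are just placeholders, not my actual script):

  library(randomForest)

  dat <- read.csv("file1.csv")           # ~60,000 rows, 40-200 columns
  rf  <- randomForest(x = dat[, -1],     # predictor columns
                      y = dat[, 1],      # response column
                      ntree = 500)       # package default
  ## Error: cannot allocate vector of size 428.5 Mb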
I found dozens of threads regarding this problem, but they never seem to reach a conclusion. Usually the OP is directed to the memory allocation help file (from which I haven't been able to work out the solution for Linux), and the last post is the OP saying they still haven't sorted out their problem. I'm running on a Linux machine with 64 GB of RAM, so it shouldn't be a lack of system resources. Can anyone tell me how I can get R to allocate larger vectors on Linux?

Many thanks,
Dave