> From: Hu Chen
>
> Hi all,
>
> Two computers:
>
> One is my desktop PC: Windows 2000, R 1.9.1, 256 MB physical RAM,
> 384 MB swap (virtual memory). When I allocate a large matrix, it
> first uses up RAM, then uses swap space. In the Windows Task Manager,
> the memory usage can exceed my physical RAM size.
>
> The other machine is a remote server: Windows XP, R 1.9.1, 2 GB
> physical RAM, 4 GB swap. I start R with "R --max-mem-size=4000M".
> However, when I allocate a large matrix or data frame, it uses up all
> RAM and then exits with the error message "cannot allocate vector of
> size 7812 Kb". The swap space is not used at all!
Please do read
http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021

> What's more, I found that the read.table() function really wastes
> memory.
>
> ft <- read.table("filepath")
> object.size(ft)
> [1] 192000692
>
> Only 192 MB; however, the Windows Task Manager shows that this
> process takes nearly 800 MB of memory. I used gc() to collect
> garbage; however, it doesn't help. Does anyone have a method to
> release the wasted memory?

This has been asked and answered on R-help many times (a good candidate to add to the FAQ?). As an example, see:

http://tolstoy.newcastle.edu.au/R/help/04/06/1662.html

Andy

> Thank you all.
> Regards

______________________________________________
[EMAIL PROTECTED] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide!
http://www.R-project.org/posting-guide.html
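As a postscript: the standard advice for trimming read.table()'s memory use is to tell it the column types and row count up front, so it need not guess types or grow the result incrementally. A sketch, with hypothetical values (the file name, colClasses, and nrows below are placeholders -- adjust them to the actual file):

```r
## Memory-leaner read.table() usage (column types and row count are
## hypothetical -- set them to match your file).
ft <- read.table("filepath",
                 colClasses = "numeric",  # skip type-guessing and character/factor copies
                 nrows = 1000000,         # pre-size the result; a slight over-estimate is fine
                 comment.char = "")       # disable comment scanning for a small extra saving
object.size(ft)

## gc() returns memory held by intermediate copies to R's own heap, but
## the process size shown in Task Manager may not shrink: freed pages
## are not necessarily handed back to the operating system.
gc()
```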