Hello,

I have something like:

out <- list()

for (i in 1:n) {
  data <- gen(...)       # fixed-size data each iteration
  out[[i]] <- fun(data)  # store one small result per iteration
}
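
For concreteness, here is a self-contained stand-in that reproduces the
pattern (gen() and fun() below are hypothetical placeholders; my real
functions are more involved, but they also return fixed-size results):

## hypothetical stand-ins, only to make the example reproducible
gen <- function() rnorm(100)                     # fixed-size data
fun <- function(d) list(m = mean(d), s = sd(d))  # small summary per dataset

n <- 1000
out <- list()
for (i in 1:n) {
  data <- gen()
  out[[i]] <- fun(data)
}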

> object.size(out[[1]])
6824

In principle, 1 GB should therefore allow about

n = 1024^3 / 6824 ≈ 157,347

elements, shouldn't it?
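
One thing I am not sure about: object.size() on a single element could
understate the true per-element cost, e.g. if fun() returned an object
that carries an environment with it. A sketch of the check I had in mind
(using the stand-ins above; I believe print() on an object.size value
accepts a units argument, though I have not verified that on 2.8.0):

gc(reset = TRUE)                       # reset the "max used" counters
print(object.size(out), units = "Mb")  # size of the whole list at once
gc()                                   # compare "max used" with n * 6824 bytes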

About 2 GB of RAM is not taken up by other processes. However, I can see
free memory shrinking quite rapidly in my system monitor, and I have to
stop the simulation after only n = 300. Why such a discrepancy? Any remedy?
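
Would preallocating the list help, so that out is not copied each time it
grows past its current length? A sketch of what I would try:

out <- vector("list", n)    # preallocate to the final length
for (i in 1:n) {
  out[[i]] <- fun(gen())    # no named intermediate kept alive
  if (i %% 100 == 0) gc()   # occasionally force a collection
}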

Platform: x86_64-pc-linux, RKWard, R 2.8.0, 4 GB RAM.

Thanks.
