Greetings,
I am running R 1.6.1 on a Linux system with 2 GB of physical memory and dual
processors. When I use R CMD BATCH to run a simulation on a very large dataset, I get
through only about one and a half repetitions before R exhausts memory and the job crashes.
R starts at roughly 10% of memory, climbs to about 50% by the end of the first repetition,
and begins the second repetition at approximately 65%. I am using rm() at the end of each
section, so only a small matrix of results is kept. Any suggestions?
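In case it helps, the loop has roughly this structure (the data generation and model
below are just stand-ins for the actual simulation, only to show where rm() is called):

    n.reps  <- 3
    results <- matrix(NA, nrow = n.reps, ncol = 2)   # small matrix of results that is kept

    for (i in 1:n.reps) {
        big.data <- data.frame(x = rnorm(1e6), y = rnorm(1e6))  # stand-in for the very large dataset
        fit <- lm(y ~ x, data = big.data)                       # stand-in analysis step
        results[i, ] <- coef(fit)                               # only a small summary is stored
        rm(big.data, fit)                                       # large objects removed at the end of each repetition
    }

Even with the rm() calls, memory use keeps climbing from one repetition to the next.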
Thanks,
Annette