On Sat, 26 Aug 2006, Klaus Thul wrote:
> Dear all,
> I have the following problem:
> - I have written a program in R which runs out of physical memory
>   on my MacBook Pro with 1.5 GB RAM
How does R know about physical memory on a virtual-memory OS? I presume
the symptom is swapping by your OS, but how do you attribute that to R?
> - The memory usage is much higher than I would expect from the
>   actual data in my global environment
> - When I save the workspace, quit R, restart R and load the
>   workspace again, memory usage is much less than before
>   (~50 MB instead of ~1.5 GB), although nothing seems to be missing.
> - It doesn't seem to be possible to reduce the amount of memory
>   used by calling gc()
> - The behavior is the same whether I use R from the command line or
>   from the GUI, so I can rule out a problem with the Mac GUI
> Any suggestions as to what the problem might be, or how I could debug this?
How are you measuring memory usage?
If this is by gc(), note that lazyloading is one-way, and memory increases
after you use a lazyloaded object in a session. This can be dramatic
where datasets are involved.
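To illustrate (a sketch only, assuming a package whose datasets are
lazy-loaded, as MASS's are; the actual growth depends on the dataset):

    gc(reset = TRUE)        # baseline after a full collection
    library(MASS)           # MASS lazy-loads its datasets
    invisible(dim(Boston))  # touching Boston forces the lazy-load promise
    gc()                    # 'used' has grown and stays up for the session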
If you are asking the OS about memory usage, the return of memory to the OS
is a rather haphazard and OS-specific issue. On some OSes virtual memory is
not actually reclaimed from applications until it is needed, if ever.
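You can put the two views side by side on a Unix-alike along these lines
(a sketch: the large object is made up, and the ps flags vary by OS):

    x <- matrix(0, 5000, 5000)   # hypothetical large object, ~200 MB of doubles
    gc()                         # R's view: 'used' includes x
    rm(x); gc()                  # R frees the memory internally
    system(paste("ps -o rss,vsz -p", Sys.getpid()))  # the OS's view: vsz may
                                                     # still include those pages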
It is certainly possible that something you are using has a memory leak,
but valgrind has been used to plug such leaks in the most-used parts of R.
I at least need more precise indications of what you observed to be able
to comment more.
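For instance, output along the following lines would make the comparison
concrete (a sketch: it totals object.size() over the global environment and
shows R's own heap statistics):

    objs  <- ls(envir = globalenv())
    sizes <- sapply(objs, function(nm) object.size(get(nm, envir = globalenv())))
    print(sum(sizes) / 1024^2)   # total size of your objects, in MB
    print(gc())                  # R's Ncells/Vcells heap statistics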
--
Brian D. Ripley,                  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
______________________________________________
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.