Dear list,

when iterating over a set of Rdata files that are loaded, analyzed, and 
then removed from memory again, I see a *significant* increase in the R 
process's memory consumption, which eventually kills the process.

Removing the object via rm() and calling gc() seem to have no effect: 
the memory consumed by each loaded R object accumulates until there is 
none left :-/

Possibly this is also related to XML package functionality (mainly 
htmlTreeParse() and getNodeSet()), but I see the same behavior when 
simply loading and removing Rdata files in a loop.
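For reference, the XML part is roughly along these lines (the URL and 
XPath expression are placeholders):

    library(XML)
    doc   <- htmlTreeParse("http://example.com", useInternalNodes = TRUE)
    nodes <- getNodeSet(doc, "//a")
    ## ... process the node set ...
    rm(nodes, doc)
    gc()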

I've put together a small example that illustrates the memory 
ballooning described above; you can find it here: 
http://stackoverflow.com/questions/9220849/significant-memory-issue-in-r-when-iteratively-loading-rdata-files-killing-the

Is this a bug? Any chance of working around this?

Thanks a lot and best regards,
Janko


