Hi all. I have an issue that I cannot resolve. I am trying to read in lots of 
data stored in XML files. After I read each file in, copy the relevant data, 
and remove the document, the memory is not freed. When I watch it in the 
Windows Task Manager, the memory usage climbs with each iteration until R 
crashes. I can replicate the problem with this small example:
        file.name<-"C:\\MyData.xml.gz"
        TEMPP<-xmlParse(file.name)
        xx <- xmlRoot(TEMPP)
        rm(xx)
        rm(TEMPP)
        gc()
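
For context, the real job loops over many such files, along these lines (the 
directory, file pattern, and extraction step below are placeholders for 
illustration; the actual code pulls specific nodes out of each document):

        library(XML)

        # Hypothetical directory of gzipped XML files
        files <- list.files("C:\\xml_data", pattern = "\\.xml\\.gz$",
                            full.names = TRUE)
        results <- vector("list", length(files))
        for (i in seq_along(files)) {
            doc <- xmlParse(files[i])                # parse one file
            results[[i]] <- xmlValue(xmlRoot(doc))   # stand-in for the real extraction
            rm(doc)                                  # drop the R reference
            gc()                                     # memory still climbs each iteration
        }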

Even though I remove both the root node xx and the document TEMPP and then 
call gc(), the memory usage stays the same as it was right after parsing. Any 
ideas or solutions?
I am using 32-bit R 2.14.0 on Windows XP, with the latest version of the XML 
package (3.6.1).
Many thanks
Andrew
