Hi,
 
I have four huge tables on which I want to run a PCA and a k-means
clustering. If I run each table individually I have no problems, but if I run
them in a for loop I exceed the memory allocation after the second table, even
though I save the results to a CSV file and remove all the big objects with
rm(). It seems that even once the objects are gone, the memory they occupied
is not released. Is there any way to clear that memory as well? I don't want
to have to close R and restart it. Also, I am running R under Windows.
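
Roughly, the loop looks like the sketch below. The file names, the choice of
prcomp() and kmeans(), the number of clusters, and keeping only the first two
components are placeholders for my real setup, but the structure is the same:

files <- paste("table", 1:4, ".csv", sep = "")

for (f in files) {
  dat <- read.csv(f)                        # load one big table
  pca <- prcomp(dat, scale. = TRUE)         # PCA on the numeric columns
  km  <- kmeans(pca$x[, 1:2], centers = 3)  # k-means on the first two PCs

  # save the results so nothing has to stay in the workspace
  out <- data.frame(pca$x[, 1:2], cluster = km$cluster)
  write.csv(out, file = sub("\\.csv$", "_results.csv", f), row.names = FALSE)

  # remove the big objects before the next iteration;
  # this is the step that does not seem to free the memory
  rm(dat, pca, km, out)
}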
 
Thanks,
 
Monica