Greetings,

Just a follow-up on this problem. I am not sure exactly where the problem lies, 
but we think the user's code and/or a CRAN package may be the cause. We have 
been getting fairly familiar with R recently, and we can allocate and load 
large datasets into 10+ GB of memory without trouble. One of our other users 
runs a program at the start of every week and says it regularly uses 35+ GB of 
memory (indeed, when we tested it on this week's dataset it used just over 
30 GB). So it is clear that the problem is not with R, the system, or any 
artificial limit that we can find.
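
For anyone who wants to run a similar sanity check, something roughly along 
these lines is what I have in mind (the size here is purely illustrative, not 
the user's actual workload):

    ## Rough check that R will happily allocate and hold a large object.
    n <- 1.25e9                         # 1.25e9 doubles * 8 bytes ~ 10 GB
    x <- numeric(n)                     # allocate ~10 GB of zeros
    print(object.size(x), units = "Gb") # confirm the object really is that big
    gc()                                # report what R has actually allocated
    rm(x); gc()                         # release the memory again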

So why is there a difference in memory usage between one system and the other 
when running what should be exactly the same code? First off, I am not 
convinced it really is the same dataset, even though that is the claim (I 
don't have access to verify it for various reasons). Second, he is using some 
libraries from the CRAN repositories. A few months ago we found an instance 
where a badly compiled library was behaving strangely; I recompiled that 
library and it straightened out. I am wondering whether that is the case 
again. The user is going through his set of libraries now.
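
If it does turn out to be another bad build, the fix last time was simply 
rebuilding the package locally; something along these lines should do it 
("somepkg" is a placeholder for whichever library is under suspicion):

    ## Reinstall a suspect package from source so it is recompiled locally.
    install.packages("somepkg", type = "source")
    ## Or update any packages built under an older version of R:
    ## update.packages(checkBuilt = TRUE, type = "source")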

In short, we don't have a solution to this specific problem yet, but at least 
I know for certain it isn't the system or R. Now that I can take a solid 
stance on those facts, I have good ground to approach the user and politely 
say, "Let's look at how we might be able to improve your code."

Thanks to everyone who helped me debug this issue. I do appreciate it.

Chris Stackpole 

