> From: David Winsemius [mailto:dwinsem...@comcast.net] 
> Sent: Friday, August 16, 2013 12:59 PM
> Subject: Re: [R] Memory limit on Linux?
[snip] 
> > In short, we don't have a solution yet to this explicit problem
>
> You may consider this to be an "explicit problem" but it doesn't read like
> something that is "explicit" to me. If you load an object that takes 10GB
> and then make a modification to it, there will be two or three versions of
> it in memory, at least until the garbage collector runs. Presumably your
> external *NIX methods of assessing memory use will fail to understand this
> fact of R-life.

Hrm. Maybe "explicit" was the wrong word. Maybe "specific" would have been a 
better choice. Sorry.

What I was trying to say is that we cannot reproduce this exact problem in any 
other form than with this user's particular code and dataset. The problem is 
therefore very narrow in scope and tied to that code/dataset, not to R in 
general. Where this odd behavior comes from is still undetermined, though I 
have at least narrowed the range of possibilities down significantly.
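
For anyone following along, the copy-on-modify behavior David describes can be 
observed directly with tracemem() (assuming an R build with memory profiling 
enabled, which is the default for the standard binaries). This is only an 
illustrative sketch, not the user's actual code:

```r
x <- rnorm(1e6)   # a numeric vector, roughly 8 MB
y <- x            # y and x share the same memory; no copy is made yet
tracemem(x)       # report the address and announce any future duplication
x[1] <- 0         # modifying shared data forces a full copy of x
untracemem(x)     # stop tracking
gc()              # the superseded allocation is reclaimed here (or earlier)
```

So between the assignment and the gc() call there really are two full copies 
of the data resident, which is exactly the window where external tools like 
top or ps will report more memory than the "size" of the object suggests.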

Thanks!

Chris Stackpole

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
