On Fri, Dec 13, 2013, at 05:36 AM, Zé Loff wrote:
> Hi all
> 
> First of all, sorry for the kind of newbie question.
> I'm running some memory-heavy statistical analyses using R, which
> require more memory than what's physically available. I.e. the machine
> (an x201, which is running -current amd64) has 4Gb of physical mem, but R
> needs at least 6Gb. If I understand correctly, this is what virtual mem
> is there for, but -- and here's the newbie part -- I'm not quite sure on
> how to make it work.
[...]
> 3. vmemoryuse=3G + datasize-max=infinity
> Admittedly not knowing what I was doing. Big time SNAFU.
> Everything slows to a crawl when memory usage goes past the available
> phys mem (about 3.6G). And by a crawl I mean unusable if using X,
> requiring great patience if on virtual consoles.
> top shows R using over 1000% (not a typo) CPU although the CPU summary
> lines say they're all idling. "state" is "run/3", "wait" column says
> "pagerse". Swap usage increases, though. R never gets back to a usable
> state.
> 
> Clue bat required. Is there anything else that needs to be done to
> enable R to (properly) use some of the virtual memory?

I think R is using virtual memory as best it can, and I seriously doubt
you will get anything resembling satisfactory performance without
upgrading the RAM (memory) to 8Gb.
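
For reference, the caps you were tweaking are per-login-class limits in
/etc/login.conf. Something along these lines raises the data-size ceiling
(a sketch only: "staff" and the sizes are just examples, use whatever class
your user is actually in, and log out and back in before the new limits
apply):

    # /etc/login.conf -- example values only
    staff:\
            :datasize-max=infinity:\
            :datasize-cur=8192M:\
            :tc=default:

But raising the limits only lets R allocate more address space; it does
nothing to make paging to swap any faster.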

Basic computing terminology: "virtual (something X)" means "(something X)
that isn't really there." "Virtual memory" isn't really RAM (memory): the
part of it that doesn't fit in RAM lives on disk, as swap. And once R spills
past physical memory you get the performance of disk, which is orders of
magnitude slower than RAM.

So: 1) if possible, segment the problem so that R never needs more than
about 3Gb of RAM in any one run (a rough sketch of what I mean is below),
2) upgrade the RAM, or 3) give R a very long time to complete the task at
hand and back up your hard disk regularly, because it will get a workout.
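
For option 1, something along these lines is the general idea (a rough,
untested sketch; the file name, chunk size and the colMeans() stand-in are
placeholders for your real data and analysis):

    ## Process the input in fixed-size chunks so that no single pass
    ## needs the whole data set in memory at once.
    con <- file("bigdata.csv", open = "r")
    header <- readLines(con, n = 1)
    chunk_rows <- 100000        # tune so each chunk keeps R well under ~3Gb
    results <- list()
    repeat {
      lines <- readLines(con, n = chunk_rows)
      if (length(lines) == 0) break
      chunk <- read.csv(textConnection(c(header, lines)))
      ## stand-in for the real per-chunk analysis
      results[[length(results) + 1]] <- colMeans(chunk[sapply(chunk, is.numeric)])
      rm(chunk); gc()           # free the chunk before reading the next one
    }
    close(con)
    combined <- do.call(rbind, results)

Whether that helps depends on whether your analysis can be expressed as
independent passes over pieces of the data; some models simply need all of
it at once.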

-- 
  Shawn K. Quinn
  skqu...@rushpost.com
