[R] Out of memory crash on 64-bit Fedora

2009-03-27 Thread Adam Wilson
Greetings all,

First of all, thanks to all of you for creating such a useful, powerful
program.

I regularly work with very large datasets (several GB) in R on 64-bit Fedora
8 (details below).  I'm lucky to have 16 GB of RAM available.  However, if I am
not careful and load too much into R's memory, I can crash the whole
system.  There does not seem to be a check in place that stops R from
trying to allocate all available memory (including swap space).  I have
system status plots in my task bar, so I can watch as all the RAM is taken
and R then claims all the swap space.  If I don't kill the R process before
swap hits 100%, the machine freezes.  I don't know whether this is an R
problem or a Fedora problem (I suppose the kernel should be killing R before
the machine locks up, but shouldn't R stop before it takes all the memory?).

To replicate this behavior, I can crash the system by allocating more and
more memory in R:
v1=matrix(nrow=1e5,ncol=1e4)
v2=matrix(nrow=1e5,ncol=1e4)
v3=matrix(nrow=1e5,ncol=1e4)
v4=matrix(nrow=1e5,ncol=1e4)
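# (For scale, each matrix above is 1e5 x 1e4 = 1e9 logical NA cells, roughly
#  4 GB apiece; a quick check on a much smaller matrix illustrates this:)
object.size(matrix(nrow=1e3,ncol=1e3))  # about 4 MB; the matrices above are 1000x larger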

etc., until R claims all RAM and swap space and crashes the machine.  If I
try this on a Windows machine, the allocation eventually fails with an error
in R, "Error: cannot allocate vector of size XX MB".  This is much
preferable to crashing the whole system.  Why doesn't this happen on Linux?

Is there some setting that will prevent this?  I've looked through the
archives and have not found a similar problem.

Thanks for any help.

Adam



The facts:
> sessionInfo()
R version 2.8.0 (2008-10-20)
x86_64-redhat-linux-gnu

locale:
LC_CTYPE=en_US.UTF-8;LC_NUMERIC=C;LC_TIME=en_US.UTF-8;LC_COLLATE=en_US.UTF-8;LC_MONETARY=C;LC_MESSAGES=en_US.UTF-8;LC_PAPER=en_US.UTF-8;LC_NAME=C;LC_ADDRESS=C;LC_TELEPHONE=C;LC_MEASUREMENT=en_US.UTF-8;LC_IDENTIFICATION=C

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base
> version
               _
platform       x86_64-redhat-linux-gnu
arch           x86_64
os             linux-gnu
system         x86_64, linux-gnu
status
major          2
minor          8.0
year           2008
month          10
day            20
svn rev        46754
language       R
version.string R version 2.8.0 (2008-10-20)




Re: [R] Out of memory crash on 64-bit Fedora

2009-03-27 Thread Peter Dalgaard

R won't know that it is running out of memory until the system refuses
to allocate any more. However, this is not as clear-cut on Unix/Linux as
on other OSes. Because some programs allocate a lot of memory without
actually using it, the kernel may allow more memory to be allocated than
is actually in the system, and then kill off processes if it finds
itself tied in a knot. (Google "linux oom" for more details.)
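
For illustration (not something from the original post), you can inspect the
kernel's overcommit policy from within R; the /proc paths below are the
standard Linux ones, and this is just a quick sketch:

readLines("/proc/sys/vm/overcommit_memory")  # 0 = heuristic overcommit (default), 1 = always overcommit, 2 = strict accounting
readLines("/proc/sys/vm/overcommit_ratio")   # in mode 2, the percentage of RAM counted toward the commit limit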


You can, however, put a hard limit on R's process size with the ulimit
shell command before starting R (say, ulimit -v 16000000 for a limit of
roughly 16 GB of virtual memory; the value is in kB). Note that if you play
with this, you can only lower the limits, not raise them.
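
For instance, something along these lines (a hypothetical session reusing the
matrix sizes from the original post; exactly when it fails and the size
reported in the error will vary):

$ ulimit -v 16000000   # bash; value in kB, i.e. roughly 16 GB of virtual memory
$ R
> v1=matrix(nrow=1e5,ncol=1e4)
> v2=matrix(nrow=1e5,ncol=1e4)
> v3=matrix(nrow=1e5,ncol=1e4)
> v4=matrix(nrow=1e5,ncol=1e4)
Error: cannot allocate vector of size XX MB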



--
   O__  ---- Peter Dalgaard             Øster Farimagsgade 5, Entr.B
  c/ /'_ --- Dept. of Biostatistics     PO Box 2099, 1014 Cph. K
 (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
~~~~~~~~~~ - (p.dalga...@biostat.ku.dk)              FAX: (+45) 35327907
