Thanks for replying!
I don't think I'm paging. I tried a smaller version of my matrix and did all the checks Jim suggested. The smaller matrix caused another problem, for which I've opened a separate thread, but along the way I found something about memory that I don't understand.
gc()
         used (Mb) gc trigger  (Mb) max used  (Mb)
Ncells  269577 14.4    5570995 297.6  8919855 476.4
Vcells 3353395 25.6    9493567  72.5 15666095 119.6

Does this mean the maximum memory I can use for variables is only about 120 MB?
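(If I understand the help page correctly, "max used" is only a high-water mark since startup, not a ceiling. This is how I think it could be checked; the test matrix below is just an example:)

gc(reset = TRUE)                         # reset the "max used" columns to current usage
x <- matrix(rnorm(16000 * 100), 16000)   # allocate roughly 12.8 MB of doubles
gc()                                     # "max used" now reflects the new allocation, not a fixed cap
rm(x)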
However, when I tried to check the memory limits:
mem.limits()
nsize vsize
  NA    NA

So here it seems the maximum memory is not limited at all?
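(From the help page, I gather these values would only be non-NA if R had been started with explicit limits, something like the following; I have not tried this myself:)

## hypothetical startup with explicit limits, from the shell:
##   R --max-vsize=1000M --max-nsize=20M
mem.limits()   # would then report those values instead of NA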

While no R function was executing, I checked the R process with:
ps u

PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
7821  0.0  0.1  10048  2336 pts/0    Ss   Jul18   0:00 -bash
8076  2.9 24.5 523088 504004 pts/0   S+   Jul18   2:46 /usr/lib64/R/bi
8918  1.5  0.1   9912  2328 pts/1    Ss   00:44   0:00 -bash
8962  0.0  0.0   3808   868 pts/1    R+   00:45   0:00 ps u

Does this mean R is using about 25% of my memory? But my RAM is 2 GB, and according to gc() the objects in R only occupy about 40 MB.
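(To double-check the 40 MB figure against what ps reports, I suppose I could sum the object sizes in my workspace directly; a rough sketch, which probably does not count R's internal bookkeeping:)

sizes <- sapply(ls(), function(nm) object.size(get(nm)))  # size of each object, in bytes
head(sort(sizes, decreasing = TRUE))                      # the largest objects
sum(sizes) / 2^20                                         # total, in MB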

Did I interpret it wrong?

Thanks a lot!



From: "jim holtman" <[EMAIL PROTECTED]>
To: "zhihua li" <[EMAIL PROTECTED]>
CC: r-help@stat.math.ethz.ch
Subject: Re: [R] memory error with 64-bit R in linux
Date: Wed, 18 Jul 2007 17:50:31 -0500

Are you paging? That might explain the long run times. How much space
are your other objects taking up? The matrix by itself should only
require about 13 MB if it is numeric; I would guess it is some of the
other objects in your workspace. Put some gc() calls in your loop to
see how much space is being used. Run it with a subset of the data
and see how long it takes; that should give you an estimate of the
time and space needed for the entire dataset.
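
Something along these lines, for example (just a sketch; 'mydata' and
the loop body stand in for whatever you are actually running):

sub <- mydata[1:1000, ]                  # first 1000 rows of the full matrix
system.time(
    for (i in 1:nrow(sub)) {
        ## ... your per-row computation on 'sub' goes here ...
        if (i %% 100 == 0) print(gc())   # watch how much space is in use
    }
)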

Do a 'ps' to see how much memory your process is using, and do one
every couple of minutes to see if it is growing. You can always use
Rprof() to get an idea of where the time is being spent (use it on a
small subset).
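
For instance (replace 'mydata' with your matrix):

Rprof("heatmap-profile.out")            # start the profiler, writing to a file
heatmap(mydata[1:500, ])                # run on a small subset first
Rprof(NULL)                             # turn the profiler off
summaryRprof("heatmap-profile.out")     # see where the time is going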

On 7/18/07, zhihua li <[EMAIL PROTECTED]> wrote:
Hi netters,

I'm using 64-bit R-2.5.0 on an x86-64 CPU with 2 GB of RAM. The
operating system is SUSE 10.
The system information is:
uname -a
Linux someone 2.6.13-15.15-smp #1 SMP Mon Feb 26 14:11:33 UTC 2007 x86_64
x86_64 x86_64 GNU/Linux

I used heatmap() to process a matrix of dimension [16000, 100]. After 3 hours
of desperate waiting, R told me:
cannot allocate vector of size 896 MB.

I know the matrix is very big, but since I have 2 GB of RAM on a 64-bit
system, shouldn't it be possible to handle a vector smaller than 1 GB?
(I was not running any other applications on the system.)
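
(A rough calculation of where a vector of that size might come from, if
heatmap() really does compute a full dist() on the rows; I'm not sure this
is what happens:)

n <- 16000
n_pairs <- n * (n - 1) / 2    # pairwise distances stored by dist(): ~1.28e8
n_pairs * 8 / 2^20            # about 976 MB of doubles, before clustering even starts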

Does anyone know what's going on? Is there a hardware limit, so that I have
to add more RAM, or is there some way to resolve it in software? Also, is it
possible to speed up the computation? (I don't want to wait another 3 hours
just to get another error message.)

Thank you in advance!







--
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem you are trying to solve?

______________________________________________
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
