Hello Everyone!

Thanks for all your replies! This was very helpful! I found that there
seems to be a limit of only 32GB of memory, which I think will be fine.
I was able to consume the 32GB of memory with the following:

Start R with the following command:  R --max-vsize=55000M

then source("scripts/test.memory.r"), where test.memory.r is Whit's suggestion:

Whit Armstrong wrote:
Seems strange. I can go all the way up to 50GB on our machine, which
has 64GB as well. It starts swapping after that, so I killed the
process.

try this:

ans <- list()
for (i in 1:100) {
    ans[[i]] <- numeric(2^30 / 2)   # allocate 2^29 doubles per iteration
    cat("iteration:", i, "\n")
    print(gc())
}
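(For scale -- arithmetic not stated in the thread: each numeric(2^30/2) call
allocates 2^29 doubles at 8 bytes apiece, so the loop asks for 4GB per
iteration and roughly 400GB in total if it ever finished, which is why it is
expected to hit the limit or start swapping well before iteration 100.)

## Rough sizing of the loop above, illustrative arithmetic only:
(2^30 / 2) * 8 / 2^30          # 4   -- GiB allocated per iteration
100 * (2^30 / 2) * 8 / 2^30    # 400 -- GiB if all 100 iterations completed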
Hey Whit,
That worked! I was able to consume all the memory on the server!
Thanks!
-scz
Scott Zentz wrote:
Hello Everyone,
We have recently purchased a server which has 64GB of memory running
a 64-bit OS, and I have compiled R from source with the following config:
./configure --prefix=/usr/local/R-2.9.1 --enable-R-shlib
--enable-BLAS-shlib --enable-shared --with-readline --with-iconv
--with-x
How can I verify that R will actually be able to use that memory?
You could probably just make a big array and watch top usage -- a 5GB
array would do the trick -- if you can break 4GB you are golden.
big_vector=c(1:100) and keep adding zeroes...
--j
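(A minimal sketch of that approach; the sizes and function calls below are my
own choices, not from Jonathan's message. First confirm the build is 64-bit,
then allocate a single vector large enough to blow past the 4GB mark and
compare what R and top report.)

## Sketch of the "one big vector" test; sizes are illustrative:
.Machine$sizeof.pointer                        # 8 on a 64-bit build of R
big_vector <- numeric(2^30)                    # 2^30 doubles * 8 bytes = 8GB
print(object.size(big_vector), units = "Mb")   # what R thinks the object occupies
print(gc())                                    # R's view of allocated memory
## ...then check the resident size of the R process in top.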
On Jul 6, 2009, at 4:42 PM, Jonathan Greenberg wrote:
You could probably just make a big array and watch top usage -- a 5GB
array would do the trick -- if you can break 4GB you are golden. ...
Except the maximum size for a vector (and I wonder ...
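(The limit being alluded to here is presumably the cap on vector length in R
of that era: a single atomic vector could hold at most 2^31 - 1 elements, so
one vector alone tops out well short of 32GB, which is why the list-of-vectors
loop further up is the more robust test. The arithmetic, under that assumption:)

## Assumed context: pre-3.0.0 R caps a vector at 2^31 - 1 elements.
max_len <- 2^31 - 1
max_len * 8 / 2^30   # ~16GB at most for a double vector
max_len * 4 / 2^30   # ~8GB at most for an integer vector such as 1:n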
On Jul 6, 2009, at 5:01 PM, David Winsemius wrote:
On Jul 6, 2009, at 4:42 PM, Jonathan Greenberg wrote:
You could probably just make a big array and watch top usage ...
check Memory in R:
?Memory
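(The settings that ?Memory documents can also be checked from a running
session; a small sketch, noting that mem.limits() was the relevant helper in
R of that era and has since been dropped from current R:)

## Inspect current memory use and limits from within R (R 2.x era):
gc()           # Ncells (cons cells) and Vcells (vector heap) currently in use
mem.limits()   # limits set via --max-nsize / --max-vsize; NA means unlimited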
--- On Mon, 7/6/09, Scott Zentz ze...@email.unc.edu wrote:
From: Scott Zentz ze...@email.unc.edu
Subject: [R] Testing memory limits in R??
To: r-help@r-project.org
Date: Monday, July 6, 2009, 3:52 PM
Hello Everyone,
We have recently purchased a server which has 64GB of memory ...