Greetings,
Just a follow-up on this problem. I am not sure where the problem lies, but we
think the user's code and/or a CRAN package may be the cause. We have
been getting pretty familiar with R recently and we can allocate and load large
datasets into 10+GB of memory. One of our other
On Aug 16, 2013, at 10:19 AM, Stackpole, Chris wrote:
From: David Winsemius [mailto:dwinsem...@comcast.net]
Sent: Friday, August 16, 2013 12:59 PM
Subject: Re: [R] Memory limit on Linux?
[snip]
In short, we don't have a solution yet to this explicit problem
You may consider this to be an explicit problem, but it doesn't read like
something
From: Kevin E. Thorpe [mailto:kevin.tho...@utoronto.ca]
Sent: Tuesday, August 13, 2013 2:25 PM
Subject: Re: [R] Memory limit on Linux?
It appears that at the shell level, the differences are not to blame.
It has been a long time, but years ago in HP-UX, we needed to change an actual
From: Jack Challen [mailto:jack.chal...@ocsl.co.uk]
Sent: Wednesday, August 14, 2013 10:45 AM
Subject: RE: Memory limit on Linux?
(I'm replying from a horrific WebMail UI. I've attempted to maintain
what I think is sensible quoting. Hopefully it reads ok).
[snip]
If all users are able to
From: Kevin E. Thorpe [mailto:kevin.tho...@utoronto.ca]
Sent: Monday, August 12, 2013 11:00 AM
Subject: Re: [R] Memory limit on Linux?
What does ulimit -a report on both of these machines?
Greetings,
Sorry for the delay. Other fires demanded more attention...
For the system in which
On 08/13/2013 03:06 PM, Stackpole, Chris wrote:
Greetings,
I have a user who is running an R program on two different Linux systems. For
the most part, they are very similar in terms of hardware and 64-bit OS.
However, they perform significantly differently. On one box the program uses
upwards of 20GB of RAM but fluctuates around 15GB of RAM
On 08/12/2013 10:18 AM, Stackpole, Chris wrote:
On Aug 5, 2012, at 3:52 PM, alan.x.simp...@nab.com.au wrote:
Dear all
I have a Windows Server 2008 R2 Enterprise machine, with 64-bit R installed,
running on 2 x quad-core Intel Xeon 5500 processors with 24GB DDR3 1066 MHz
RAM. I am seeking to analyse very large data sets (perhaps as much
On 06.08.2012 09:34, David Winsemius wrote:
On 06/08/2012 09:42, Uwe Ligges wrote:
Alan,
More RAM will definitely help. But if you have an object needing more than
2^31 - 1 (~2 billion) elements, you'll hit a wall regardless. This could be
particularly limiting for matrices. It is less limiting for data.frame
objects (where each column can hold up to 2 billion elements). But many R
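The arithmetic behind that 2^31 - 1 wall can be sketched directly (this assumes the historical atomic-vector limit described above; R >= 3.0.0 raised it for 64-bit builds):

```r
# Why a square matrix hits the element limit long before a data.frame does:
# a matrix is one atomic vector, a data.frame is one vector per column.
max.len <- 2^31 - 1          # 2147483647, the historical vector-length cap
n <- floor(sqrt(max.len))    # 46340: largest possible square matrix side
n * n <= max.len             # TRUE  (46340^2 = 2147395600 fits)
(n + 1)^2 > max.len          # TRUE  (46341^2 exceeds the cap)
```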
Dear all
I have a Windows Server 2008 R2 Enterprise machine, with 64-bit R installed,
running on 2 x quad-core Intel Xeon 5500 processors with 24GB DDR3 1066 MHz
RAM. I am seeking to analyse very large data sets (perhaps as much as
10GB), without the additional coding overhead of a package such
Hi,
Before someone gives professional advice, you may try an experiment:
set the Windows virtual memory to be as large as ~128GB (make sure the
hard drive has enough space; a restart might be required);
increase the memory limit in R;
load a big dataset (or iteratively assign it to an object, and
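A minimal sketch of that experiment, assuming an older Windows build of R (`memory.limit()` and `memory.size()` are Windows-only and were made defunct in R 4.2.0):

```r
# Windows-only sketch: raise R's soft cap, then probe with a large allocation.
memory.limit()                 # current cap, in MB
memory.limit(size = 131072)    # request ~128 GB; needs a page file that large
x <- numeric(2^30)             # one ~8 GB object; grow or repeat it to probe
```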
Hi Peter,
Thanks for this information.
I used a column concatenating the listBy data to do this aggregation (I
don't know if it's the best solution, but it seems to work):
aggregateMultiBy <- function(x, by, FUN){
tableBy <- data.frame(by)
tableBy$byKey =
for(colBy in
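Since the function above is cut off, here is a self-contained sketch of the same concatenated-key idea (the data and names below are illustrative, not from the original post): paste the by-columns into a single key, so the aggregation only groups on combinations that actually occur.

```r
# Build one grouping key per row, so unused factor-level combinations
# never enter the aggregation.
aggregateMultiBy <- function(x, by, FUN) {
  byKey <- do.call(paste, c(by, sep = "\r"))  # "\r" as an unlikely separator
  aggregate(x, by = list(byKey = byKey), FUN = FUN)
}

x  <- data.frame(v = 1:6)
by <- data.frame(g1 = c("a", "a", "b", "b", "a", "b"),
                 g2 = c("x", "x", "y", "y", "x", "y"))
aggregateMultiBy(x, by, sum)   # 2 observed combinations, not the 4 possible
```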
Dear all,
I am trying to aggregate a table (divided in two lists here), but get a
memory error.
Here is the code I'm running :
sessionInfo()
print(paste("memory.limit()", memory.limit()))
print(paste("memory.size()", memory.size()))
print(paste("memory.size(TRUE)",
On Aug 2, 2011, at 11:45 , Guillaume wrote:
Hi Peter,
Thanks for your answer.
I made a mistake in the script I copied, sorry!
The description of the objects: listX has 3 columns, listBy has 4 columns, and
they have 9000 rows:
print(paste("ncol x", length(listX)))
print(paste("ncol By", length(listBy)))
print(paste("nrow",
On Aug 2, 2011, at 17:10 , Guillaume wrote:
So what are the contents of listBy? If they are all factors with 100 levels,
then
Hi Peter,
Yes, I have a large number of factors in the listBy table.
Do you mean that aggregate() creates a complete cartesian product of the
by columns? (and creates combinations of values that do not exist in the
original by table, before removing them when returning the aggregated
table?)
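Whether or not aggregate() forms that product internally (this has varied across R versions), the underlying level explosion is easy to demonstrate with interaction(), which keeps one level per *possible* combination unless drop = TRUE; the factors below are made up for illustration:

```r
# Two factors with 50 levels each: interaction() has 50 * 50 = 2500 levels,
# even though at most 100 combinations can occur in 100 rows.
set.seed(1)
f1 <- factor(sample(1:50, 100, replace = TRUE), levels = 1:50)
f2 <- factor(sample(1:50, 100, replace = TRUE), levels = 1:50)
nlevels(interaction(f1, f2))               # 2500 possible combinations
nlevels(interaction(f1, f2, drop = TRUE))  # only the observed ones (<= 100)
```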
On Aug 2, 2011, at 19:09 , Guillaume wrote:
Hi Peter,
Yes I have a large number of factors in the listBy table.
Do you mean that aggregate() creates a complete cartesian product of the
by columns ? (and creates combinations of values that do not exist in the
orignial by table, before
)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional 3GB" /3GB /noexecute=optin /fastdetect
-Original Message-
From: r-help-boun...@r-project.org on behalf of Tim Clark
Sent: Tue 10/12/2010 5:49 AM
To: r help r-help
Cc: Tim Clark
Subject: [R] Memory limit problem
Dear List,
I am
Dear List,
I am trying to plot bathymetry contours around the Hawaiian Islands using the
package rgdal and PBSmapping. I have run into a memory limit when trying to
combine two fairly small objects using cbind(). I have increased the memory to
4GB, but am being told I can't allocate a vector
On Oct 11, 2010, at 11:49 PM, Tim Clark wrote:
-Original Message-
From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org]
On Behalf Of David Winsemius
Sent: Monday, October 11, 2010 10:07 PM
To: Tim Clark
Cc: r help r-help
Subject: Re: [R] Memory limit problem
On Oct 11, 2010, at 11:49 PM, Tim Clark wrote
Hi,
when I try to import a microarray CEL batch, I get this error message:
myAB <- ReadAffy()
Error in .Call(read_abatch, filenames, rm.mask, rm.outliers, rm.extra, :
cannot allocate vector of length 1287151200
which, assuming the value is in bytes, is below my RAM (3 GB recognized
On 08.05.2010 18:52, Zoppoli, Gabriele (NIH/NCI) [G] wrote:
This seems to be more related to the BioC mailing list.
On 05/08/2010 10:00 AM, Uwe Ligges wrote:
yes
Hi,
I have Win XP 32-bit, 4GB DDR2 and R 2.9.2.
I have memory limit problems.
memory.limit(4090)
[1] 4090
memory.limit()
[1] 4090
a <- trans.matrix.f(7) # makes a big 16384*16384 integer matrix
Error: cannot allocate vector of size 512.0 Mb
I don't have any other objects in R memory.
What should I do?
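For context on why that fails, a quick back-of-the-envelope in R (the sizes are exact; the binding constraint on 32-bit Windows is the roughly 2 GB user address space of the process, not installed RAM):

```r
# Memory needed for a 16384 x 16384 matrix, in MB:
n <- 16384
n * n * 8 / 2^20   # as double:  2048 MB in one contiguous block
n * n * 4 / 2^20   # as integer: 1024 MB
# The "512.0 Mb" in the error is only the *next* contiguous allocation that
# failed, on top of whatever the session already holds.
```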
Hi,
On Thu, Sep 10, 2009 at 8:24 PM, oleg portnoyoleg.portno...@gmail.com wrote:
Hi all, I'm doing a discrete choice model in R and keep getting this error:
Error: cannot allocate vector of size 198.6 Mb.
Does this mean the memory limit in R has been reached?
memory.size()
[1] 1326.89
memory.size(TRUE)
[1] 1336
memory.limit()
[1] 1535
My laptop has a 4G memory with
Hongwei Dong wrote:
The size of my .Rdata workspace is about 9.2 M and the data I'm using is the
only one object in this workspace. Is it a large one?
Thanks.
Harry
On Tue, Aug 18, 2009 at 4:21 AM, jim holtman jholt...@gmail.com wrote:
About 2GB is the limit of the address space on 32-bit Windows (you can get
do 'object.size' on all the objects in 'ls()'; also show the output of
'gc()'
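That diagnostic can be sketched in a couple of lines (a generic workspace audit, not code from the original thread):

```r
# Report the size of every object in the workspace, largest first,
# then the garbage collector's view of the heap. Assumes the workspace
# is non-empty.
sizes <- sapply(ls(), function(nm) object.size(get(nm)))
print(sort(sizes, decreasing = TRUE))  # bytes per object
gc()                                   # Ncells/Vcells used and trigger levels
```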
Sent from my iPhone
On Aug 18, 2009, at 13:35, Hongwei Dong pdxd...@gmail.com wrote:
Good afternoon,
The short answer is yes; the long answer is it depends.
It all depends on what you want to do with the data. I'm working with
data frames of a couple of million lines on this plain desktop machine, and
for my purposes it works fine. I read in text files, manipulate them,
convert
I'm currently working with very large datasets that consist of 1,000,000+
rows. Is it at all possible to use R for datasets this size, or should I
rather consider C++/Java?
I routinely compute with a 2,500,000-row dataset with 16 columns,
which takes 410MB of storage; my Windows box has 4GB, which avoids
thrashing. As long as I'm careful not to compute and save multiple
copies of the entire data frame (because 32-bit Windows R is limited
to about 1.5GB address space
On Wed, Nov 26, 2008 at 1:16 PM, Stavros Macrakis [EMAIL PROTECTED] wrote: