From: "eliza botto" [eliza_bo...@hotmail.com]
Date: 11/11/2014 02:35 PM
To: "r-help@r-project.org"
Subject: [R] R memory issues
Dear useRs,
I have this funny thing going on with me since morning. I am on a 32-bit Windows 7
system with 4 GB RAM (2.95 GB usable). I tried to run a code [...]
You may try to increase the virtual memory:
http://windows.microsoft.com/en-us/windows/change-virtual-memory-size#1TC=windows-7
-----Original Message-----
From: "eliza botto" [eliza_bo...@hotmail.com]
Date: 11/11/2014 02:35 PM
To: "r-help@r-project.org"
Subject: [R] R memory issues [...]
The short answer is "get a bigger computer or find a way to do the
computation using less memory".
Best,
Ists
On Nov 11, 2014 2:34 PM, "eliza botto" wrote:
> Dear useRs,
> I have this funny thing going on with me since morning. I am on a 32-bit Windows
> 7 system with 4 GB RAM (2.95 GB usable). I tried to [...]
On 11.11.2014 20:32, eliza botto wrote:
Dear useRs,
I have this funny thing going on with me since morning. I am on a 32-bit Windows 7
system with 4 GB RAM (2.95 GB usable). I tried to run a code on it, but when I tried
to convert the data frame to a matrix using the following code
mat <- matrix(as.numeric(unlist(SFI)), nrow = nrow(SFI)) [...]
Dear useRs,
I have this funny thing going on with me since morning. I am on a 32-bit Windows 7
system with 4 GB RAM (2.95 GB usable). I tried to run a code on it, but when I tried
to convert the data frame to a matrix using the following code
mat <- matrix(as.numeric(unlist(SFI)), nrow = nrow(SFI))
*where SFI is my [...]
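As an aside, the unlist()/as.numeric() route makes several temporary copies of the whole data set, which is often what exhausts a 32-bit address space. A less copy-hungry sketch, assuming all columns of SFI are numeric:

mat <- data.matrix(SFI)   # converts a data frame straight to a numeric matrix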
Hi, I am using a Lenovo ThinkPad with Ubuntu and 5.5 GB of RAM.
I am running up against a memory ceiling.
Upon starting R the following command executes, but
the system monitor tells me that R is now using 2.4 GB, and
gc() agrees with that:
> m=matrix(data=1,ncol=18e3,nrow=18e3)
> gc()
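That figure matches the back-of-the-envelope arithmetic for a dense numeric matrix, 18000 x 18000 doubles at 8 bytes each:

18e3 * 18e3 * 8 / 2^30   # about 2.4 GiB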
Dear Emiliano:
When they say to read the posting guide, they mostly mean: read the
posting guide. But I'll tell you the short version.
1. Include a full runnable R program that causes the trouble you are
concerned about. Include the data, or a link to the data; usually the
smallest possible example [...]
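As a sketch of what such a minimal runnable example might look like (the data and model here are purely hypothetical):

set.seed(42)                          # make the fake data reproducible
dat <- data.frame(x = rnorm(20), y = rnorm(20))
fit <- glm(y ~ x, data = dat)         # the call that triggers the trouble
sessionInfo()                         # include platform and version details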
As a continuation of my original question, here is the message that I get:
Error in glm.fit(x = structure(c(1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, :
cannot allocate memory block of size 2.1 Gb
The model in "glm.fit" is a logistic-type model (in the GLM family). Maybe
this is not enough information [...]
Alright then, thank you everyone. This information was extremely useful, and
I'll do a better job searching the web next time.
On Sun, May 20, 2012 at 2:10 PM, Prof Brian Ripley wrote:
> On 20/05/2012 18:42, jim holtman wrote:
>
>> At the point in time that you get the error message, how big are the
>> objects [...]
On 20/05/2012 18:42, jim holtman wrote:
At the point in time that you get the error message, how big are the
objects that you have in memory? What does 'memory.size()' show as
being used? What does 'memory.limit()' show? Have you tried using
'gc()' periodically to do some garbage collection?
You are on a 64-bit machine, but are you using 64-bit R?
Are you using memory-intensive constructs like those discussed in
Circle 2 of 'The R Inferno'?
http://www.burns-stat.com/pages/Tutor/R_inferno.pdf
Pat
On 20/05/2012 17:09, Emiliano Zapata wrote:
-- Forwarded message --
From: Emiliano Zapata [...]
http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021
Have you read the documentation?
---
Jeff Newmiller
At the point in time that you get the error message, how big are the
objects that you have in memory? What does 'memory.size()' show as
being used? What does 'memory.limit()' show? Have you tried using
'gc()' periodically to do some garbage collection? It might be that
your memory is fragmented.
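Taken together, the usual first round of diagnostics on Windows looks something like this (note that memory.size() and memory.limit() are Windows-only, and were removed in R 4.2):

memory.size()             # MB of RAM currently used by R
memory.size(max = TRUE)   # maximum MB obtained from the OS so far
memory.limit()            # current allocation limit in MB
gc()                      # force a garbage collection and report usage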
Try memory.limit(92000)
sent from my HTC
On May 21, 2012 1:27 AM, "Emiliano Zapata" wrote:
> -- Forwarded message --
> From: Emiliano Zapata
> Date: Sun, May 20, 2012 at 12:09 PM
> Subject:
> To: R-help@r-project.org
>
>
> Hi,
>
> I have a 64-bit machine (Windows) with a total [...]
-- Forwarded message --
From: Emiliano Zapata
Date: Sun, May 20, 2012 at 12:09 PM
Subject:
To: R-help@r-project.org
Hi,
I have a 64-bit machine (Windows) with a total of 192 GB of physical memory
(RAM) and a total of 8 CPUs. I wanted to ask how I can make R use all of
the memory [...]
Hi,
I have read several threads about memory issues in R and I can't seem to
find a solution to my problem.
I am running a sort of LASSO regression on several subsets of a big dataset.
For some subsets it works well, and for some bigger subsets it does not
work, with errors of type "cannot allocate vector of size [...]"
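If the design matrix is mostly zeros (e.g. many dummy variables), a sparse representation can cut memory dramatically. A sketch under that assumption (glmnet and all names here are illustrative -- the poster only says "a sort of LASSO"):

library(glmnet)   # one common LASSO implementation
library(Matrix)
X <- sparse.model.matrix(y ~ . - 1, data = subset_df)   # sparse design matrix
fit <- glmnet(X, subset_df$y, alpha = 1)                # alpha = 1 is the LASSO penalty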
Hi useRs,
I use R within Eclipse via StatET, and it seems to me that some
memory-intensive tasks fail due to this environment.
For example: I was trying to find the distance matrix of a matrix with
(1 rows and 500 columns), and it failed in StatET, whereas it
worked in vanilla R.
I'm using R [...]
Just wanted to leave a note on this, after I got my new iMac (and
installed R64 from the AT&T site) -- quantreg did run, after topping out
at a whopping 12 GB of swap space (Mac OS X, at least, should theoretically
have as much swap space as there is free space on the HD -- it will
dynamically increase [...]
I am trying to run an ANOVA with a within/between-subjects design.
It is a 2 (within) x 3 x 3 x 5 design with 100 observations per cell.
When I try to run this, I get an error message saying that there is
insufficient memory, even after allocating the maximum memory for my
machine (4 GB). [...]
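For scale, the raw data in this design are small -- the memory goes into what the fitting routine builds on top of them:

2 * 3 * 3 * 5 * 100   # only 9000 observations in the full design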
My earlier comment is probably irrelevant, since you are fitting only
one qss component and have no other covariates.
A word of warning, though, when you go back to this on your new machine:
you are almost surely going to want to specify
a large lambda for the qss component in the rqss call.
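For illustration, a call of that shape might look like this (the variable names are hypothetical; only the qss()/lambda pattern comes from the advice above):

library(quantreg)
# larger lambda penalises roughness more, giving a smoother (and cheaper) fit
fit <- rqss(y ~ qss(x, lambda = 100), tau = 0.99, data = dat)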
Yep, it's looking like a memory issue -- we have 6 GB RAM and 1 GB swap.
I did notice that the analysis takes far less memory (and runs) if I do:
tahoe_rq <-
rqss(ltbmu_4_stemsha_30m_exp.img ~ ltbmu_eto_annual_mm.img, tau = .99, data = boundary_data)
(which I assume fits a line to the quantiles)
vs.
[...]
On 24 June 2009 at 14:07, Jonathan Greenberg wrote:
| I installed R 2.9.0 from the Debian package manager on our amd64
| system that currently has 6 GB of RAM -- my first question is whether
| this installation is a true 64-bit installation (should R have access to
| > 4 GB of RAM?). I suspect [...]
Jonathan,
Take a look at the output of sessionInfo(); it should say x86_64 if
you have a 64-bit installation, or at least I think that is the case.
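A quick check of both points at once (the pointer-size idiom is a general R trick, not something from this thread):

sessionInfo()             # the platform string shows x86_64 on a 64-bit build
.Machine$sizeof.pointer   # 8 under 64-bit R, 4 under 32-bit R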
Regarding rqss(), my experience is that (usually) memory problems are
due to the fact that early in the processing there is
a call to model.matrix [...]
useRs:
I installed R 2.9.0 from the Debian package manager on our amd64
system that currently has 6 GB of RAM -- my first question is whether
this installation is a true 64-bit installation (should R have access to
> 4 GB of RAM?). I suspect so, because I was running an rqss() (package
quantreg) [...]
If by "review" you mean read in summary information then sqldf
can do that using the sqlite database in two lines of code.
You don't have to install, set up or define the database at all. sqldf and the
underlying RSQLite will do all that for you. See example
6b on the home page:
http://code.goog
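The two-line idea looks roughly like this (the file and column names are hypothetical; see example 6b for the real one):

library(sqldf)
# read.csv.sql stages the file in a temporary SQLite database and returns
# only the query result, never loading the whole file into R
res <- read.csv.sql("big.csv", sql = "select count(*), avg(value) from file")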
Others may have mentioned this, but you might try loading your data
into a small database like MySQL and then pulling smaller portions of
your data in via a package like RMySQL or RODBC.
Another approach might be to split the data file into smaller pieces
outside of R, then read the smaller pieces in one at a time, as sketched below.
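A sketch of the split-and-read idea done entirely inside R (the file name and chunk size are assumptions):

con <- file("big.csv", open = "r")
hdr <- strsplit(readLines(con, n = 1), ",")[[1]]   # column names
repeat {
  chunk <- tryCatch(
    read.csv(con, header = FALSE, nrows = 100000, col.names = hdr),
    error = function(e) NULL)                      # NULL at end of file
  if (is.null(chunk)) break
  # ... summarise or store each chunk here, then let it be collected ...
}
close(con)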
Neotropical bat risk assessments wrote:
>
> How do people deal with R and memory issues?
> I have tried using gc() to see how much memory is used at each step.
> Scanned the Crawley R Book and all other R books I have available, and the
> FAQ on-line, but no real help found.
> [...]
Then post the material that would make sense for Windows.
What _does_ memory.limit() return? This _was_ asked, and you did not
answer.
How many other objects do you have in your workspace?
How big are they?
Jim Holtman offered this function that displays memory occupation by
object and in total: [...]
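The function itself is cut off in the archive; a common version of the same idea looks like this (a sketch, not necessarily Jim Holtman's exact code):

ls.sizes <- function(envir = globalenv()) {
  objs  <- ls(envir = envir)
  bytes <- vapply(objs,
                  function(x) as.numeric(object.size(get(x, envir = envir))),
                  numeric(1))
  res <- data.frame(object = objs, size_MB = round(bytes / 2^20, 2))
  res <- res[order(-res$size_MB), ]
  rbind(res, data.frame(object = "TOTAL",
                        size_MB = round(sum(bytes) / 2^20, 2)))
}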
On Sun, 26 Apr 2009 09:20:12 -0600, Neotropical bat risk assessments
wrote:
NBRA>
NBRA> How do people deal with R and memory issues?
NBRA> I have tried using gc() to see how much memory is used at each
NBRA> step. Scanned the Crawley R Book and all other R books I have
NBRA> available and the FAQ [...]
On Apr 26, 2009, at 11:20 AM, Neotropical bat risk assessments wrote:
> How do people deal with R and memory issues?
They should read the R FAQ and the Windows FAQ, as you say you have:
http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021
How do people deal with R and memory issues?
I have tried using gc() to see how much memory is used at each step.
Scanned the Crawley R Book and all other R books I have available, and the FAQ
on-line, but no real help found.
Running WinXP Pro (32-bit) with 4 GB RAM.
One SATA drive [...]
Oddly enough, the variogram modelling is rather quick in Surfer, but
one cannot compute the standard errors. I restricted the search to
approximately the range of the variogram model (2000 m). I can get R to
compute with 12079 observations, but with 13453 I run into the gstat
error message.
I think any geostatistical program/R package would have trouble handling
12000 observations on a PC. The empirical variogram would be built from
the combinations of 12000 points taken 2 at a time, nearly 72 million
pairs, and during kriging, if you didn't restrict the search
neighbourhood, interpolation [...]
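The pair count is easy to verify in R:

choose(12000, 2)   # 71994000 -- roughly 72 million point pairs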
I think the clue is that the message you quote comes from gstat, which
does not use R's memory allocator. It is gstat and not R that has failed
to allocate memory.
Try re-reading the help page for memory.size: 'max=T' does not report
the limit (that is the job of memory.limit()), but the maximum amount of
memory obtained from the operating system so far.
Hi all,
I've read the R for Windows FAQ and am a little confused re:
memory.limit and memory.size.
To start R 2.6.2 on WinXP with 2 GB RAM, I use the command line "sdi
--max-mem-size=2047M".
Once the Rgui is open, memory.limit() returns 2047, memory.size()
returns 11.315, and memory.size(max=T) [...]