Re: [R] Memory issues

2006-09-03 Thread Prof Brian Ripley
Please do read the rw-FAQ, Q2.9  (and the posting guide).

In particular, Windows never gives 4GB to a single 32-bit user process.
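
For readers hitting the same limit, a minimal sketch of the relevant calls
in the 32-bit Windows builds of that era (the 2047Mb figure is only an
example; the real ceiling is the one the FAQ describes):

memory.size()              # Mb of heap currently in use
memory.size(max = TRUE)    # peak Mb obtained from Windows so far
memory.limit()             # the current allocation cap, in Mb
memory.limit(size = 2047)  # raise the cap; this succeeds only up to
                           # what 32-bit Windows grants a user process,
                           # however much RAM is installed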

On Sun, 3 Sep 2006, Davendra Sohal wrote:

> Hi,
> I'm using R on Windows and upgraded the computer memory to 4GB, as R was
> telling me that it is out of memory (for making heatmaps).
> It still says that the maximum memory is 1024Mb, even if I increase it using
> memory.limit and memory.size.
> Is there a way to permanently increase R's memory quota to 4GB?
> Please help.
> Many thanks,
> -DS.
> 

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



Re: [R] memory issues with large data set

2005-09-29 Thread roger bos
memory.limit may not be the correct command. I use the command
'utils::memory.size(3*1024)' to increase my memory size after using
editbin to modify the header of R to make it LARGEADDRESSAWARE, as
described in the above FAQ. I am able to read about 2.7Gb into memory
that way with 4Gb of RAM. Not only am I able to read it into memory, but
I can do regressions on subsets of the data, no problem.
 My question has always been: why can't R ship LARGEADDRESSAWARE for
those users who may not have access to 'editbin'-type tools?
 Thanks,
 Roger
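
A sketch of Roger's recipe, for reference. (He quotes
'utils::memory.size(3*1024)'; the documented setter in R of that era is
memory.limit(size=), used below, so treat the exact call as an
assumption. The editbin step runs in a Windows command prompt with
Microsoft's linker tools installed, not in R.)

## Outside R: mark the executable as large-address aware.
##   editbin /LARGEADDRESSAWARE Rgui.exe

## Inside R: raise the allocation cap toward 3Gb (size is in Mb).
memory.limit(size = 3 * 1024)
memory.limit()                 # confirm the new cap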


 On 9/28/05, Christina Yau <[EMAIL PROTECTED]> wrote:
>
> Hi,
>
> I am running R 2.0.1.1 on Windows. It is a Dell
> Dimension with a 3.2 GHz processor and 4Gb RAM.
>
> When using the ReadAffy() function to read in 97 arrays, I get the below
> error messages:
> Error: cannot allocate vector of size 393529
> Reached total allocation of 1024Mb: see help(memory.size)
>
> When I use the command "memory.limit(size=4000)" to increase the memory
> size to the maximum available, I got a "NULL" as a response.
>
> I proceeded to re-run ReadAffy(). This time, I only get the first error
> message.
> Error: cannot allocate vector of size 393529
>
> From what I've read, this is more of a problem with Windows than with R.
> But I am wondering if there is anything I can do, either with the set up of
> R or Windows, to solve this problem and read the data set into R using this
> machine.
>
> Thank you for your attention,
> Christina
>



Re: [R] memory issues with large data set

2005-09-28 Thread James W. MacDonald
Christina Yau wrote:
> Hi,
> 
> I am running R 2.0.1.1 on Windows.  It is a Dell Dimension with a
> 3.2 GHz processor and 4Gb RAM.

This question concerns a BioC package, so the correct listserv is 
[EMAIL PROTECTED], not the R-help listserv. In the future, 
you should direct questions about BioC packages there.

You don't have enough memory to read all 97 arrays into an AffyBatch, 
not to mention doing any further processing on them. You will have to 
use justRMA() or justGCRMA() to process your data.
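
A sketch of that route: by default justRMA() picks up every CEL file in
the working directory and goes straight to RMA expression measures, so
the full AffyBatch is never held in memory.

library(affy)
eset <- justRMA()    # CEL files -> expression measures directly

## With the gcrma package installed, justGCRMA() is the analogue:
## library(gcrma)
## eset <- justGCRMA()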

In addition, I don't think you can access any more than 2 Gb of RAM 
anyway without making some changes. See Q2.11 of the R for Windows FAQ.

HTH,

Jim


> 
> When using the ReadAffy() function to read in 97 arrays, I get the
> below error messages:
> Error: cannot allocate vector of size 393529
> Reached total allocation of 1024Mb: see help(memory.size)
> 
> When I use the command "memory.limit(size=4000)" to increase the
> memory size to the maximum available, I got a "NULL" as a response.
> 
> I proceeded to re-run ReadAffy().  This time, I only get the first
> error message. Error: cannot allocate vector of size 393529
> 
>> From what I've read, this is more of a problem with Windows than
>> with R.  But I am wondering if there is anything I can do, either
>> with the set up of R or Windows, to solve this problem and read the
>> data set into R using this machine.
> 
> Thank you for your attention, Christina
> 


-- 
James W. MacDonald
University of Michigan
Affymetrix and cDNA Microarray Core
1500 E Medical Center Drive
Ann Arbor MI 48109
734-647-5623






Re: [R] Memory issues..

2003-11-21 Thread Barry Rowlingson
JFRI (Jesper Frickmann) wrote:
I just tried out the 1.8.1 beta build, and it works! It ran through all
17 assays without any problems on Windows 2000.
Thanks to the R development team, they did a great job!

 As I always say at the end of a busking session, "Please, please don't
applaud - just throw money":

http://www.r-project.org/foundation/donations.html

 I'm sure the R development team will be grateful (I'm not part of the 
R dev team, I'm just making a plug for them, and reminding myself I 
ought to get round to joining the Foundation).

Baz



RE: [R] Memory issues..

2003-11-21 Thread JFRI (Jesper Frickmann)
I just tried out the 1.8.1 beta build, and it works! It ran through all
17 assays without any problems on Windows 2000.

Thanks to the R development team, they did a great job!

Kind regards, 
Jesper Frickmann 
Statistician, Quality Control 
Novozymes North America Inc. 
Tel. +1 919 494 3266
Fax +1 919 494 3460


-Original Message-
From: James MacDonald [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, November 12, 2003 1:09 PM
To: JFRI (Jesper Frickman); [EMAIL PROTECTED]; [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Subject: RE: [R] Memory issues..


There was a discussion about memory allocation on the R-devel list this
summer, and apparently somebody has done something about it in R-1.8.1
(according to BDR's earlier post). If you can compile R yourself on
Windows, you could check it out yourself.

Original post http://maths.newcastle.edu.au/~rking/R/devel/03b/0432.html
BDR's  reply http://maths.newcastle.edu.au/~rking/R/devel/03b/0433.html

BDR's recent comment
"Hopefully the memory management in R-devel will ease this, 
and you might like to compile that up and try it."

HTH,

Jim





James W. MacDonald
Affymetrix and cDNA Microarray Core
University of Michigan Cancer Center
1500 E. Medical Center Drive
7410 CCGC
Ann Arbor MI 48109
734-647-5623

>>> Rodrigo Abt <[EMAIL PROTECTED]> 11/12/03 12:08PM >>>
I started R with --max-mem-size=300M and it "seems" to work better (at
least it doesn't hang up my machine), but I don't have any results yet.

P.S.: Are there any differences in memory management from 1.7.x to 1.8.0?

Greetings,
Rodrigo Abt B.,
Statistical Analyst,
Department of Economic and Tributary Studies,
Studies Subdivision,
SII, Chile.

-Original Message-
From: Thomas W Blackwell [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, November 12, 2003 12:43 PM
To: JFRI (Jesper Frickman)
Cc: [EMAIL PROTECTED]; [EMAIL PROTECTED]; [EMAIL PROTECTED]
Subject: RE: [R] Memory issues..


Jesper  -  (off-list)

Jim MacDonald reports seeing different memory-management behavior
between Windows and Linux operating systems on the same, dual boot
machine.  Unfortunately, this is happening at the operating system
level, so the R code cannot do anything about it.  I have cc'ed Jim on
this email, hoping that he will give more details to the entire list.
What operating systems (and versions of R) do you think Rodrigo and
Jesper are using ?

Specifically for Jesper's  AnalyzeAssay() function:  There is some
manipulation you can do using  formula()  or  as.formula()  that will
assign a local object as the environment in which to find values for the
terms in a formula.  (I've never done this, so I can't give you an
example of working code, only references to the help pages for "formula"
and "environment".  It's often very instructive to literally type in the
sequence of statements given as examples at the bottom of each help
page.)  I think this will allow you to avoid assigning to the global
workspace.

Are you sure that the call to  rm() below is actually removing the copy
of limsdata that's in .GlobalEnv, rather than a local copy ? I would
expect you to have to specify  where=1  in order to get the behavior you
want.

-  tom blackwell  -  u michigan medical school  -  ann arbor  -

On Wed, 12 Nov 2003, JFRI (Jesper Frickman) wrote:

> How much processing takes place before you get to the lme call? Maybe R
> has just used up the memory on something else. I think there is a fair
> amount of memory leak, as I get similar problems with my program. I use
> R 1.8.0. My program goes as follows.
>
> 1. Use RODBC to get a data.frame containing assays to analyze (17 assays
> are found).
> 2. Define an AnalyzeAssay(assay, suffix) function to do the following:
>   a) Use RODBC to get data.
>   b) Store dataset "limsdata" in workspace using the <<- operator to
> avoid the following error in qqnorm.lme: Error in eval(expr, envir,
> enclos) : Object "limsdata" not found, when I call it with a grouping
> formula like: ~ resid(.) | ORDCURV.
>   c) Call lme to analyze data.
>   d) Produce some diagnostic plots. Record them by setting record=TRUE
> on the trellis.device
>   e) Save the plots on win.metafile using replayPlot(...)
>   f) Save text to a file using sink(...)
>
> 3. Call the function for each assay using the code:
>
> # Analyze each assay
> for(i in 1:length(assays[,1]))
> {
>   writeLines(paste("Analyzing ", assays$DILUTION[i], " ",
> assays$PROFNO[i], "...", sep=""))
>   flush.console()
>   AnalyzeAssay(assays$DILUTION[i], assays$PROFNO[i])
>
>   # Clean up memory
>   rm(limsdata)
>   gc()
> }
>
> As you can see, I try to remove the dataset stored in workspace and then
> call gc() to clean up my memory as I go.

RE: [R] Memory issues..

2003-11-13 Thread Prof Brian Ripley
On Thu, 13 Nov 2003, JFRI (Jesper Frickman) wrote:

> I tried first to increase --min-vsize to 2G (which I assume means as
> much of the 512M RAM available on my system as possible). The idea was
> to allocate all the heap memory in one huge chunk to avoid
> fragmentation. 

But had you actually read the documentation you would know it did not do 
that.  That needs --max-memory-size set.

> It actually brought the number of assays completed up
> from 11 to 13 before it stopped with the usual error. Then I increased
> --max-memory-size to 2G, and when I came in this morning it was still
> running. However, it would probably take days instead of hours to
> complete the last couple of assays! So it is easier to restart a couple
> of times...
> 
> Do you think that running R on Linux would fix the problem? I use Linux
> on my private home PC, and I might get permission to try it out on the
> company network... If I have a good reason to do so!

We don't know what the problem is, and you haven't AFAICS compiled up 
R-devel and tried that.

> Cheers,
> Jesper
> 
> -Original Message-
> From: Prof Brian Ripley [mailto:[EMAIL PROTECTED] 
> Sent: Wednesday, November 12, 2003 10:55 AM
> To: JFRI (Jesper Frickman)
> Cc: [EMAIL PROTECTED]
> Subject: RE: [R] Memory issues..
> 
> 
> On Wed, 12 Nov 2003, JFRI (Jesper Frickman) wrote:
> 
> > How much processing takes place before you get to the lme call? Maybe 
> > R has just used up the memory on something else. I think there is a 
> > fair amount of memory leak, as I get similar problems with my program.
> 
> > I use
> 
> Windows, right?  I don't think this is memory leak, but rather
> fragmentation.  Hopefully the memory management in R-devel will ease
> this, 
> and you might like to compile that up and try it.
> 
> On R 1.8.0 on Windows you have to be able to find a block of contiguous 
> memory of the needed size, so fragmentation can kill you.  Try
> increasing 
> --max-memory-size unless you are near 2Gb.
> 
> > R 1.8.0. My program goes as follows.
> > 
> > 1. Use RODBC to get a data.frame containing assays to analyze (17 
> > assays are found). 2. Define an AnalyzeAssay(assay, suffix) function 
> > to do the following:
> > a) Use RODBC to get data.
> > b) Store dataset "limsdata" in workspace using the <<- operator to
> > avoid the following error in qqnorm.lme: Error in eval(expr, envir,
> > enclos) : Object "limsdata" not found, when I call it with a grouping 
> > formula like: ~ resid(.) | ORDCURV.
> > c) Call lme to analyze data.
> > d) Produce some diagnostic plots. Record them by setting record=TRUE
> > on the trellis.device
> > e) Save the plots on win.metafile using replayPlot(...)
> > f) Save text to a file using sink(...)
> > 
> > 3. Call the function for each assay using the code:
> > 
> > # Analyze each assay
> > for(i in 1:length(assays[,1]))
> > {
> > writeLines(paste("Analyzing ", assays$DILUTION[i], " ", 
> > assays$PROFNO[i], "...", sep=""))
> > flush.console()
> > AnalyzeAssay(assays$DILUTION[i], assays$PROFNO[i])
> > 
> > # Clean up memory
> > rm(limsdata)
> > gc()
> > }
> > 
> > As you can see, I try to remove the dataset stored in workspace and 
> > then call gc() to clean up my memory as I go.
> > 
> > Nevertheless, when I come to assay 11 out of 17, it stops with a 
> > memory allocation error. I have to quit R, and start again with assay 
> > 11, then it stops again with assay 15 and finally 17. The last assays 
> > have much more data than the first ones, but all assays can be 
> > completed as long as I keep restarting...
> > 
> > Maybe restarting the job can help you getting it done?
> 
> 

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



RE: [R] Memory issues..

2003-11-13 Thread JFRI (Jesper Frickman)
I tried first to increase --min-vsize to 2G (which I assume means as
much of the 512M RAM available on my system as possible). The idea was
to allocate all the heap memory in one huge chunk to avoid
fragmentation. It actually brought the number of assays completed up
from 11 to 13 before it stopped with the usual error. Then I increased
--max-memory-size to 2G, and when I came in this morning it was still
running. However, it would probably take days instead of hours to
complete the last couple of assays! So it is easier to restart a couple
of times...

Do you think that running R on Linux would fix the problem? I use Linux
on my private home PC, and I might get permission to try it out on the
company network... If I have a good reason to do so!

Cheers,
Jesper

-Original Message-
From: Prof Brian Ripley [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, November 12, 2003 10:55 AM
To: JFRI (Jesper Frickman)
Cc: [EMAIL PROTECTED]
Subject: RE: [R] Memory issues..


On Wed, 12 Nov 2003, JFRI (Jesper Frickman) wrote:

> How much processing takes place before you get to the lme call? Maybe 
> R has just used up the memory on something else. I think there is a 
> fair amount of memory leak, as I get similar problems with my program.

> I use

Windows, right?  I don't think this is memory leak, but rather
fragmentation.  Hopefully the memory management in R-devel will ease
this, 
and you might like to compile that up and try it.

On R 1.8.0 on Windows you have to be able to find a block of contiguous 
memory of the needed size, so fragmentation can kill you.  Try
increasing 
--max-memory-size unless you are near 2Gb.

> R 1.8.0. My program goes as follows.
> 
> 1. Use RODBC to get a data.frame containing assays to analyze (17 
> assays are found). 2. Define an AnalyzeAssay(assay, suffix) function 
> to do the following:
>   a) Use RODBC to get data.
>   b) Store dataset "limsdata" in workspace using the <<- operator to
> avoid the following error in qqnorm.lme: Error in eval(expr, envir,
> enclos) : Object "limsdata" not found, when I call it with a grouping 
> formula like: ~ resid(.) | ORDCURV.
>   c) Call lme to analyze data.
>   d) Produce some diagnostic plots. Record them by setting record=TRUE
> on the trellis.device
>   e) Save the plots on win.metafile using replayPlot(...)
>   f) Save text to a file using sink(...)
> 
> 3. Call the function for each assay using the code:
> 
> # Analyze each assay
> for(i in 1:length(assays[,1]))
> {
>   writeLines(paste("Analyzing ", assays$DILUTION[i], " ", 
> assays$PROFNO[i], "...", sep=""))
>   flush.console()
>   AnalyzeAssay(assays$DILUTION[i], assays$PROFNO[i])
> 
>   # Clean up memory
>   rm(limsdata)
>   gc()
> }
> 
> As you can see, I try to remove the dataset stored in workspace and 
> then call gc() to clean up my memory as I go.
> 
> Nevertheless, when I come to assay 11 out of 17, it stops with a 
> memory allocation error. I have to quit R, and start again with assay 
> 11, then it stops again with assay 15 and finally 17. The last assays 
> have much more data than the first ones, but all assays can be 
> completed as long as I keep restarting...
> 
> Maybe restarting the job can help you getting it done?

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



RE: [R] Memory issues..

2003-11-12 Thread JFRI (Jesper Frickman)
I have just tried listing limsdata from the workspace and it is indeed
gone from .GlobalEnv. I also tried passing the environment to the
as.formula function, but it still doesn't work.

Kind regards, 
Jesper Frickmann 
Statistician, Quality Control 
Novozymes North America Inc. 
Tel. +1 919 494 3266
Fax +1 919 494 3460


-Original Message-
From: Thomas W Blackwell [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, November 12, 2003 10:43 AM
To: JFRI (Jesper Frickman)
Cc: [EMAIL PROTECTED]; [EMAIL PROTECTED]; [EMAIL PROTECTED]
Subject: RE: [R] Memory issues..


Jesper  -  (off-list)

Jim MacDonald reports seeing different memory-management behavior
between Windows and Linux operating systems on the same, dual boot
machine.  Unfortunately, this is happening at the operating system
level, so the R code cannot do anything about it.  I have cc'ed Jim on
this email, hoping that he will give more details to the entire list.
What operating systems (and versions of R) do you think Rodrigo and
Jesper are using ?

Specifically for Jesper's  AnalyzeAssay() function:  There is some
manipulation you can do using  formula()  or  as.formula()  that will
assign a local object as the environment in which to find values for the
terms in a formula.  (I've never done this, so I can't give you an
example of working code, only references to the help pages for "formula"
and "environment".  It's often very instructive to literally type in the
sequence of statements given as examples at the bottom of each help
page.)  I think this will allow you to avoid assigning to the global
workspace.

Are you sure that the call to  rm() below is actually removing the copy
of limsdata that's in .GlobalEnv, rather than a local copy ? I would
expect you to have to specify  where=1  in order to get the behavior you
want.

-  tom blackwell  -  u michigan medical school  -  ann arbor  -

On Wed, 12 Nov 2003, JFRI (Jesper Frickman) wrote:

> How much processing takes place before you get to the lme call? Maybe 
> R has just used up the memory on something else. I think there is a 
> fair amount of memory leak, as I get similar problems with my program.

> I use R 1.8.0. My program goes as follows.
>
> 1. Use RODBC to get a data.frame containing assays to analyze (17 
> assays are found). 2. Define an AnalyzeAssay(assay, suffix) function 
> to do the following:
>   a) Use RODBC to get data.
>   b) Store dataset "limsdata" in workspace using the <<- operator to
> avoid the following error in qqnorm.lme: Error in eval(expr, envir,
> enclos) : Object "limsdata" not found, when I call it with a grouping 
> formula like: ~ resid(.) | ORDCURV.
>   c) Call lme to analyze data.
>   d) Produce some diagnostic plots. Record them by setting record=TRUE
> on the trellis.device
>   e) Save the plots on win.metafile using replayPlot(...)
>   f) Save text to a file using sink(...)
>
> 3. Call the function for each assay using the code:
>
> # Analyze each assay
> for(i in 1:length(assays[,1]))
> {
>   writeLines(paste("Analyzing ", assays$DILUTION[i], " ", 
> assays$PROFNO[i], "...", sep=""))
>   flush.console()
>   AnalyzeAssay(assays$DILUTION[i], assays$PROFNO[i])
>
>   # Clean up memory
>   rm(limsdata)
>   gc()
> }
>
> As you can see, I try to remove the dataset stored in workspace and 
> then call gc() to clean up my memory as I go.
>
> Nevertheless, when I come to assay 11 out of 17, it stops with a 
> memory allocation error. I have to quit R, and start again with assay 
> 11, then it stops again with assay 15 and finally 17. The last assays 
> have much more data than the first ones, but all assays can be 
> completed as long as I keep restarting...
>
> Maybe restarting the job can help you getting it done?
>
> Cheers,
> Jesper
>
> -Original Message-
> From: Rodrigo Abt [mailto:[EMAIL PROTECTED]
> Sent: Monday, November 10, 2003 11:02 AM
> To: [EMAIL PROTECTED]
> Subject: [R] Memory issues..
>
>
> Hi dear R-listers, I'm trying to fit a 3-level model using lme in R. 
> My sample size is about 2965 and 3 factors:
>
> year (5 levels), ssize (4 levels), condition (2 levels).
>
> When I issue the following command:
>
> > lme(var~year*ssize*condition,random=~ssize+condition|subject,data=smp,
> >     method="ML")
>
> I got the following error:
>
> Error in logLik.lmeStructInt(lmeSt, lmePars) :
> Calloc could not allocate (65230 of 8) memory
> In addition: Warning message:
> Reached total allocation of 120Mb: see help(memory.size)
>
> I'm currently using a Win2000 machine with 128Mb RAM and a 1.2 GHz
> processor. My version of R is 1.7.1.

RE: [R] Memory issues..

2003-11-12 Thread Prof Brian Ripley
On Wed, 12 Nov 2003, JFRI (Jesper Frickman) wrote:

> How much processing takes place before you get to the lme call? Maybe R
> has just used up the memory on something else. I think there is a fair
> amount of memory leak, as I get similar problems with my program. I use

Windows, right?  I don't think this is memory leak, but rather
fragmentation.  Hopefully the memory management in R-devel will ease this, 
and you might like to compile that up and try it.

On R 1.8.0 on Windows you have to be able to find a block of contiguous 
memory of the needed size, so fragmentation can kill you.  Try increasing 
--max-memory-size unless you are near 2Gb.
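
Concretely (the Windows ports spell the flag --max-mem-size; 1800M is
only an illustrative size):

## From a Windows shortcut or command prompt, start R with a larger arena:
##   Rgui.exe --max-mem-size=1800M

## Inside R, watch how close the job runs to the ceiling:
gc()                      # force a collection and report usage
memory.size(max = TRUE)   # peak Mb obtained from Windows so far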

> R 1.8.0. My program goes as follows.
> 
> 1. Use RODBC to get a data.frame containing assays to analyze (17 assays
> are found).
> 2. Define an AnalyzeAssay(assay, suffix) function to do the following:
>   a) Use RODBC to get data.
>   b) Store dataset "limsdata" in workspace using the <<- operator
> to avoid the following error in qqnorm.lme: Error in eval(expr, envir,
> enclos) : Object "limsdata" not found, when I call it with a grouping
> formula like: ~ resid(.) | ORDCURV.
>   c) Call lme to analyze data.
>   d) Produce some diagnostic plots. Record them by setting
> record=TRUE on the trellis.device
>   e) Save the plots on win.metafile using replayPlot(...)
>   f) Save text to a file using sink(...)
> 
> 3. Call the function for each assay using the code:
> 
> # Analyze each assay
> for(i in 1:length(assays[,1]))
> {
>   writeLines(paste("Analyzing ", assays$DILUTION[i], " ",
> assays$PROFNO[i], "...", sep=""))
>   flush.console()
>   AnalyzeAssay(assays$DILUTION[i], assays$PROFNO[i])
> 
>   # Clean up memory
>   rm(limsdata)
>   gc()
> }
> 
> As you can see, I try to remove the dataset stored in workspace and then
> call gc() to clean up my memory as I go.
> 
> Nevertheless, when I come to assay 11 out of 17, it stops with a memory
> allocation error. I have to quit R, and start again with assay 11, then
> it stops again with assay 15 and finally 17. The last assays have much
> more data than the first ones, but all assays can be completed as long
> as I keep restarting...
> 
> Maybe restarting the job can help you getting it done?

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



RE: [R] Memory issues..

2003-11-12 Thread JFRI (Jesper Frickman)
I am using Windows 2000.

Kind regards, 
Jesper Frickmann 
Statistician, Quality Control 
Novozymes North America Inc. 
Tel. +1 919 494 3266
Fax +1 919 494 3460

-Original Message-
From: Thomas W Blackwell [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, November 12, 2003 10:43 AM
To: JFRI (Jesper Frickman)
Cc: [EMAIL PROTECTED]; [EMAIL PROTECTED]; [EMAIL PROTECTED]
Subject: RE: [R] Memory issues..


Jesper  -  (off-list)

Jim MacDonald reports seeing different memory-management behavior
between Windows and Linux operating systems on the same, dual boot
machine.  Unfortunately, this is happening at the operating system
level, so the R code cannot do anything about it.  I have cc'ed Jim on
this email, hoping that he will give more details to the entire list.
What operating systems (and versions of R) do you think Rodrigo and
Jesper are using ?

Specifically for Jesper's  AnalyzeAssay() function:  There is some
manipulation you can do using  formula()  or  as.formula()  that will
assign a local object as the environment in which to find values for the
terms in a formula.  (I've never done this, so I can't give you an
example of working code, only references to the help pages for "formula"
and "environment".  It's often very instructive to literally type in the
sequence of statements given as examples at the bottom of each help
page.)  I think this will allow you to avoid assigning to the global
workspace.

Are you sure that the call to  rm() below is actually removing the copy
of limsdata that's in .GlobalEnv, rather than a local copy ? I would
expect you to have to specify  where=1  in order to get the behavior you
want.

-  tom blackwell  -  u michigan medical school  -  ann arbor  -

On Wed, 12 Nov 2003, JFRI (Jesper Frickman) wrote:

> How much processing takes place before you get to the lme call? Maybe 
> R has just used up the memory on something else. I think there is a 
> fair amount of memory leak, as I get similar problems with my program.

> I use R 1.8.0. My program goes as follows.
>
> 1. Use RODBC to get a data.frame containing assays to analyze (17 
> assays are found). 2. Define an AnalyzeAssay(assay, suffix) function 
> to do the following:
>   a) Use RODBC to get data.
>   b) Store dataset "limsdata" in workspace using the <<- operator
to 
> avoid the following error in qqnorm.lme: Error in eval(expr, envir,
> enclos) : Object "limsdata" not found, when I call it with a grouping 
> formula like: ~ resid(.) | ORDCURV.
>   c) Call lme to analyze data.
>   d) Produce some diagnostic plots. Record them by setting record=TRUE
> on the trellis.device
>   e) Save the plots on win.metafile using replayPlot(...)
>   f) Save text to a file using sink(...)
>
> 3. Call the function for each assay using the code:
>
> # Analyze each assay
> for(i in 1:length(assays[,1]))
> {
>   writeLines(paste("Analyzing ", assays$DILUTION[i], " ", 
> assays$PROFNO[i], "...", sep=""))
>   flush.console()
>   AnalyzeAssay(assays$DILUTION[i], assays$PROFNO[i])
>
>   # Clean up memory
>   rm(limsdata)
>   gc()
> }
>
> As you can see, I try to remove the dataset stored in workspace and 
> then call gc() to clean up my memory as I go.
>
> Nevertheless, when I come to assay 11 out of 17, it stops with a 
> memory allocation error. I have to quit R, and start again with assay 
> 11, then it stops again with assay 15 and finally 17. The last assays 
> have much more data than the first ones, but all assays can be 
> completed as long as I keep restarting...
>
> Maybe restarting the job can help you getting it done?
>
> Cheers,
> Jesper
>
> -Original Message-
> From: Rodrigo Abt [mailto:[EMAIL PROTECTED]
> Sent: Monday, November 10, 2003 11:02 AM
> To: [EMAIL PROTECTED]
> Subject: [R] Memory issues..
>
>
> Hi dear R-listers, I'm trying to fit a 3-level model using lme in R. 
> My sample size is about 2965 and 3 factors:
>
> year (5 levels), ssize (4 levels), condition (2 levels).
>
> When I issue the following command:
>
> > lme(var~year*ssize*condition,random=~ssize+condition|subject,data=smp,
> >     method="ML")
>
> I got the following error:
>
> Error in logLik.lmeStructInt(lmeSt, lmePars) :
> Calloc could not allocate (65230 of 8) memory
> In addition: Warning message:
> Reached total allocation of 120Mb: see help(memory.size)
>
> I'm currently using a Win2000 machine with 128Mb RAM and a 1.2 GHz
> processor. My version of R is 1.7.1.
>
> Thanks in advance,
>
> Rodrigo Abt.
> Department of Economic and Tributary Studies,
> SII, Chile.
>



RE: [R] Memory issues..

2003-11-12 Thread Thomas W Blackwell
Jesper  -  (off-list)

Jim MacDonald reports seeing different memory-management behavior
between Windows and Linux operating systems on the same, dual boot
machine.  Unfortunately, this is happening at the operating system
level, so the R code cannot do anything about it.  I have cc'ed
Jim on this email, hoping that he will give more details to the
entire list.  What operating systems (and versions of R) do you
think Rodrigo and Jesper are using ?

Specifically for Jesper's  AnalyzeAssay() function:  There is some
manipulation you can do using  formula()  or  as.formula()  that will
assign a local object as the environment in which to find values for
the terms in a formula.  (I've never done this, so I can't give you
an example of working code, only references to the help pages for
"formula" and "environment".  It's often very instructive to literally
type in the sequence of statements given as examples at the bottom
of each help page.)  I think this will allow you to avoid assigning
to the global workspace.

Are you sure that the call to  rm() below is actually removing the
copy of limsdata that's in .GlobalEnv, rather than a local copy ?
I would expect you to have to specify  where=1  in order to get the
behavior you want.
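
An untested sketch of both suggestions, for concreteness. Only limsdata,
the grouping variable ORDCURV and the <<- problem come from Jesper's
description; the model formula is invented for illustration, and Jesper
reports elsewhere in this thread that the formula trick did not
immediately work for him.

library(nlme)

AnalyzeAssay <- function(assay, suffix, limsdata) {
  ## Keep the data as a local object instead of assigning it with <<- ...
  fit <- lme(activity ~ dilution, data = limsdata,   # model illustrative
             random = ~ 1 | ORDCURV)

  ## ... and make this frame the grouping formula's environment, so that
  ## qqnorm.lme evaluates resid(.) and ORDCURV here, not in .GlobalEnv:
  grp <- ~ resid(.) | ORDCURV
  environment(grp) <- environment()
  print(qqnorm(fit, grp))

  invisible(fit)
}

## In the calling loop: R's rm() takes pos/envir rather than S-PLUS's
## where, so to be certain the global copy is the one removed:
## rm(limsdata, envir = .GlobalEnv)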

-  tom blackwell  -  u michigan medical school  -  ann arbor  -

On Wed, 12 Nov 2003, JFRI (Jesper Frickman) wrote:

> How much processing takes place before you get to the lme call? Maybe R
> has just used up the memory on something else. I think there is a fair
> amount of memory leak, as I get similar problems with my program. I use
> R 1.8.0. My program goes as follows.
>
> 1. Use RODBC to get a data.frame containing assays to analyze (17 assays
> are found).
> 2. Define an AnalyzeAssay(assay, suffix) function to do the following:
>   a) Use RODBC to get data.
>   b) Store dataset "limsdata" in workspace using the <<- operator
> to avoid the following error in qqnorm.lme: Error in eval(expr, envir,
> enclos) : Object "limsdata" not found, when I call it with a grouping
> formula like: ~ resid(.) | ORDCURV.
>   c) Call lme to analyze data.
>   d) Produce some diagnostic plots. Record them by setting
> record=TRUE on the trellis.device
>   e) Save the plots on win.metafile using replayPlot(...)
>   f) Save text to a file using sink(...)
>
> 3. Call the function for each assay using the code:
>
> # Analyze each assay
> for(i in 1:length(assays[,1]))
> {
>   writeLines(paste("Analyzing ", assays$DILUTION[i], " ",
> assays$PROFNO[i], "...", sep=""))
>   flush.console()
>   AnalyzeAssay(assays$DILUTION[i], assays$PROFNO[i])
>
>   # Clean up memory
>   rm(limsdata)
>   gc()
> }
>
> As you can see, I try to remove the dataset stored in workspace and then
> call gc() to clean up my memory as I go.
>
> Nevertheless, when I come to assay 11 out of 17, it stops with a memory
> allocation error. I have to quit R, and start again with assay 11, then
> it stops again with assay 15 and finally 17. The last assays have much
> more data than the first ones, but all assays can be completed as long
> as I keep restarting...
>
> Maybe restarting the job can help you getting it done?
>
> Cheers,
> Jesper
>
> -Original Message-
> From: Rodrigo Abt [mailto:[EMAIL PROTECTED]
> Sent: Monday, November 10, 2003 11:02 AM
> To: [EMAIL PROTECTED]
> Subject: [R] Memory issues..
>
>
> Hi dear R-listers, I'm trying to fit a 3-level model using lme in R. My
> sample size is about 2965 and 3 factors:
>
> year (5 levels), ssize (4 levels), condition (2 levels).
>
> When I issue the following command:
>
> > lme(var~year*ssize*condition,random=~ssize+condition|subject,data=smp,
> >     method="ML")
>
> I got the following error:
>
> Error in logLik.lmeStructInt(lmeSt, lmePars) :
> Calloc could not allocate (65230 of 8) memory
> In addition: Warning message:
> Reached total allocation of 120Mb: see help(memory.size)
>
> I'm currently using a Win2000 machine with 128Mb RAM and a 1.2 GHz
> processor. My version of R is 1.7.1.
>
> Thanks in advance,
>
> Rodrigo Abt.
> Department of Economic and Tributary Studies,
> SII, Chile.
>



RE: [R] Memory issues..

2003-11-12 Thread JFRI (Jesper Frickman)
How much processing takes place before you get to the lme call? Maybe R
has just used up the memory on something else. I think there is a fair
amount of memory leak, as I get similar problems with my program. I use
R 1.8.0. My program goes as follows.

1. Use RODBC to get a data.frame containing assays to analyze (17 assays
are found).
2. Define an AnalyzeAssay(assay, suffix) function to do the following:
a) Use RODBC to get data.
b) Store dataset "limsdata" in workspace using the <<- operator
to avoid the following error in qqnorm.lme: Error in eval(expr, envir,
enclos) : Object "limsdata" not found, when I call it with a grouping
formula like: ~ resid(.) | ORDCURV.
c) Call lme to analyze data.
d) Produce some diagnostic plots. Record them by setting
record=TRUE on the trellis.device
e) Save the plots on win.metafile using replayPlot(...)
f) Save text to a file using sink(...)

3. Call the function for each assay using the code:

# Analyze each assay
for(i in 1:length(assays[,1]))
{
writeLines(paste("Analyzing ", assays$DILUTION[i], " ",
assays$PROFNO[i], "...", sep=""))
flush.console()
AnalyzeAssay(assays$DILUTION[i], assays$PROFNO[i])

# Clean up memory
rm(limsdata)
gc()
}

As you can see, I try to remove the dataset stored in workspace and then
call gc() to clean up my memory as I go.

Nevertheless, when I come to assay 11 out of 17, it stops with a memory
allocation error. I have to quit R, and start again with assay 11, then
it stops again with assay 15 and finally 17. The last assays have much
more data than the first ones, but all assays can be completed as long
as I keep restarting...

Maybe restarting the job can help you getting it done?

Cheers,
Jesper

-Original Message-
From: Rodrigo Abt [mailto:[EMAIL PROTECTED] 
Sent: Monday, November 10, 2003 11:02 AM
To: [EMAIL PROTECTED]
Subject: [R] Memory issues..


Hi dear R-listers, I'm trying to fit a 3-level model using lme in R. My
sample size is about 2965 and 3 factors:

year (5 levels), ssize (4 levels), condition (2 levels).

When I issue the following command:

> lme(var~year*ssize*condition,random=~ssize+condition|subject,data=smp,
>     method="ML")

I got the following error:

Error in logLik.lmeStructInt(lmeSt, lmePars) :
Calloc could not allocate (65230 of 8) memory
In addition: Warning message:
Reached total allocation of 120Mb: see help(memory.size)

I'm currently using a Win2000 machine with 128Mb RAM and a 1.2 GHz
processor. My version of R is 1.7.1.

Thanks in advance,

Rodrigo Abt.
Department of Economic and Tributary Studies,
SII, Chile.



Re: [R] Memory issues..

2003-11-10 Thread kjetil
On 10 Nov 2003 at 13:01, Rodrigo Abt wrote:

See ?Memory for how you can get R to use virtual memory.

Kjetil Halvorsen
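
The same help page documents the command-line equivalents; a sketch, with
sizes picked only for illustration (flag spellings as documented for the
1.x Windows ports):

## From a Windows shortcut or command prompt:
##   Rgui.exe --max-mem-size=1G --min-vsize=10M --max-vsize=800M

memory.limit()   # inside R, reports the cap (in Mb) set above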


> Hi dear R-listers, I'm trying to fit a 3-level model using lme in R. My
> sample size is about 2965 and 3 factors:
> 
> year (5 levels), ssize (4 levels), condition (2 levels).
> 
> When I issue the following command:
> 
> > lme(var~year*ssize*condition,random=~ssize+condition|subject,data=smp,
> >     method="ML")
> 
> I got the following error:
> 
> Error in logLik.lmeStructInt(lmeSt, lmePars) :
> Calloc could not allocate (65230 of 8) memory
> In addition: Warning message:
> Reached total allocation of 120Mb: see help(memory.size)
> 
> I'm currently using a Win2000 machine with 128Mb RAM and a 1.2 GHz processor.
> My version of R is 1.7.1.
> 
> Thanks in advance,
> 
> Rodrigo Abt.
> Department of Economic and Tributary Studies,
> SII, Chile.
> 



Re: [R] Memory issues..

2003-11-10 Thread Prof Brian Ripley
Have you done what the message said?

On Mon, 10 Nov 2003, Rodrigo Abt wrote:

> Hi dear R-listers, I'm trying to fit a 3-level model using lme in R. My
> sample size is about 2965 and 3 factors:
> 
> year (5 levels), ssize (4 levels), condition (2 levels).
> 
> When I issue the following command:
> 
> > lme(var~year*ssize*condition,random=~ssize+condition|subject,data=smp,
> >     method="ML")
> 
> I got the following error:
> 
> Error in logLik.lmeStructInt(lmeSt, lmePars) :
> Calloc could not allocate (65230 of 8) memory
> In addition: Warning message:
> Reached total allocation of 120Mb: see help(memory.size)
> 
> I'm currently using a Win2000 machine with 128Mb RAM and a 1.2 GHz processor.
> My version of R is 1.7.1.

You probably need more memory, but you could try following the advice in 
the help page pointed to.  If you increase the memory allocation R will 
continue to run, albeit slowly.

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



Re: [R] Memory issues..

2003-11-10 Thread Roger D. Peng
The error says you don't have enough memory on your computer.   
Unfortunately, the only solution may be to buy more.

-roger

Rodrigo Abt wrote:

Hi dear R-listers, I'm trying to fit a 3-level model using lme in R. My
sample size is about 2965 and 3 factors:
year (5 levels), ssize (4 levels), condition (2 levels).

When I issue the following command:

 

lme(var~year*ssize*condition,random=~ssize+condition|subject,data=smp,
    method="ML")
I got the following error:

Error in logLik.lmeStructInt(lmeSt, lmePars) :
   Calloc could not allocate (65230 of 8) memory
In addition: Warning message:
Reached total allocation of 120Mb: see help(memory.size)
I'm currently using a Win2000 machine with 128Mb RAM and a 1.2 GHz processor.
My version of R is 1.7.1.
Thanks in advance,

Rodrigo Abt.
Department of Economic and Tributary Studies,
SII, Chile.
