Re: [R] Running out of memory when using lapply

2006-08-11 Thread Seth Falcon
Hi Kamila,

Kamila Naxerova <[EMAIL PROTECTED]> writes:

> Hi all!
>
> I'm afraid I programmed something totally nonsensical and inefficient,
> but I can't figure out how to do it better.
>
> I have a list of ~40,000 character strings. I want to take one element
> at a time, map it to a large data frame with
> hit=which(data.frame$column==elementFromList), then compute some
> statistic on data.frame[hit,] and return a result that consists of
> either 1) a vector of integers or 2) a character string.
>
> res=lapply(listof4,myfunction,dataframeToSearchIn)
>
> On a small scale, this works and returns something like
>
> str(res)
> [[1]]
> [1] UNIQUE
> [[2]]
> [1]   405   406   407 16351
> [[3]]
> [1] REMOVE
> [[4]]
> [1] REMOVE
>
> If I try this with the entire 40,000-element list, though, I get the
> "Reached total allocation of 1022Mb: see help(memory.size)" error message.
>
> Can someone please give me a hint on how to solve this problem correctly?
> THANKS!

One thing you might try is not running the entire 40K list at once.
Perhaps break it into four 10K lists, run each, and combine the
results.  This may get you around the allocation problem.
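
Something along these lines (untested sketch; 'biglist' stands for your
full 40K list, the other names are the ones from your example):

  ## split the full list into chunks of 10,000 elements
  chunks <- split(biglist, ceiling(seq_along(biglist) / 10000))
  res <- list()
  for (ch in chunks) {
      res <- c(res, lapply(ch, myfunction, dataframeToSearchIn))
      gc()   # give R a chance to reclaim memory between chunks
  }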

Another option is to find a system with more RAM.  Also read the R for
Windows FAQ on ways to make the most RAM available to R on Windows (is
that the platform you are on?).
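
If it is Windows, you can check and raise the cap from within R (these
functions are Windows-only; see the FAQ for what limits apply):

  memory.size(max = TRUE)    # most memory R has obtained from the OS so far, in Mb
  memory.limit()             # current allocation cap, in Mb
  memory.limit(size = 2000)  # raise the cap, within what the FAQ allows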

+ seth

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Running out of memory when using lapply

2006-08-07 Thread Kamila Naxerova
Hi all!

I'm afraid I programmed something totally nonsensical and inefficient,
but I can't figure out how to do it better.

I have a list of ~40,000 character strings. I want to take one element
at a time, map it to a large data frame with
hit=which(data.frame$column==elementFromList), then compute some
statistic on data.frame[hit,] and return a result that consists of
either 1) a vector of integers or 2) a character string.
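
Schematically, the function is something like this (simplified; the real
statistic is omitted):

  ## simplified outline: look up one element in the big data frame,
  ## compute some statistic on the matching rows, and return either
  ## the matching row numbers or a flag such as "UNIQUE" / "REMOVE"
  myfunction <- function(elementFromList, df) {
      hit <- which(df$column == elementFromList)
      ## ... the statistic on df[hit, ] decides what to return ...
      hit
  }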

res=lapply(listof4,myfunction,dataframeToSearchIn)

On a small scale, this works and returns something like

str(res)
[[1]]
[1] UNIQUE
[[2]]
[1]   405   406   407 16351
[[3]]
[1] REMOVE
[[4]]
[1] REMOVE

If I try this with the entire 40,000-element list, though, I get the
"Reached total allocation of 1022Mb: see help(memory.size)" error message.

Can someone please give me a hint on how to solve this problem correctly?
THANKS!

Kamila



Re: [R] running out of memory while running a VERY LARGE regression

2005-11-23 Thread Prof Brian Ripley
On Tue, 22 Nov 2005, t c wrote:

> I am running a VERY LARGE regression (many factors, many rows of data)
> using lm().
>
> I think I have my memory set as high as possible.  (I ran
> memory.limit(size = 4000).)
>
> Is there anything I can do?  (FYI, I think I have removed all data I
> am not using, and I think only the data needed for the regression is
> loaded.)  Thanks.

Since you mention memory.limit, I guess you are using Windows without
telling us.  If so, have you set up the /3GB switch (see the rw-FAQ Q2.9)
and modified the R executables?  (The modification is not necessary if
you use the current R-patched available from CRAN.)

You will be able to save memory by using lm.fit rather than lm, perhaps 
running a session containing just the model matrix and the response.
(Unless of course you run out of memory forming the model matrix.)
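
Roughly like this (untested; the formula and the names mydata/response
are placeholders for your own data):

  ## build just the model matrix and the response, drop everything else,
  ## then fit with lm.fit() instead of lm()
  X <- model.matrix(~ f1 + f2 + x1, data = mydata)
  y <- mydata$response
  rm(mydata); gc()
  fit <- lm.fit(X, y)
  fit$coefficients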

The best answer is to use a 64-bit OS and a 64-bit build of R.

> [[alternative HTML version deleted]]
>
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html

Please do as it asks: tell us your OS, do not send HTML mail, and
report the exact problem with the error messages.

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK            Fax:  +44 1865 272595



[R] running out of memory while running a VERY LARGE regression

2005-11-22 Thread t c
I am running a VERY LARGE regression (many factors, many rows of data) using lm().

I think I have my memory set as high as possible.  (I ran memory.limit(size = 4000).)

Is there anything I can do?  (FYI, I think I have removed all data I am
not using, and I think only the data needed for the regression is
loaded.)  Thanks.



[[alternative HTML version deleted]]



Re: [R] running out of memory

2005-02-17 Thread Thomas Schönhoff
Hello,

On Wednesday, 16 February 2005 at 20:48, Stephen Choularton wrote:
> Hi
>
> I am trying to do a large glm and running into this message.
>
> Error: cannot allocate vector of size 3725426 Kb
> In addition: Warning message:
> Reached total allocation of 494Mb: see help(memory.size)
>
> Am I simply out of memory (I only have 0.5 GB)?
>
> Is there something I can do?

This question has been answered a hundred times on this list.  The best
idea is to search the list's mail archive for "memory" and "datasets",
which gives you 246 hits!
Please give some more information on what system you are using (32- or
64-bit), etc.

Regards
Thomas



[R] running out of memory

2005-02-16 Thread Stephen Choularton
Hi
 
I am trying to do a large glm and running into this message.  
 
Error: cannot allocate vector of size 3725426 Kb
In addition: Warning message: 
Reached total allocation of 494Mb: see help(memory.size)
 
Am I simply out of memory (I only have 0.5 GB)?
 
Is there something I can do?
 
Stephen

[[alternative HTML version deleted]]



Re: [R] running out of memory

2005-02-16 Thread Uwe Ligges
Stephen Choularton wrote:
> Hi
>
> I am trying to do a large glm and running into this message.
>
> Error: cannot allocate vector of size 3725426 Kb
> In addition: Warning message:
> Reached total allocation of 494Mb: see help(memory.size)
>
> Am I simply out of memory (I only have 0.5 GB)?
>
> Is there something I can do?

You have to rethink whether the analysis you are doing is sensible this
way, or whether you can respecify things.  R claims to need almost
4Gb(!) for the next memory allocation step alone, so you will get into
trouble even on huge machines.
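
For scale, the requested allocation in Gb:

  3725426 / 1024^2   # the single requested vector is about 3.55 Gb,
                     # versus the 494 Mb currently available to R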

 Uwe Ligges
