[R] Memory allocation problem

2008-08-12 Thread Jamie Ledingham
Dear R users,
I am running a large loop over about 400 files.  In outline, the code
reads in the initial data file, uses lookup text files to obtain more
information, then connects to a SQL database using RODBC and extracts
more data.  Finally, all of this is polar plotted.
My problem is that when the loop gets through 170-odd files it gives the
error message:

"Calloc could not allocate (263168 of 1) memory"

I have already increased the memory limit to the maximum using
memory.limit.  I strongly suspect that R is holding data temporarily and
that this accumulates until it becomes too much to handle by the time the
loop reaches file 170.  Has anyone encountered this problem before?  Is it
possible to 'wipe' R's memory at the end of each iteration?  All results
are plotted and saved or written to a text file at the end of each
iteration, so this may be the ideal solution.
Thanks
Jamie Ledingham
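
A minimal sketch of the kind of per-iteration cleanup being asked about.
The file pattern, DSN name, query, and plotting step are hypothetical
placeholders; odbcConnect(), sqlQuery(), and odbcClose() are the standard
RODBC calls.  Closing the ODBC channel on every pass matters: a channel
left open in each of 400 iterations is one common way long RODBC loops
run out of memory.

library(RODBC)

files <- list.files("data", pattern = "\\.txt$", full.names = TRUE)
for (f in files) {
  dat    <- read.table(f, header = TRUE)             # initial data file
  lookup <- read.table("lookup.txt", header = TRUE)  # lookup text file

  ch    <- odbcConnect("mydsn")                      # hypothetical DSN
  extra <- sqlQuery(ch, "SELECT * FROM sometable")   # hypothetical query
  odbcClose(ch)                                      # close the channel each pass

  png(file.path("plots", paste(basename(f), ".png", sep = "")))
  plot(dat[[1]], dat[[2]])                           # stand-in for the polar plot
  dev.off()

  rm(dat, lookup, extra)   # drop this iteration's objects
  gc()                     # and trigger a collection before the next file
}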



Re: [R] Memory allocation problem

2008-08-12 Thread Kerpel, John
See ?gc - it may help.
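
And a minimal sketch of using gc() to watch whether memory really does
accumulate across iterations ('files' is assumed from the original post;
gc() both forces a collection and returns a usage report):

for (i in seq_along(files)) {
  ## ... the per-file work ...
  mem <- gc()                                  # collect, then report
  cat(i, "Vcells used (Mb):", mem["Vcells", "(Mb)"], "\n")
}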




Re: [R] Memory allocation problem

2008-08-12 Thread Roland Rau

Jamie Ledingham wrote:

> becomes too much to handle by the time the loop reaches file 170.  Has
> anyone encountered this problem before?  Is it possible to 'wipe' R's
> memory at the end of each iteration?  All results are plotted and saved
> or written to a text file at the end of each iteration, so this may be
> the ideal solution.


Besides using gc() (see the email by John Kerpel), you might also consider
removing all objects:

rm(list=ls())

I hope this helps,
Roland
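
One caveat, offered as a hedged aside: rm(list=ls()) at the end of an
iteration also deletes objects the loop itself still needs, such as the
vector of file names.  A sketch of a more selective wipe (the kept names
are hypothetical):

keep <- c("files", "f")            # objects the next iteration still needs
rm(list = setdiff(ls(), keep))     # drop everything else
gc()                               # then trigger a collection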



[R] Memory allocation problem (during kmeans)

2008-09-08 Thread rami batal
Dear all,

I am trying to apply k-means clustering to a data file (about 300 Mb in
size).

I read this file using

x <- read.table('file path', sep = " ")

and then run

kmeans(x, 25)

but the process stops after two minutes with an error:

Error: cannot allocate vector of size 907.3 Mb

When I read the archives I noticed that the recommended solution is to use
a 64-bit OS:

"Error messages beginning cannot allocate vector of size indicate a failure
to obtain memory, either because the size exceeded the address-space limit
for a process or, more likely, because the system was unable to provide the
memory. Note that on a 32-bit OS there may well be enough free memory
available, but not a large enough contiguous block of address space into
which to map it."

The problem is that I have two machines with two OSes (32-bit and 64-bit),
and when I use the 64-bit OS the same error remains.

Thank you for any suggestions, and excuse me because I am a newbie.

Here is the session information for the 64-bit OS:

> sessionInfo()
R version 2.7.1 (2008-06-23)
x86_64-redhat-linux-gnu

> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 137955  7.4     350000 18.7   350000 18.7
Vcells 141455  1.1     786432  6.0   601347  4.6

I also tried starting R with the options that control the available
memory, and the result is still the same; or maybe I am not assigning the
correct values.


Thank you in advance.
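
A hedged aside on the reading step: declaring the column types up front
and handing kmeans() a numeric matrix rather than a data frame trims the
peak memory somewhat (the file path is the placeholder from the post, and
all-numeric columns are an assumption):

## colClasses skips read.table's type guessing; a numeric matrix is
## leaner than a data frame for kmeans()
x <- read.table('file path', sep = " ", colClasses = "numeric")
x <- as.matrix(x)
fit <- kmeans(x, centers = 25)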

-- 
Rami BATAL




Re: [R] Memory allocation problem (during kmeans)

2008-09-09 Thread Peter Dalgaard
rami batal wrote:

> I also tried starting R with the options that control the available
> memory, and the result is still the same; or maybe I am not assigning
> the correct values.

It might be a good idea first to work out what the actual memory
requirements are. 64 bits does not help if you are running out of RAM
(+swap).
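
A back-of-envelope version of that calculation, using only figures from
the posts (the dimensions of x are hypothetical; the 907.3 Mb is whatever
working vector kmeans() tried to allocate):

n <- 1e6; p <- 40             # hypothetical dimensions of x
n * p * 8 / 1024^2            # ~305 Mb for one dense double-precision copy
907.3 * 1024^2 / 8            # the failing vector alone is ~119 million doubles

Since kmeans() makes several working copies, peak demand can be a few
times the in-memory size of x, so the first step is confirming the machine
actually has that much free RAM and swap.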

-- 
   O__   Peter Dalgaard Øster Farimagsgade 5, Entr.B
  c/ /'_ --- Dept. of Biostatistics PO Box 2099, 1014 Cph. K
 (*) \(*) -- University of Copenhagen   Denmark  Ph:  (+45) 35327918
~~ - ([EMAIL PROTECTED])  FAX: (+45) 35327907



[R] Memory allocation problem with large dataset

2008-02-11 Thread Pamela Allen
Hello All, 

I have a problem when I try to run an nlme model with an added correlation
structure on a large dataset.  This is not surprising, but I am not sure
how to fix it.  I am using R 2.6.1, and I have had similar problems in
S-PLUS.

My dataset is mass growth data from the same 8 animals over 10 years: a
data.frame with 23,598 rows and 5 columns, for which object.size(data)
gives 664448 bytes.  My first model uses the von Bertalanffy growth
function:

vanfemfunc <- function(age.years, A, lk, t0)
  A * ((1 - exp(-exp(lk) * (age.years - t0)))^3)

vanfemfit1 <- nlme(model = mass ~ vanfemfunc(age.years, A, lk, t0),
                   data = vanfem, fixed = A + lk + t0 ~ 1,
                   start = c(A = 201.8, lk = -1.17, t0 = -2.71))

This fits fine, and has an object size of 3605660 bytes.  But when I try a
second model that adds an AR(1) correlation structure for the within-group
errors:

vanfemfit2 <- update(vanfemfit1, correlation = corAR1())

I receive the following error message:

Error: cannot allocate vector of size 392.9 Mb
In addition: Warning messages:
1: In corFactor.corAR1(object) :
  Reached total allocation of 1022Mb: see help(memory.size)

I have looked through the R-help archives, but nothing previously
suggested has worked so far.  I have a 32-bit Windows OS with 1 Gb RAM,
and I have also tried everything below on a 64-bit Windows OS with 2 Gb
RAM.  Here is what I have tried:

1.  Using --max-mem-size.  I changed the upper limit to 2047 Mb, the
maximum value allowed on both the 32-bit OS and the 64-bit OS.  The error
message for vanfemfit2 still reads "Error: cannot allocate vector of size
392.9 Mb".

2.  Using memory.limit(size=4095).  If I increase it any higher on either
the 32-bit or the 64-bit OS, I get the error message "Don't be silly!:
your machine has a 4 Gb address limit".
The model still fails with "Error: cannot allocate vector of size 392.9
Mb".  Calling memory.size(max=T) reports 1126.44 Mb on the 32-bit OS, but
438.38 Mb on the 64-bit OS.
Calling memory.size(max=F) after trying vanfemfit2 reports 1038.02 Mb on
the 32-bit OS and 417.90 Mb on the 64-bit OS.
If I call gc() and then memory.size(max=F) again, the output is 240.90 Mb
on the 32-bit OS and 13.69 Mb on the 64-bit OS.
If I call gc() and then try vanfemfit2 again, I get the same error message
on both computers.

I cannot split the data into smaller pieces to make this work, as I need
all of the data to produce a proper growth curve, although when I reduced
the data to 2,000 rows to test vanfemfit2, it did work.  Any advice would
be much appreciated.  Thank you!

-Pam Allen
MSc Candidate
[EMAIL PROTECTED]
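
A rough estimate of where the 392.9 Mb request may come from, under the
loud assumption that corAR1() materializes a dense per-group correlation
factor (the even split of observations across animals is also an
assumption):

n_obs    <- 23598                 # rows in the dataset (from the post)
n_groups <- 8                     # animals
m        <- n_obs / n_groups      # ~2950 observations per animal
m^2 * 8 / 1024^2                  # ~66 Mb per dense m x m double matrix
n_groups * m^2 * 8 / 1024^2       # ~531 Mb if all groups are held at once

If that assumption holds, the fit is genuinely close to the 1-2 Gb limits
of the machines described, which is consistent with the 2,000-row test
succeeding.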
