[R] Memory issue?

2009-01-27 Thread Daniel Brewer
I have a script that sometimes produces the following error:

Error in assign(".target", met...@target, envir = envir) :
  formal argument "envir" matched by multiple actual arguments

Do you think this is a memory issue?  I don't know what else it could be
as it doesn't always occur even if the script is run with exactly the
same data.

Does rm() actually free up memory?

Thanks

Dan

-- 
**
Daniel Brewer, Ph.D.

Institute of Cancer Research
Molecular Carcinogenesis
Email: daniel.bre...@icr.ac.uk
**


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Memory Issue

2007-09-19 Thread Yoni Stoffman
Hi, 

I'm new to R and there is something I'm missing about how it uses
memory. I'm doing a simple query (using the RODBC package), then
immediately set the data.frame to NULL, close the connection/channel,
and explicitly call the garbage collector (gc()). However, when I look
in the task monitor I see both "VM Size" and "Mem Usage" increase every
time (for RGui).

I tried this on several configurations: Windows XP x64 / Windows XP,
with R versions 2.4.1 and 2.5.1.

What am I doing wrong?

Thanks, 
Yoni.



Re: [R] Memory issue?

2009-01-27 Thread Paul Hiemstra

Daniel Brewer wrote:
> Error in assign(".target", met...@target, envir = envir) :
>   formal argument "envir" matched by multiple actual arguments
>
> Do you think this is a memory issue?

Hi,

There are multiple threads on this subject on the R-help list; googling
for "formal argument matched by multiple actual arguments" led me to:


http://tolstoy.newcastle.edu.au/R/help/05/08/10698.html

So this is probably not a memory issue. Freeing up memory can be done 
using gc().
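
For example, a minimal sketch (the object name is just an illustration):

```r
x <- numeric(1e7)  # a large-ish object, roughly 80 MB
rm(x)              # remove the binding...
invisible(gc())    # ...then ask the collector to return the freed pages
```

Note that gc() also runs automatically whenever R needs more memory, so
calling it by hand mainly matters when you want the process footprint to
shrink right away.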


cheers and hth,
Paul

--
Drs. Paul Hiemstra
Department of Physical Geography
Faculty of Geosciences
University of Utrecht
Heidelberglaan 2
P.O. Box 80.115
3508 TC Utrecht
Phone:  +31302535773
Fax:+31302531145
http://intamap.geo.uu.nl/~paul



Re: [R] Memory issue?

2009-01-28 Thread Ubuntu Diego
I had similar issues with memory use. After you rm() the big objects,
you should explicitly call gc() to run the garbage collector (the
free-memory routine).

D.



[R] R memory issue for writing out the file

2008-04-15 Thread Xiaojing Wang
Hello, all,

First thanks in advance for helping me.

I am now handling a data frame of 11,095,400 rows and 4 columns. It
seems to work perfectly in R on my Mac (Mac Pro, Intel chip, 4 GB RAM)
until I try to write the file out using the command:

write.table(all, file = "~/Desktop/alex.lgen", sep = " ",
            row.names = FALSE, na = "0", quote = FALSE, col.names = FALSE)

I got the error message:

R(319,0xa000d000) malloc: *** vm_allocate(size=88764416) failed (error
code=3)
R(319,0xa000d000) malloc: *** error: can't allocate region
R(319,0xa000d000) malloc: *** set a breakpoint in szone_error to debug


I then confirmed this in R on Windows (Windows XP, 1 GB RAM) by trying
it again. It seems to have to do with R's memory limit/allocation.

I read all the online help and still could not figure out how to solve
the problem. I also do not understand why the data can be handled easily
within R but cannot be written out because of insufficient memory. I am
not good with either R or computers, so sorry if my naive questions
sound bothersome.


-- 
Xiaojing WANG
Dept. of Human Genetics
Univ. of Pittsburgh, PA 15261
Tel: 412-624-8157




Re: [R] R memory issue for writing out the file

2008-04-15 Thread Martin Morgan
Hi Xiaojing,

That's a big table!

You might try 'write' (you'll have to work harder to get your data into 
an appropriate format).
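
For instance, a rough sketch with a tiny stand-in data frame (the real
one is of course far larger):

```r
# tiny stand-in for the real 11-million-row data frame
all <- data.frame(a = 1:3, b = c("x", "y", "z"),
                  c = c(0.1, 0.2, 0.3), d = 4:6)

# write() emits a vector column-major, so transpose the character matrix
# to get one data-frame row per output line
m <- t(as.matrix(all))
f <- tempfile()
write(m, file = f, ncolumns = ncol(all))
readLines(f)  # first line: "1 x 0.1 4"
```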

You might also try the R 2.7.0 release candidate, which I think is
available for the Mac here:

http://r.research.att.com/

There was a change in R 2.7.0 that makes writing large tables without
row names more efficient; this may well be where you are running into
problems.

Best,

Martin

Xiaojing Wang wrote:
> I am now handling a data frame, dimension 11095400 rows and 4 columns. It
> seems work perfect in my MAC R (Mac Pro, Intel Chip with 4G RAM) until I was
> trying to write this file out using the command:
> [...]

-- 
Martin Morgan
Computational Biology / Fred Hutchinson Cancer Research Center
1100 Fairview Ave. N.
PO Box 19024 Seattle, WA 98109

Location: Arnold Building M2 B169
Phone: (206) 667-2793



Re: [R] R memory issue for writing out the file

2008-04-15 Thread Henrik Bengtsson
Try writing the data.frame to file in blocks of rows by calling
write.table() multiple times; see the 'append' argument of
write.table(). That will probably require less memory.
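
A minimal sketch of the idea, with a toy data frame and an arbitrary
block size:

```r
all <- data.frame(a = 1:10, b = runif(10),
                  c = letters[1:10], d = 10:1)   # toy stand-in
f <- tempfile()
chunk <- 4                                       # rows per block; tune as needed
for (i in seq(1, nrow(all), by = chunk)) {
  rows <- i:min(i + chunk - 1, nrow(all))
  write.table(all[rows, ], file = f, sep = " ",
              append = (i > 1),                  # append after the first block
              row.names = FALSE, col.names = FALSE, quote = FALSE)
}
length(readLines(f))  # 10 -- the same lines as one big write.table() call
```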

/Henrik

On Tue, Apr 15, 2008 at 6:12 PM, Xiaojing Wang <[EMAIL PROTECTED]> wrote:
> I am now handling a data frame, dimension 11095400 rows and 4 columns. It
> seems work perfect in my MAC R (Mac Pro, Intel Chip with 4G RAM) until I was
> trying to write this file out using the command:
> [...]



Re: [R] R memory issue for writing out the file

2008-04-15 Thread jim holtman
What are you going to do with the table after you write it out?  Are
you just going to read it back into R?  If so, have you tried using
'save'?
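
For example (a toy sketch; if the table is only ever consumed by R
again, the binary format is both smaller and faster than a text file):

```r
all <- data.frame(a = 1:5, b = runif(5))   # toy stand-in
f <- tempfile(fileext = ".RData")
save(all, file = f)                        # compact, compressed binary
rm(all)
load(f)                                    # restores the object 'all' by name
nrow(all)  # 5
```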

On Tue, Apr 15, 2008 at 12:12 PM, Xiaojing Wang <[EMAIL PROTECTED]> wrote:
> I am now handling a data frame, dimension 11095400 rows and 4 columns. It
> seems work perfect in my MAC R (Mac Pro, Intel Chip with 4G RAM) until I was
> trying to write this file out using the command:
> [...]



-- 
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem you are trying to solve?
