Re: [R] memory issue

2017-05-02 Thread Jeff Newmiller
Suggestions...

Post plain text (you reduce your own chances of getting feedback by failing to 
do this in your email program)

Provide sample data and code

Buy more RAM

Use the data.table package and fread() (see the sketch below)

load and analyze subsets of data

Put the data into a database (e.g. sqlite?)
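
A minimal sketch of a few of these suggestions (the file name, column names and
column types below are assumptions, not taken from your post):

## faster, leaner import than read.table
library(data.table)
dt <- fread("bigfile.csv")

## read only what you need: a few columns and the first million rows
sub <- fread("bigfile.csv", select = c("id", "value"), nrows = 1e6)

## or park the data in SQLite and pull back only aggregates/subsets
library(DBI)
con <- dbConnect(RSQLite::SQLite(), "bigfile.sqlite")
dbWriteTable(con, "big", sub)
res <- dbGetQuery(con, "SELECT id, AVG(value) AS mean_value FROM big GROUP BY id")
dbDisconnect(con)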

If these suggestions seem brief, or even if they don't, please be more explicit 
in your question. Read [1] and [2].

[1] 
http://stackoverflow.com/questions/5963269/how-to-make-a-great-r-reproducible-example

[2] http://adv-r.had.co.nz/Reproducibility.html
-- 
Sent from my phone. Please excuse my brevity.

On May 2, 2017 12:09:21 PM PDT, Amit Sengupta via R-help  
wrote:
>Hi, I am unable to read a 2.4 GB file into a table (using read.table)
>in a 64-bit R environment. Do you have any suggestions? Amit
>
>
>__
>R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>https://stat.ethz.ch/mailman/listinfo/r-help
>PLEASE do read the posting guide
>http://www.R-project.org/posting-guide.html
>and provide commented, minimal, self-contained, reproducible code.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] memory issue

2017-05-02 Thread Amit Sengupta via R-help
Hi, I am unable to read a 2.4 GB file into a table (using read.table) in a
64-bit R environment. Do you have any suggestions? Amit


__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] R Memory Issue

2016-02-17 Thread Sandeep Rana
Hi,
Maybe R is still reading your file, which can take a while depending on the size
of the file you are reading.
Please explore the 'data.table' package to read big files in a few seconds.

If you try to close the application while execution has been in progress for
some time, shutting down will usually take a long time.
Instead, end the r_session process from Task Manager, which is immediate.

Regards,
Sandeep S. Rana


> On 17-Feb-2016, at 2:46 PM, SHIVI BHATIA  wrote:
> 
> Dear Team,
> 
> 
> 
> Every now and then I face some weird issues with R. For instance it would
> not read my csv file or any other read.table command and once I would close
> the session and reopen again it works fine. 
> 
> 
> 
> It have tried using rm(list=ls()) & gc() to free some memory and restart R
> 
> 
> 
> 
> Also today while closing the R session it took more than 10 minutes. I am
> not sure as to what is leading to this. Kindly throw some light on this. Not
> sure if I have provided enough information.  
> 
> 
> 
> Thanks, Shivi
> 
> Mb: 9891002021
> 
> 
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

Re: [R] R Memory Issue

2016-02-17 Thread PIKAL Petr
Hi

I have this enhanced ls function, which evaluates the size of the objects
sitting in my environment, whether generated by myself or by other functions.

ls.objects <- function (pos = 1, pattern, order.by)
{
    ## apply 'fn' to each named object fetched from search position 'pos'
    napply <- function(names, fn) sapply(names, function(x) fn(get(x, pos = pos)))
    names <- ls(pos = pos, pattern = pattern)
    obj.class <- napply(names, function(x) as.character(class(x))[1])
    obj.mode <- napply(names, mode)
    obj.type <- ifelse(is.na(obj.class), obj.mode, obj.class)
    obj.size <- napply(names, object.size)
    obj.dim <- t(napply(names, function(x) as.numeric(dim(x))[1:2]))
    ## for dimensionless objects (other than functions), report length as "Rows"
    vec <- is.na(obj.dim)[, 1] & (obj.type != "function")
    obj.dim[vec, 1] <- napply(names, length)[vec]
    out <- data.frame(obj.type, obj.size, obj.dim)
    names(out) <- c("Type", "Size", "Rows", "Columns")
    if (!missing(order.by))
        out <- out[order(out[[order.by]]), ]
    out
}
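
A typical call, sorting the listing so the largest objects come last (the
column name "Size" is one of those set above):

ls.objects(order.by = "Size")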

Lengthy R closing can be due to such big objects, e.g. those generated by
strucchange functions.

However, it may have other reasons. Without more information from your side it
is difficult to give a definite answer.

Petr


> -Original Message-
> From: R-help [mailto:r-help-boun...@r-project.org] On Behalf Of SHIVI
> BHATIA
> Sent: Wednesday, February 17, 2016 10:16 AM
> To: r-help@r-project.org
> Subject: [R] R Memory Issue
>
> Dear Team,
>
>
>
> Every now and then I face some weird issues with R. For instance it
> would not read my csv file or any other read.table command and once I
> would close the session and reopen again it works fine.
>
>
>
> It have tried using rm(list=ls()) & gc() to free some memory and
> restart R <Cntrl+Shft+F10>
>
>
>
> Also today while closing the R session it took more than 10 minutes. I
> am not sure as to what is leading to this. Kindly throw some light on
> this. Not
> sure if I have provided enough information.
>
>
>
> Thanks, Shivi
>
> Mb: 9891002021
>
>




[R] R Memory Issue

2016-02-17 Thread SHIVI BHATIA
Dear Team,

 

Every now and then I face some weird issues with R. For instance, it will not
read my csv file (or any other read.table command), and once I close the
session and reopen it, everything works fine.

 

I have tried using rm(list=ls()) & gc() to free some memory and restarting R
with <Cntrl+Shft+F10>.

 

Also, closing the R session today took more than 10 minutes. I am not sure what
is leading to this. Kindly throw some light on it; I am not sure if I have
provided enough information.

 

Thanks, Shivi

Mb: 9891002021

 

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

Re: [R] Memory issue with svm modeling in R

2012-10-23 Thread Jessica Streicher
Well, I'm no expert on these topics, but if the data are 2.7 GB and R can use
at most 2 GB, then the easiest solution would be giving R more memory. Did you
read through help(memory.size), as the error suggested?

Try calling memory.size(T) or memory.limit(3000) and see if it works.

I don't have any experience with either RStudio or Amazon's offerings. The local
system seems to be Windows, so the above might work there; I don't know about
the other. You might need to change the memory limit at startup of the console
if it is not Windows.

On 22.10.2012, at 10:18, Vignesh Prajapati wrote:

 Hello Jessica,
 
 Thanks for inform this and  very sorry for inconvenience, Here I have 
 attached two Files
 1.  crash.png- For Issue with Amazon Instance
 2.  localmachine_error.bmp - for Issue with local machine
 
 Thanks
 
 On Mon, Oct 22, 2012 at 1:42 PM, Jessica Streicher j.streic...@micromata.de 
 wrote:
 Hello Vignesh, we did not get any attachments, maybe you could upload them 
 somewhere?
 
 On 19.10.2012, at 09:46, Vignesh Prajapati wrote:
 
  As I found the memory problem with local machine/micro instance(amazon) for
  building SVM model in R on large dataset(2,01,478 rows with 11 variables),
  then I have migrated our micro instance to large instance at Amazon. Still
  I have memory issue with large amazon instance while developing R model for
  this dataset due to large size. I have attached the snap of error with
  local machine(localmachine_error.bmp) and amazon instance(crash.png ) with
  this post.
 
  Issue on local Machine ::
 
  [image: enter image description here]
 
  Issue on Amazon large Instance ::
 
  [image: enter image description here]
 
  Can any one suggest me for the solution of this issue.?
 
  Thanks
 
  Vignesh
 
 
  __
  R-help@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-help
  PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
  and provide commented, minimal, self-contained, reproducible code.
 
 
 crash.pnglocalmachine_error.bmp

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory issue with svm modeling in R

2012-10-22 Thread Jessica Streicher
Hello Vignesh, we did not get any attachments, maybe you could upload them 
somewhere?

On 19.10.2012, at 09:46, Vignesh Prajapati wrote:

 As I found the memory problem with local machine/micro instance(amazon) for
 building SVM model in R on large dataset(2,01,478 rows with 11 variables),
 then I have migrated our micro instance to large instance at Amazon. Still
 I have memory issue with large amazon instance while developing R model for
 this dataset due to large size. I have attached the snap of error with
 local machine(localmachine_error.bmp) and amazon instance(crash.png ) with
 this post.
 
 Issue on local Machine ::
 
 [image: enter image description here]
 
 Issue on Amazon large Instance ::
 
 [image: enter image description here]
 
 Can any one suggest me for the solution of this issue.?
 
 Thanks
 
 Vignesh
 
 
 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Memory issue with svm modeling in R

2012-10-19 Thread Vignesh Prajapati
As I found a memory problem on the local machine/micro instance (Amazon) when
building an SVM model in R on a large dataset (201,478 rows with 11 variables),
I migrated our micro instance to a large instance at Amazon. I still have a
memory issue with the large Amazon instance while developing the R model for
this dataset, due to its size. I have attached snapshots of the error on the
local machine (localmachine_error.bmp) and on the Amazon instance (crash.png)
with this post.

Issue on local machine:

 [image: enter image description here]

Issue on Amazon large instance:

[image: enter image description here]

Can anyone suggest a solution to this issue?

Thanks

Vignesh


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Memory issue. XXXX

2012-03-02 Thread Dan Abner
Hi everyone,

Any ideas on troubleshooting this memory issue:

d1 <- read.csv("arrears.csv")
Error: cannot allocate vector of size 77.3 Mb
In addition: Warning messages:
1: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)
2: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)
3: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)
4: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)


Thanks!

Dan

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory issue. XXXX

2012-03-02 Thread Sarah Goslee
Let's see...

You could delete objects from your R session.
You could buy more RAM.
You could see help(memory.size).
You could try googling to see how others have dealt with memory
management in R, a process which turns up useful information like
this: http://www.r-bloggers.com/memory-management-in-r-a-few-tips-and-tricks/

You could provide the information on your system requested in the posting guide.

Sarah

On Fri, Mar 2, 2012 at 9:57 AM, Dan Abner dan.abne...@gmail.com wrote:
 Hi everyone,

 Any ideas on troubleshooting this memory issue:

 d1 <- read.csv("arrears.csv")
 Error: cannot allocate vector of size 77.3 Mb
 In addition: Warning messages:
 1: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)
 2: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)
 3: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)
 4: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)


 Thanks!

 Dan


-- 
Sarah Goslee
http://www.functionaldiversity.org

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory issue. XXXX

2012-03-02 Thread steven mosher
1. How much RAM do you have (looks like 2GB)? If you have more than 2GB,
then you can allocate more memory with memory.size().

2. If you have 2GB or less, then you have a couple of options (sketched below):

 a) make sure your session is clean of unnecessary objects.
 b) Don't read in all the data if you don't need to (see colClasses to
control this).
 c) use the bigmemory package or ff package.
 d) buy more RAM
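
Rough sketches of options (a)-(c), assuming the file from the error above; the
column classes are invented for illustration:

## (a) clear unneeded objects, then garbage-collect
rm(list = setdiff(ls(), "d1"))
gc()

## (b) drop columns you do not need: "NULL" entries in colClasses skip them
d1 <- read.csv("arrears.csv",
               colClasses = c("integer", "NULL", "numeric", "NULL", "character"))

## (c) file-backed objects that need not fit in RAM
library(ff)
d1ff <- read.csv.ffdf(file = "arrears.csv")
library(bigmemory)                     # numeric-only data
d1bm <- read.big.matrix("arrears.csv", header = TRUE, type = "double")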


On Fri, Mar 2, 2012 at 6:57 AM, Dan Abner dan.abne...@gmail.com wrote:

 Hi everyone,

 Any ideas on troubleshooting this memory issue:

  d1 <- read.csv("arrears.csv")
 Error: cannot allocate vector of size 77.3 Mb
 In addition: Warning messages:
 1: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)
 2: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)
 3: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)
 4: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)


 Thanks!

 Dan

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide
 http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.



__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory issue. XXXX

2012-03-02 Thread Prof Brian Ripley

On 02/03/2012 23:36, steven mosher wrote:

1. How much RAM do you have (looks like 2GB ) . If you have more than 2GB
then you can allocate
 more memory with memory.size()


Actually, this looks like 32-bit Windows (unstated), so you cannot.  See 
the rw-FAQ for things your sysadmin can do even there.



2. If you have 2GB or less then you have a couple options

 a) make sure your session is clean of unnecessary objects.
 b) Dont read in all the data if you dont need to ( see colClasses  to
control this )
 c) use the bigmemory package or ff package
 d) buy more RAM


Most importantly, use a 64-bit OS to get a larger real address space. 
(bigmemory and ff are mainly palliative measures for those whose OS does 
not provide a good implementation of out-of-memory objects).




On Fri, Mar 2, 2012 at 6:57 AM, Dan Abner dan.abne...@gmail.com wrote:


Hi everyone,

Any ideas on troubleshooting this memory issue:


d1 <- read.csv("arrears.csv")

Error: cannot allocate vector of size 77.3 Mb
In addition: Warning messages:
1: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)
2: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)
3: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)
4: In class(data) <- "data.frame" :
  Reached total allocation of 1535Mb: see help(memory.size)


Thanks!

Dan

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide
http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.




__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.



--
Brian D. Ripley,  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UKFax:  +44 1865 272595

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Issue

2010-08-24 Thread Cuckovic Paik

Thanks for the constructive comments. I was very careful when I wrote the code.
I wrote many functions and then wrapped them up into a single function.
Originally I used optim() to get the MLE; it was at least 10 times slower than
the code based on Newton's method. I also vectorized all objects whenever
possible.
-- 
View this message in context: 
http://r.789695.n4.nabble.com/Memory-Issue-tp2335860p2336687.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Memory Issue

2010-08-23 Thread Cuckovic Paik

Dear All,

I have an issue with memory use in R programming.

Here is the brief story: I want to simulate the power of a nonparametric
test and compare it with existing tests. The basic steps are:

1. I need to use Newton's method to obtain the nonparametric MLE, which involves
the inversion of a large n-by-n matrix (n = sample size); it takes less than 3
seconds on average to get the MLE.


2. Since the test statistic has an unknown sampling distribution, the p-value
is simulated using Monte Carlo (1000 runs). It takes about 3-4 minutes to
get a p-value.


3. I need to simulate 1000 random samples and repeat steps 1 and 2 to get
the p-value for each simulated sample, which gives the power of the test.


Here is the question:

It initially completes 5-6 simulations per hour; after that, the time needed
to complete a single simulation increases exponentially. After 24 hours of
running, I only get about 15-20 simulations completed. My computer is a PC
(Pentium Dual Core CPU 2.5 GHz, RAM 6.00 GB, 64-bit). Apparently, memory is
the problem.

I also tried various memory re-allocation procedures; they didn't work. Can
anybody help with this? Thanks in advance.


-- 
View this message in context: 
http://r.789695.n4.nabble.com/Memory-Issue-tp2335860p2335860.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Issue

2010-08-23 Thread Dennis Murphy
Hi:

Are you running 32-bit or 64-bit R? For memory-intensive processes like
these, 64-bit R is almost a necessity. You might also look into more
efficient ways to invert the matrix, especially if it has special properties
that can be exploited (e.g., symmetry). More to the point, you want to
compute the nonparametric MLE as efficiently as you can, since it affects
everything downstream. In addition, if you're trying to do all of this in a
single function, it may be better to break the job up into several
functions, one for each task, with a wrapper function to put them together
(i.e., modularize).

Memory problems in R often arise from repeatedly copying objects in memory
while accumulating promises in a loop that do not get evaluated until the
end. Forcing evaluations or performing garbage collection at judicious
points can improve efficiency. Pre-allocating memory to result objects is
more efficient than adding a new element to an output vector or matrix every
iteration. Vectorizing where you can is critical.

Since you didn't provide any code, one is left to speculate where the
bottleneck(s) in your code lie(s), but here's a little example I did for
someone recently that shows how much vectorization and pre-allocation of
memory can make a difference:

# Problem: Simulate 1000 U(0, 1) random numbers, discretize them
# into a factor and generate a table.

# vectorized version using cut()
f <- function() {
  x <- runif(1000)
  z <- cut(x, breaks = c(-0.1, 0.1, 0.2, 0.4, 0.7, 0.9, 1), labels = 1:6)
  table(z)
}

# use ifelse(), a vectorized function, to divide into groups
g <- function() {
  x <- runif(1000)
  z <- ifelse(x <= 0.1, '1',
       ifelse(x > 0.1 & x <= 0.2, '2',
       ifelse(x > 0.2 & x <= 0.4, '3',
       ifelse(x > 0.4 & x <= 0.7, '4',
       ifelse(x > 0.7 & x <= 0.9, '5', '6')))))
  table(z)
}

# Elementwise loop with preallocation of memory
h <- function() {
  x <- runif(1000)
  z <- character(1000)   #  <== preallocate the result vector
  for(i in 1:1000) {
    z[i] <- if(x[i] <= 0.1) '1' else
            if(x[i] > 0.1 & x[i] <= 0.2) '2' else
            if(x[i] > 0.2 & x[i] <= 0.4) '3' else
            if(x[i] > 0.4 & x[i] <= 0.7) '4' else
            if(x[i] > 0.7 & x[i] <= 0.9) '5' else '6'
  }
  table(z)
}

# Same as h() w/o memory preallocation
h2 <- function() {
  x <- runif(1000)
  for(i in 1:1000) {
    z[i] <- if(x[i] <= 0.1) '1' else
            if(x[i] > 0.1 & x[i] <= 0.2) '2' else
            if(x[i] > 0.2 & x[i] <= 0.4) '3' else
            if(x[i] > 0.4 & x[i] <= 0.7) '4' else
            if(x[i] > 0.7 & x[i] <= 0.9) '5' else '6'
  }
  table(z)
}

# Same as h(), but initialize with an empty vector
h3 <- function() {
  x <- runif(1000)
  z <- character(0)    # empty vector
  for(i in 1:1000) {
    z[i] <- if(x[i] <= 0.1) '1' else
            if(x[i] > 0.1 & x[i] <= 0.2) '2' else
            if(x[i] > 0.2 & x[i] <= 0.4) '3' else
            if(x[i] > 0.4 & x[i] <= 0.7) '4' else
            if(x[i] > 0.7 & x[i] <= 0.9) '5' else '6'
  }
  table(z)
}

## Timings using the function replicate():

> system.time(replicate(1000, f()))
   user  system elapsed
   1.14    0.04    1.20
> system.time(replicate(1000, g()))
   user  system elapsed
   3.90    0.00    3.92
> system.time(replicate(1000, h()))
   user  system elapsed
   9.24    0.00    9.26
> system.time(replicate(1000, h2()))
   user  system elapsed
  15.49    0.00   15.55
> system.time(replicate(1000, h3()))
   user  system elapsed
  15.60    0.03   15.68

The vectorized version using cut() is over three times as fast as the vectorized
ifelse() approach, and the vectorized ifelse() is well over twice as fast as the
preallocated-memory, non-vectorized approach. The h* functions are all
non-vectorized, but differ in the way they initialize memory for output objects.
Full preallocation of memory (h) takes about 60% as long as the
non-preallocated memory versions. Initializing an empty vector is about as
fast as no initialization at all. The effects of vectorization and the use
of pre-allocated memory for result objects filled in a loop are clear.

If you're carrying around copies of a large n x n matrix in memory over a
number of iterations of a loop, you are certainly going to gobble up
available memory, no matter how much you have. You can see the result in a
much simpler problem above. I'd recommend that you invest some time
improving the efficiency of the MLE function. Profiling tools like Rprof()
are one place to start - you can find tutorial material on the web in various
places on the topic (try Googling 'Profiling R functions'), as well as some
past discussion in this forum. Use RSiteSearch() and/or search the mail
archives for information there.
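
A bare-bones profiling run (the output file name is arbitrary) looks like:

Rprof("mle_profile.out")         # start collecting timing samples
res <- replicate(10, f())        # run the code you want to profile
Rprof(NULL)                      # stop profiling
summaryRprof("mle_profile.out")  # see where the time is going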

HTH,
Dennis

On Mon, Aug 23, 2010 at 2:44 PM, Cuckovic Paik cuckovic.p...@gmail.com wrote:


 Dear All,

 I have an issue on memory use in R programming.

 Here is the brief story: I want to simulate the power of a nonparameteric
 test and compare it with the existing tests. The basic steps are


[R] Memory issue

2010-05-05 Thread Alex van der Spek
Reading a flat text file 138 Mbyte large into R with a combination of 
scan (to get the header) and read.table. After conversion of text time 
stamps to POSIXct and conversion of integer codes to factors I convert 
everything into one data frame and release the old structures containing 
the data by using rm().


Strangely, rm() does not appear to reduce the used memory. I checked
using memory.size(). Worse still, the amount of memory required grows.
When I save an image, the .RData image file is only 23 Mbyte, yet at some
point in the program, after having done nothing particularly
difficult (two- and three-way frequency tables and some lattice graphs),
the amount of memory in use is over 1 Gbyte.


Not yet a problem, but it will become a problem. This is using R2.10.0 
on Windows Vista.


Does anybody know how to release memory as rm(dat) does not appear to do 
this properly.


Regards,
Alex van der Spek

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory issue

2010-05-05 Thread Prof Brian Ripley

On Wed, 5 May 2010, Alex van der Spek wrote:

Reading a flat text file 138 Mbyte large into R with a combination of scan 
(to get the header) and read.table. After conversion of text time stamps to 
POSIXct and conversion of integer codes to factors I convert everything into 
one data frame and release the old structures containing the data by using 
rm().


Strangely, the rm() does not appear to reduce the used memory. I checked 
using memory.size(). Worse still, the amount of memory required grows. When I 
save an image the .RData image file is only 23 Mbyte, yet at some point in to 
the program, after having done nothing particularly difficult (two and three 
way frequency tables and some lattice graphs) the amount of memory in use is 
over 1 Gbyte.


Not yet a problem, but it will become a problem. This is using R2.10.0 on 
Windows Vista.


Does anybody know how to release memory as rm(dat) does not appear to do this 
properly.


Rather, you do not appear to understand 'properly'.

First, you need to garbage-collect to find how much memory is 
available for re-use.  R does that internally as needed, but you can 
force it with gc().


Second, there is simply no reason for R not to use 'over 1 Gbyte' if 
it is available (and it was).  Using lots of memory is faster, but the 
garbage collector will clean up when needed.  The likely bottleneck 
for you is not the amount of memory used but fragmentation of the 
limited address space on 32-bit Windows.  See the documentation 


Third, the .RData file is (by default) compressed.

And fourth, 'releasing memory' usually means giving it back to the OS. 
That is an implementation detail and C runtime memory managers on many 
builds of R either never do so or do so tardily.  This is again not an 
issue unless your system is short of virtual memory and given how 
cheap disc space is, there is no reason to be so.




Regards,
Alex van der Spek

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.



--
Brian D. Ripley,  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UKFax:  +44 1865 272595

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory issue

2010-05-05 Thread Alex van der Spek

Thank you all,

No offense meant. I like R tremendously but I admit I am only a 
beginner. I did not know about gc(), but it explains my confusion about 
rm() not doing what I expected it to do.


I suspected that .RData was a compressed file. Thanks for the
confirmation. As for Windows, unfortunately it is not up to me to choose
the system.


Alex van der Spek

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory issue

2010-05-05 Thread kMan
Dear Alex,

Has manual garbage collection had any effect?

Sincerely,
KeithC.

-Original Message-
From: Alex van der Spek [mailto:do...@xs4all.nl] 
Sent: Wednesday, May 05, 2010 3:48 AM
To: r-help@r-project.org
Subject: [R] Memory issue

Reading a flat text file 138 Mbyte large into R with a combination of scan
(to get the header) and read.table. After conversion of text time stamps to
POSIXct and conversion of integer codes to factors I convert everything into
one data frame and release the old structures containing the data by using
rm().

Strangely, the rm() does not appear to reduce the used memory. I checked
using memory.size(). Worse still, the amount of memory required grows. 
When I save an image the .RData image file is only 23 Mbyte, yet at some
point in to the program, after having done nothing particularly difficult
(two and three way frequency tables and some lattice graphs) the amount of
memory in use is over 1 Gbyte.

Not yet a problem, but it will become a problem. This is using R2.10.0 on
Windows Vista.

Does anybody know how to release memory as rm(dat) does not appear to do
this properly.

Regards,
Alex van der Spek

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] R memory issue / quantreg

2010-01-28 Thread Dan Rabosky

Hi -

I also posted this on r-sig-ecology to little fanfare, so I'm trying  
here. I've recently hit an apparent R issue that I cannot resolve (or  
understand, actually).

I am using the quantreg package (quantile regression) to fit a vector  
of quantiles to a dataset, approx 200-400 observations. To  
accommodate some autocorrelation issues, I have to assess  
significance with randomization. The problem is that I consistently  
observe what appears to be a memory problem causing an R crash. The  
problem occurs within a local function I am using to (i) randomize  
the data and (ii) run quantile regression on the randomized dataset.
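
For concreteness, the inner step looks roughly like the following sketch (the
data frame, formula and quantile grid are invented for illustration, not taken
from the actual analysis):

library(quantreg)
taus <- seq(0.05, 0.95, by = 0.05)     # vector of quantiles to fit

rand_fit <- function(dat) {
  dat$y <- sample(dat$y)               # (i) randomize the response
  rq(y ~ x, tau = taus, data = dat)    # (ii) refit all quantiles at once
}

## e.g. a null distribution from 999 randomizations of a hypothetical data set
null_fits <- replicate(999, rand_fit(my_data), simplify = FALSE)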

The crash only occurs (or so it seems) when I try to send rq() [ = the
quantreg workhorse function ] a vector of quantiles to fit. Even when
I use the same random number seed, the crash occurs on different  
iterations of the simulation. It sometimes occurs before rq() is  
called within the local function, and sometimes after rq() is called  
within the local function. Sometimes it occurs after returned to the  
main function. It does occur at approximately (but not necessarily)  
the same iteration, though.

I cannot explain this. I consider this to be a fairly small dataset;  
others use this with many thousands of points. And why does this  
occur at roughly the same iteration every time? That would suggest  
that the memory issue is cumulative - shouldn't any memory consumed  
within rq(...) be freed up after I return???

This is occurring with R 2.10.1 on a 64 bit machine running OSX  
10.6.2 (6 GB RAM).

Thanks!
~Dan Rabosky







__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory issue?

2009-01-28 Thread Ubuntu Diego
I had similar issues with memory occupancy. You should explicitly call
gc() to run the garbage collector (the free-memory routine) after you
rm() the big objects, e.g. as below.
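
In code, with a hypothetical large object:

rm(big_df)   # drop the reference to the big object
gc()         # force a garbage collection; prints a summary of Ncells/Vcells use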

D.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Memory issue?

2009-01-27 Thread Daniel Brewer
I have a script that sometimes produces the following error:

Error in assign(.target, met...@target, envir = envir) :
  formal argument envir matched by multiple actual arguments

Do you think this is a memory issue?  I don't know what else it could be
as it doesn't always occur even if the script is run with exactly the
same data.

Does rm() actually free up memory?

Thanks

Dan

-- 
**
Daniel Brewer, Ph.D.

Institute of Cancer Research
Molecular Carcinogenesis
Email: daniel.bre...@icr.ac.uk
**


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory issue?

2009-01-27 Thread Paul Hiemstra

Daniel Brewer wrote:

I have a script that sometimes produces the following error:

Error in assign(.target, met...@target, envir = envir) :
  formal argument envir matched by multiple actual arguments

Do you think this is a memory issue?  I don't know what else it could be
as it doesn't always occur even if the script is run with exactly the
same data.

Does rm() actually free up memory?

Thanks

Dan

  

Hi,

There are multiple threads on this subject on the R-help list; googling
for "formal argument matched by multiple actual arguments" led me to:


http://tolstoy.newcastle.edu.au/R/help/05/08/10698.html

So this is probably not a memory issue. Freeing up memory can be done 
using gc().


cheers and hth,
Paul

--
Drs. Paul Hiemstra
Department of Physical Geography
Faculty of Geosciences
University of Utrecht
Heidelberglaan 2
P.O. Box 80.115
3508 TC Utrecht
Phone:  +31302535773
Fax:+31302531145
http://intamap.geo.uu.nl/~paul

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] R memory issue for writing out the file

2008-04-15 Thread Xiaojing Wang
Hello, all,

First thanks in advance for helping me.

I am now handling a data frame with 11,095,400 rows and 4 columns. It
seems to work perfectly in my Mac R (Mac Pro, Intel chip with 4 GB RAM) until
I try to write this file out using the command:

write.table(all, file = "~/Desktop/alex.lgen", sep = " ",
            row.names = F, na = "0", quote = F, col.names = F)

I got the error message:

R(319,0xa000d000) malloc: *** vm_allocate(size=88764416) failed (error
code=3)
R(319,0xa000d000) malloc: *** error: can't allocate region
R(319,0xa000d000) malloc: *** set a breakpoint in szone_error to debug


I then confirmed it in Windows R (Windows XP, 1 GB RAM) by trying again. It
seems that it has to do with my R memory limit allocation.

I read all the online help and still could not figure out a way to solve
the problem. I also do not understand why the data can be easily handled
within R but cannot be written out, due to insufficient memory. I am not
good at either R or computers. Sorry for my naive questions if they sound
bothersome.


-- 
Xiaojing WANG
Dept. of Human Genetics
Univ. of Pittsburgh, PA 15261
Tel: 412-624-8157


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] R memory issue for writing out the file

2008-04-15 Thread Martin Morgan
Hi Xiaojing,

That's a big table!

You might try 'write' (you'll have to work harder to get your data into 
an appropriate format).

You might also try the R-2.7 release candidate, which I think is 
available here

http://r.research.att.com/

for the Mac. There was a change in R-2.7 that will make writing large
tables without row names more efficient; this might well be where you
are running into problems.

Best,

Martin

Xiaojing Wang wrote:
 Hello, all,
 
 First thanks in advance for helping me.
 
 I am now handling a data frame, dimension 11095400 rows and 4 columns. It
 seems work perfect in my MAC R (Mac Pro, Intel Chip with 4G RAM) until I was
 trying to write this file out using the command:
 
 write.table(all, file = "~/Desktop/alex.lgen", sep = " ",
             row.names = F, na = "0", quote = F, col.names = F)
 
 I got the error message:
 
 R(319,0xa000d000) malloc: *** vm_allocate(size=88764416) failed (error
 code=3)
 R(319,0xa000d000) malloc: *** error: can't allocate region
 R(319,0xa000d000) malloc: *** set a breakpoint in szone_error to debug
 
 
 I then confirmed in Windows (Windows XP, 1G RAM) R by trying it again. It
 seems that it has to do with my R memory limit allocation.
 
 I read all the online help and still could not figure out the way to solve
 the problem. Also I do not understand why the data could be easily handled
 within R but could not write out due to the insufficient memory. I am not
 good at both R and computers.  Sorry for my naive questions if it sounds
 bothersome.
 
 


-- 
Martin Morgan
Computational Biology / Fred Hutchinson Cancer Research Center
1100 Fairview Ave. N.
PO Box 19024 Seattle, WA 98109

Location: Arnold Building M2 B169
Phone: (206) 667-2793

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] R memory issue for writing out the file

2008-04-15 Thread Henrik Bengtsson
Try to write the data.frame to file in blocks of rows by calling
write.table() multiple times - see argument 'append' for
write.table().  That will probably require less memory.
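
Something along these lines (an untested sketch; 'all' is your data frame and
the chunk size is arbitrary):

chunk  <- 1e6
starts <- seq(1, nrow(all), by = chunk)
for (s in starts) {
  rows <- s:min(s + chunk - 1, nrow(all))
  write.table(all[rows, ], file = "~/Desktop/alex.lgen", sep = " ",
              row.names = FALSE, col.names = FALSE, quote = FALSE, na = "0",
              append = (s > 1))   # append every chunk after the first
}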

/Henrik

On Tue, Apr 15, 2008 at 6:12 PM, Xiaojing Wang [EMAIL PROTECTED] wrote:
 Hello, all,

  First thanks in advance for helping me.

  I am now handling a data frame, dimension 11095400 rows and 4 columns. It
  seems work perfect in my MAC R (Mac Pro, Intel Chip with 4G RAM) until I was
  trying to write this file out using the command:

  write.table(all, file = "~/Desktop/alex.lgen", sep = " ",
              row.names = F, na = "0", quote = F, col.names = F)

  I got the error message:

  R(319,0xa000d000) malloc: *** vm_allocate(size=88764416) failed (error
  code=3)
  R(319,0xa000d000) malloc: *** error: can't allocate region
  R(319,0xa000d000) malloc: *** set a breakpoint in szone_error to debug


  I then confirmed in Windows (Windows XP, 1G RAM) R by trying it again. It
  seems that it has to do with my R memory limit allocation.

  I read all the online help and still could not figure out the way to solve
  the problem. Also I do not understand why the data could be easily handled
  within R but could not write out due to the insufficient memory. I am not
  good at both R and computers.  Sorry for my naive questions if it sounds
  bothersome.


  --
  Xiaojing WANG
  Dept. of Human Genetics
  Univ. of Pittsburgh, PA 15261
  Tel: 412-624-8157


  __
  R-help@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-help
  PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
  and provide commented, minimal, self-contained, reproducible code.


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] R memory issue for writing out the file

2008-04-15 Thread jim holtman
What are you going to do with the table after you write it out?  Are
you just going to read it back into R?  If so, have you tried using
'save'?
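
That is, roughly (the file name is just an example):

save(all, file = "alex.RData")   # binary, compressed, keeps all column types
## ... later, possibly in a fresh session:
load("alex.RData")               # restores the object 'all'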

On Tue, Apr 15, 2008 at 12:12 PM, Xiaojing Wang [EMAIL PROTECTED] wrote:
 Hello, all,

 First thanks in advance for helping me.

 I am now handling a data frame, dimension 11095400 rows and 4 columns. It
 seems work perfect in my MAC R (Mac Pro, Intel Chip with 4G RAM) until I was
 trying to write this file out using the command:

 write.table(all, file = "~/Desktop/alex.lgen", sep = " ",
             row.names = F, na = "0", quote = F, col.names = F)

 I got the error message:

 R(319,0xa000d000) malloc: *** vm_allocate(size=88764416) failed (error
 code=3)
 R(319,0xa000d000) malloc: *** error: can't allocate region
 R(319,0xa000d000) malloc: *** set a breakpoint in szone_error to debug


 I then confirmed in Windows (Windows XP, 1G RAM) R by trying it again. It
 seems that it has to do with my R memory limit allocation.

 I read all the online help and still could not figure out the way to solve
 the problem. Also I do not understand why the data could be easily handled
 within R but could not write out due to the insufficient memory. I am not
 good at both R and computers.  Sorry for my naive questions if it sounds
 bothersome.


 --
 Xiaojing WANG
 Dept. of Human Genetics
 Univ. of Pittsburgh, PA 15261
 Tel: 412-624-8157


 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.




-- 
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem you are trying to solve?

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.