If the data are numeric, storing just one copy will require 86000 * 2500 * 8 bytes = 1.7GB of memory. You should have 3-4X that much if you want to analyze it, so you might need about 6GB of physical memory and a 64-bit version of R. Is there some other alternative? Do you need all the values in memory at once, or could you use a database and access only the portions you need?
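As a quick sanity check, the arithmetic above can be reproduced in R itself (a rough sketch; R doubles take 8 bytes each, and `object.size` reports a small fixed overhead on top of the raw data):

```r
# One copy of an 86000 x 2500 numeric matrix, at 8 bytes per double:
bytes <- 86000 * 2500 * 8
bytes / 2^30        # about 1.6 GiB (~1.7 GB)

# The per-element cost can be verified on a small matrix:
# 1000 * 100 * 8 = 800000 bytes, plus a small amount of header overhead.
object.size(matrix(0, nrow = 1000, ncol = 100))
```

Keep in mind this is only the storage for a single copy; intermediate copies made during analysis are what drive the 3-4X recommendation.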
On 2/18/08, Federico Calboli <[EMAIL PROTECTED]> wrote:
> Hi All,
>
> is there a way of predicting memory usage?
>
> I need to build an array of 86000 by 2500 numbers (or I might create
> a list of 2 by 2500 arrays 43000 long). How much memory should I
> expect to use/need?
>
> Cheers,
>
> Fede
>
> --
> Federico C. F. Calboli
> Department of Epidemiology and Public Health
> Imperial College, St. Mary's Campus
> Norfolk Place, London W2 1PG
>
> Tel +44 (0)20 75941602 Fax +44 (0)20 75943193
>
> f.calboli [.a.t] imperial.ac.uk
> f.calboli [.a.t] gmail.com
>
> ______________________________________________
> R-help@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.

--
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem you are trying to solve?