On Mon, 3 Mar 2008, Jose Funes wrote:

> Dear members,
>
> I would like to share some of the problems that I have run into in R
> when importing ASCII files. I describe the problem below; if any of you
> have experienced similar issues, I will appreciate your support.
>
> I am working with raster data (maps) and importing it as ASCII files
> into R. I am stacking 9 layers (~10 MB each, 90 MB total). However,
> when stacking the eighth layer I get the following message: "Error:
> cannot allocate vector of size 10.8 Mb". I did a little bit of reading
> about this, and some suggestions are to increase the memory allocation.
> I increased it to 3 GB using the following command:
> memory.limit(size=3000), but the problem still persists.
Which platform? Windows? Its memory performance is systematically worse
than that of other operating systems on the same hardware. When you say
10 MB, is this 8 bytes * roughly 1.2 million cells? Are you running gc()
between calls to readAsciiGrid() to force garbage collection? While
readAsciiGrid() is working, I think that at least three copies of the
data are in memory. Have you tried using rgdal? Maybe readGDAL() keeps
fewer internal copies (in readGDAL that may be at least two copies, but
I'm not sure, since NAs are handled by GDAL). Because the input needs to
be converted from representation to representation, multiple copies
cannot be avoided entirely.

> code:
> 1. predictorslide2 <- readAsciiGrid("sol_spr_lide2.asc")
> ...
> 8. predictorslide2 <-
>      cbind(predictorslide2, readAsciiGrid("umca_pr_lide2.asc"))

It may be possible to get round cbind() copying, if that turns out to be
the problem, by taking the SpatialGrid from the first read, putting it
aside, and storing the single columns of the data slots of the read data
in a single, pre-allocated data frame. Successively re-assigning to
predictorslide2 like this is not a good idea for large objects - Braun &
Murdoch, in their nice book on R programming, treat it as a worst case
for speed and memory usage.
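Untested, but a minimal sketch of the idea, assuming the
maptools::readAsciiGrid() and sp classes - the file vector and the
object names beyond the two you quoted are made up:

## A sketch only (untested), assuming maptools::readAsciiGrid() and the
## sp classes; the file vector and object names are hypothetical
library(maptools)
library(sp)

files <- c("sol_spr_lide2.asc", "umca_pr_lide2.asc")  # all nine layers here

## read the first layer and put its grid topology aside
first <- readAsciiGrid(files[1])
gt <- getGridTopology(first)
n <- prod(gt@cells.dim)               # cells per layer

## pre-allocate the full data frame instead of growing it with cbind()
df <- as.data.frame(matrix(NA_real_, nrow = n, ncol = length(files)))
df[[1]] <- first@data[[1]]
rm(first)
gc()                                  # force collection between reads

for (i in seq_along(files)[-1]) {
  x <- readAsciiGrid(files[i])        # or rgdal::readGDAL(files[i])
  df[[i]] <- x@data[[1]]              # keep only the single data column
  rm(x)
  gc()
}
names(df) <- sub("\\.asc$", "", basename(files))

predictors <- SpatialGridDataFrame(SpatialGrid(gt), data = df)

This way only the current ~10 MB layer plus the target data frame are
alive at any one time, rather than two growing copies inside each
cbind().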
> Also I got a similar error when running regression kriging:
> Error: cannot allocate vector of size 6.7 Mb

Without the command, this isn't informative. Were you using local areas
(a maximum distance or a number of data points)? How large were the
data= and newdata= objects?
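If you were not, restricting the search neighbourhood is the usual way
to keep the memory footprint down. A sketch, guessing that you are using
gstat - the response, the observation points, and the variogram model
here are invented:

## A guess at the setup, assuming gstat::krige(); the response name, the
## observation points, and the variogram model are hypothetical
library(gstat)

vm <- vgm(psill = 1, model = "Sph", range = 5000, nugget = 0.1)

## nmax/maxdist make the prediction local: each grid cell uses at most
## 40 observations within 10 km, so the covariance matrix of the full
## data set is never formed
rk <- krige(response ~ sol_spr_lide2 + umca_pr_lide2,
            locations = obs_points,   # must hold response and predictors
            newdata = predictors,     # the stacked grid from above
            model = vm, nmax = 40, maxdist = 10000)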
Roger

> In addition: There were 12 warnings (use warnings() to see them)
>
> warning messages:
> 1: Reached total allocation of 1535Mb: see help(memory.size)
> ...
> 3: In slot(value, what) <- slot(from, what) :
> ...
> 8: In slot(value, what) <- slot(from, what) :
>    Reached total allocation of 1535Mb: see help(memory.size)
> ...
> 12: In `slot<-`(`*tmp*`, what, value = structure(c(-293438.89990765, ... :
>    Reached total allocation of 1535Mb: see help(memory.size)
>
> I will appreciate your suggestions. I have also considered splitting
> the data, but would like to explore other solutions.
>
> Regards,
>
> Jose Funes

--
Roger Bivand
Economic Geography Section, Department of Economics, Norwegian School of
Economics and Business Administration, Helleveien 30, N-5045 Bergen,
Norway. voice: +47 55 95 93 55; fax: +47 55 95 95 43
e-mail: [EMAIL PROTECTED]

_______________________________________________
R-sig-Geo mailing list
R-sig-Geo@stat.math.ethz.ch
https://stat.ethz.ch/mailman/listinfo/r-sig-geo