If someone is facing problems analyzing a large data set in gstat on Windows
due to memory limits, they can try Ubuntu with "R-gedit". It is
very easy to install Ubuntu on a Windows machine (as a dual-boot
system, not as a guest OS!). After installing R with all necessary
packages, the user [...]
From: Robert Hijmans [mailto:r.hijm...@gmail.com]
Sent: Thursday, January 29, 2009 7:58 PM
To: Herr, Alexander Herr - Herry (CSE, Gungahlin)
Cc: r-sig-geo@stat.math.ethz.ch
Subject: Re: [R-sig-Geo] memory issue on R with Linux 64
Herry,
This is how you can do it in package {raster} (revision 209; a build
should be available within 24 hours). [...]
From: Edzer Pebesma [mailto:edzer.pebe...@uni-muenster.de]
Sent: Thursday, January 29, 2009 6:01 PM
To: Herr, Alexander Herr - Herry (CSE, Gungahlin)
Cc: r-sig-geo@stat.math.ethz.ch
Subject: Re: [R-sig-Geo] memory issue on R with Linux 64
Well, this doesn't come as a surprise; if it did for you then you didn't read
the list archives well. [...]
On Thu, 29 Jan 2009, Robert Hijmans wrote:

Herry,

This is how you can do it in package {raster} (revision 209; a build
should be available within 24 hours).

Following Edzer's example:

require(raster)
library(maptools)
# read a SpatialPolygonsDataFrame
nc <- readShapePoly(system.file("shapes/sids.shp",
package="maptools")[1], proj4 [...]
On Thu, 29 Jan 2009, alexander.h...@csiro.au wrote:
Hi List,
I get an error using readGDAL{rgdal}: cannot allocate vector of size 3.1 Gb

This is a tile of your 73K by 80K raster, right? One possibility is to use
smaller tiles, another to get more memory (as Edzer wrote), a third to use
lo [...]
Well, this doesn't come as a surprise; if it did for you then you didn't
read the list archives well.

R has been designed for analysing statistical data, which usually
doesn't run into billions of observations, and not for
analysis/processing of large grids/imagery.

rgdal has infrastructure [...]
Hi List,

I get an error using readGDAL{rgdal}: cannot allocate vector of size 3.1 Gb

I am using 64-bit Linux (openSUSE 11) with 4 GB swap, 4 GB RAM, and R 2.8.0.
The load monitor shows that most of the RAM is used up, and then, when swap use
starts increasing, R returns the error.

Is there anything [...]
On Mon, 3 Nov 2008, Mikael Carlsson wrote:

Hi,

First, thanks for all help so far... but I still have some problems.
Here is the code:

cumark <- read.table("k:/test/helamarkcu.txt", header=T, sep = ";")

One copy of 3*40K*8 = 1MB

memory.limit(size = 1897)
attach(cumark)
vew <- as.vector(ew)
vns <- as.vector(ns)
vcu <- as.vector(cu)
fcumark <- data.frame(vew, vns, vcu)
coord [...]
The combination of some data and an aching desire for an answer does not
ensure that a reasonable answer can be extracted from a given body of
data.
~ John Tukey
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On behalf of Mikael Carlsson
Sent: Monday, November 3, 2008 11:54 PM
To: r-sig-geo@stat.math.ethz.ch
Subject: [R-sig-Geo] Memory issue