I don't know if this will exactly apply to your case, but you mention
pixel-by-pixel analysis.  If that is really all you need and there are no
texture/neighborhood analyses going on, then you could simply open the file,
use readBin to grab an appropriately sized chunk, analyze it, use writeBin to
append the result to your output file, and move on to the next chunk.
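
For example, a minimal sketch of that loop (the file names, chunk size, and
the per-pixel function f() are all placeholders, not anything from my actual
scripts):

    ## Minimal sketch: stream a single-band file of 4-byte floats
    ## through a per-pixel function in fixed-size chunks.
    ## "in.flt", "out.flt", and f() are placeholders.
    f <- function(x) x * 2                  # stand-in per-pixel analysis

    con.in  <- file("in.flt",  "rb")
    con.out <- file("out.flt", "wb")
    repeat {
      chunk <- readBin(con.in, double(), size = 4, n = 10000)
      if (length(chunk) == 0) break         # end of file
      writeBin(f(chunk), con.out, size = 4)
    }
    close(con.in)
    close(con.out)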

Right now I'm exporting about a dozen rasters from ArcMap to binary
floating-point files (.flt), reading one row at a time into R with
readBin(filehandle, double(), size=4, n=ncols) for each raster, and looping
through all rows to output a manually constructed classification.  I've done
similar things with ENVI files in the past (which, when not compressed, are
simple, straightforward binary files).  I've only used single-band files, for
simplicity's sake; for multi-band images you would need to manage the
interleaving pattern (e.g. BSQ, BIL).
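
To illustrate the row-at-a-time pattern (the file names, dimensions, and the
classification rule below are made up for the sketch; my real script reads
about a dozen rasters, not two):

    ## Sketch: classify two co-registered single-band .flt rasters
    ## row by row.  "a.flt", "b.flt", ncols/nrows, and the rule
    ## are assumptions for illustration.
    ncols <- 1000; nrows <- 1000
    con.a   <- file("a.flt", "rb")
    con.b   <- file("b.flt", "rb")
    con.out <- file("class.flt", "wb")
    for (row in seq_len(nrows)) {
      a <- readBin(con.a, double(), size = 4, n = ncols)
      b <- readBin(con.b, double(), size = 4, n = ncols)
      cls <- ifelse(a > 0 & b < 100, 1, 0)  # invented rule
      writeBin(as.numeric(cls), con.out, size = 4)
    }
    close(con.a); close(con.b); close(con.out)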

-Eric

------------------------------

Message: 10
Date: Tue, 29 Jul 2008 16:15:34 -0400
From: "Guy Serbin" <[EMAIL PROTECTED]>
Subject: Re: [R-sig-Geo] ENVI data and R
To: r-sig-geo@stat.math.ethz.ch 

Thank you all for the help; I successfully read an image into R using
these methods.

I did, however, encounter a problem when loading a hyperspectral image cube
into R: it was unable to allocate the 2.9 GB of memory that it needed.

Is there a way to improve R's memory management so that it only reads in
data when it is actually needed for processing, e.g., reading in only the
bands I need, or conversely reading in spectra on a per-pixel basis?
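
For instance, for an uncompressed BSQ file it seems one could seek() to a
band's byte offset and readBin only that band (a sketch; the file name and
dimensions are made up):

    ## Sketch: pull one band out of an uncompressed BSQ cube of
    ## 4-byte floats without reading the rest of the cube.
    ## "cube.bsq" and the dimensions are assumptions.
    read.band <- function(fname, band, ncols, nrows, size = 4) {
      con <- file(fname, "rb")
      on.exit(close(con))
      ## bands are contiguous in BSQ, so skip (band - 1) full bands;
      ## as.numeric() avoids integer overflow on large cubes
      seek(con, where = (band - 1) * as.numeric(ncols) * nrows * size)
      matrix(readBin(con, double(), size = size, n = ncols * nrows),
             nrow = nrows, byrow = TRUE)
    }
    b5 <- read.band("cube.bsq", band = 5, ncols = 614, nrows = 512)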

Guy



---
Eric B. Peterson, Ph.D.
Vegetation Ecologist / Data Manager
California Native Plant Society
(916) 322-2926 (desk)
(775) 750-4628 (cell)
