Hi list,
I've been looking around trying to solve this problem, but unfortunately I
haven't found a way to do it.
I have a database of sampling points, each with an associated sampling date,
and would like to plot the density of sampling points on a monthly basis
for one year using spplot. However,
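For what it's worth, here is a minimal sketch of one way to build a monthly density grid and hand it to spplot, assuming a data frame 'pts' with coordinate columns 'x' and 'y' and a 'date' column; those names and the cell size are placeholders, not from the original post:

library(sp)

pts$month <- format(as.Date(pts$date), "%Y-%m")    # month label per point
cell <- 1000                                       # cell size in map units (placeholder)
pts$cx <- floor(pts$x / cell) * cell + cell / 2    # cell-centre coordinates
pts$cy <- floor(pts$y / cell) * cell + cell / 2

# count points per cell and month, then reshape to one column per month
cnt <- as.data.frame(table(cx = pts$cx, cy = pts$cy, month = pts$month))
cnt <- cnt[cnt$Freq > 0, ]
cnt$cx <- as.numeric(as.character(cnt$cx))
cnt$cy <- as.numeric(as.character(cnt$cy))
wide <- reshape(cnt, idvar = c("cx", "cy"), timevar = "month",
                direction = "wide")
wide[is.na(wide)] <- 0              # months with no points in an occupied cell

coordinates(wide) <- ~cx + cy
gridded(wide) <- TRUE               # SpatialPixelsDataFrame (empty cells stay NA)
spplot(wide)                        # one panel per month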
Hi List,
I am trying to accumulate gridcells up to a specific value. This means I
need to extend the nearest neighbours (or search radius) to a size such
that the sum of the gridcell values reaches the desired value (or stop if
the sum of gridcells / the radius goes beyond a specific size). It should
start
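Here is a minimal sketch of the expanding-radius idea on an ordinary matrix, using a square window; the function name, its arguments and the example grid are placeholders:

accumulate_to <- function(vals, row, col, target, max_radius) {
  # vals: matrix of gridcell values; (row, col): the starting cell
  for (r in 0:max_radius) {
    rows <- max(1, row - r):min(nrow(vals), row + r)
    cols <- max(1, col - r):min(ncol(vals), col + r)
    s <- sum(vals[rows, cols], na.rm = TRUE)
    if (s >= target)
      return(list(radius = r, sum = s))
  }
  list(radius = max_radius, sum = s)   # target not reached within max_radius
}

# example on a random 50 x 50 grid
m <- matrix(runif(2500), 50, 50)
accumulate_to(m, row = 25, col = 25, target = 10, max_radius = 10)

The loop starts at the focal cell itself (r = 0) and grows the window one cell at a time until the accumulated sum reaches the target or the radius limit is hit.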
I don't know if this will exactly apply to your case, but you mention pixel by
pixel analysis. If that is really all you need and there are no
texture/neighborhood analyses going on, then you could simply open the file and
use readBin to grab an appropriately sized chunk, analyze it, writeBin t
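Something like the following chunked readBin/writeBin loop, assuming a flat binary file of 4-byte floats; the file names, chunk size and the sqrt() stand-in for the real per-pixel analysis are placeholders:

infile  <- "image.dat"
outfile <- "result.dat"
chunk   <- 1e6                              # pixels per chunk

con_in  <- file(infile,  "rb")
con_out <- file(outfile, "wb")
repeat {
  x <- readBin(con_in, what = "numeric", n = chunk, size = 4)
  if (length(x) == 0) break                 # end of file
  y <- sqrt(x)                              # per-pixel analysis goes here
  writeBin(y, con_out, size = 4)
}
close(con_in); close(con_out)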
Dear All,
I have a time series of SPOT VGT data over Africa. Does anyone know of a
simple algorithm for cloud masking and cloud removal ("declouding") in
ENVI 4.2 or ERDAS 9.1, with easy steps?
Waiting to hear
Best Wishes
Suraj
Zimbabwe
Thank you for the information - I'll try that approach out. The new
version of ENVI (4.5) is optimized to work with ArcGIS, so there may
be a way to pass spectra from ENVI to ArcGIS and then on to R if I
can't set up a direct ENVI/IDL to R connection.
I agree with you that R isn't for image proces
On Wed, 30 Jul 2008, Morten Sickel wrote:
I have a lot of measurements located using GPS and projected to UTM. I would
prefer to grid those data, i.e. define a grid with e.g. a 200 m cell size and
calculate the average for each cell. Each cell may contain zero, one or many
measurements. Another related thing I would like to do is to count for
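A minimal sketch of the gridding/averaging step with base R plus sp, assuming UTM easting/northing columns 'x' and 'y' and a measured value 'z' in a data frame 'dat' (all placeholder names):

cell <- 200
dat$cx <- floor(dat$x / cell) * cell + cell / 2   # cell-centre coordinates
dat$cy <- floor(dat$y / cell) * cell + cell / 2

# mean and count of measurements per cell (empty cells simply don't appear)
avg <- aggregate(dat["z"], by = list(cx = dat$cx, cy = dat$cy), FUN = mean)
n   <- aggregate(dat["z"], by = list(cx = dat$cx, cy = dat$cy), FUN = length)

library(sp)
coordinates(avg) <- ~cx + cy
gridded(avg) <- TRUE          # gridded object with holes for empty cells
spplot(avg, "z")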
On Tue, 29 Jul 2008, Guy Serbin wrote:
My machine currently has 4 GB on it, but a lot of that's getting eaten
by video memory and the other programs I have in memory. Also, some
of my image cubes are 12 GB in size, so I'd need to find a workaround
anyway. However, since what my colleagues and
R is not intended for the analysis of large hyperspectral images.
If you are interested in the spectra of selected pixels, just
import the ENVI spectra; I recall they are stored as if
they were ENVI images: a raw file plus a text header file in which
you have all the info you need to import into R provid
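A minimal sketch of reading such a raw-plus-header ENVI file directly with readBin, assuming BSQ interleave and 32-bit floats (data type = 4); the file names are placeholders, and other data types or interleaves need 'what', 'size' and the indexing adjusted:

hdr <- readLines("image.hdr")
getval <- function(key)       # pull "key = value" numbers out of the header
  as.numeric(sub(".*=\\s*", "", grep(paste0("^", key), hdr, value = TRUE)[1]))
ns <- getval("samples"); nl <- getval("lines"); nb <- getval("bands")

con <- file("image.dat", "rb")
img <- readBin(con, what = "numeric", n = ns * nl * nb, size = 4)
# add endian = "big" if the header says byte order = 1 on a little-endian machine
close(con)
img <- array(img, dim = c(ns, nl, nb))   # BSQ: sample varies fastest, then line, then band

# spectrum of the pixel at sample (column) i, line (row) j across all bands
i <- 10; j <- 20
plot(img[i, j, ], type = "l")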