Re: [ai-geostats] Interpolation of climatic data through space and time.

2005-03-04 Thread Mahdi Osman
As an addition to my previous message:


VarioWin, Vesper, etc. are easy-to-use tools for variography. VarioWin offers a
simple interactive variogram modelling interface. Vesper, developed by the
Australian Centre for Precision Agriculture, is very interesting; it is based
on maximum likelihood (ML) iteration methods and is worth trying out. I have
been doing variography using ArcMap (8.3). It was not interactive at all, but
plenty of colours, mate.

Please also check R packages such as sgeostat, geoR, MASS, etc.
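
If you want to see what those packages compute under the hood, here is a
minimal sketch of the classical (Matheron) empirical semivariogram in plain
Python/NumPy; the station_xy and station_means names in the usage comment are
hypothetical placeholders, not objects from any of the packages above:

import numpy as np

def empirical_semivariogram(coords, values, n_bins=12, max_dist=None):
    """Classical (Matheron) semivariogram estimator for 2-D point data."""
    coords = np.asarray(coords, dtype=float)   # shape (n, 2)
    values = np.asarray(values, dtype=float)   # shape (n,)
    # all pairwise separation distances and squared value differences
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    sqdiff = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)     # count each pair once
    dist, sqdiff = dist[iu], sqdiff[iu]
    if max_dist is None:
        max_dist = dist.max() / 2.0            # common rule of thumb
    edges = np.linspace(0.0, max_dist, n_bins + 1)
    lags, gammas = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (dist > lo) & (dist <= hi)
        if m.any():
            lags.append(dist[m].mean())
            gammas.append(0.5 * sqdiff[m].mean())  # semivariance for this lag bin
    return np.array(lags), np.array(gammas)

# hypothetical usage: station easting/northing and the value at each station
# lags, gammas = empirical_semivariogram(station_xy, station_means)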

Cheers

Mahdi

>
 
> Dear all,
> 
> Please bear with me on this. A first submission to the list from a
> perplexed and increasingly stressed research assistant (I'm sure you've
> all been there once upon a time, or here)! My queries relate principally
> to the comparison of interpolation methods.
> 
> I have a dataset of empirically derived values, based on cloud cover, for
> 25 locations across the UK. Each station has a record of between 10 and
> 42 years in length within the period 1952-2000.
> 
> To test which was the most appropriate technique to use for
> interpolation between locations for the mean value for each site, I have
> tested a variety of functions available in ArcGIS Geostatistical Analyst
> (version 8.3) including Inverse Distance Weighting, the five radial
> basis functions (completely regularised spline, spline with tension,
> thin plate spline, multiquadratic, inverse multiquadratic) and ordinary
> kriging (spherical semivariogram, no nugget, search neighbourhood
> equalling the range of the variogram). Because of the limited number of
> locations for which data was available, I have used cross-validation to
> generate RMSE, MAE, MSE and G-measures for each interpolation method. My
> first question is: Is choosing the surface with the lowest resulting error
> measures, be they RMSE, MAE or MSE, necessarily a sound way to select
> the best interpolation method? If so, the Inverse Multiquadratic function
> appears to yield the best surface.
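
A transparent way to double-check the wizard's cross-validation numbers is a
hand-rolled leave-one-out loop. A minimal Python/NumPy sketch follows, using
IDW as a stand-in for whichever interpolator is being scored; the station_xy
and station_means arrays are hypothetical placeholders:

import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0):
    """Simple inverse-distance-weighted prediction at query points."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=-1)
    d = np.where(d == 0, 1e-12, d)             # guard against zero distances
    w = 1.0 / d ** power
    return (w * z_known).sum(axis=1) / w.sum(axis=1)

def loocv_scores(xy, z, predictor):
    """Leave-one-out cross-validation: return (RMSE, MAE) for a predictor."""
    xy, z = np.asarray(xy, float), np.asarray(z, float)
    n = len(z)
    preds = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i               # drop station i, predict it back
        preds[i] = predictor(xy[keep], z[keep], xy[i:i + 1])[0]
    err = preds - z
    return np.sqrt(np.mean(err ** 2)), np.mean(np.abs(err))

# hypothetical usage with 25 station coordinates and their mean values:
# rmse, mae = loocv_scores(station_xy, station_means, idw)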
> 
> Secondly, since data is available on a year-to-year basis, I'd like to
> be able to analyse the variability between years. The problem is that
> data isn't necessarily available for each year for each site! As a
> result the 'best' interpolation method (as measured by RMSE at least)
> varies between years. Nice. By ranking the methods for each year and
> summing the ranks for each interpolation method, it seems that overall
> the Inverse Multiquadratic function marginally outperforms the spline
> with tension. 
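
The rank-sum comparison itself is easy to script; a small sketch, with a
hypothetical errors array of per-year RMSEs (rows = years, columns = methods):

import numpy as np
from scipy.stats import rankdata

# hypothetical RMSEs: rows = years, columns = methods (e.g. IDW, IMQ, tension spline)
errors = np.array([[1.20, 0.95, 1.01],
                   [0.80, 0.82, 0.85],
                   [1.10, 1.00, 1.05]])

ranks = np.apply_along_axis(rankdata, 1, errors)  # rank 1 = lowest RMSE in that year
rank_sums = ranks.sum(axis=0)                     # lower total = better overall
best_method = int(np.argmin(rank_sums))
print(rank_sums, best_method)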
> 
> If you've got this far then thanks for reading, and if anyone can
> suggest any tips on where I might go from here with my analysis (or
> where I need to go back to!) I'd be very happy to hear from you.
> 
> Regards,
> 
> Dave Miller
> 
> ~~~
> Dave Miller
> Research Assistant
> GIS & Remote Sensing
> The Macaulay Institute 
> Craigiebuckler 
> Aberdeen
> AB15 8QH
> 
> tel: +44 (0) 1224 498200 (switchboard) ext. 2261
> fax +44 (0) 1224 311556
> e-mail: [EMAIL PROTECTED]
> websites: http://www.macaulay.ac.uk
> http://www.macaulay.ac.uk/LADSS/ladss.shtml
> 
> 

-- 
---
Mahdi Osman (PhD)
E-mail: [EMAIL PROTECTED]
---


* By using the ai-geostats mailing list you agree to follow its rules 
( see http://www.ai-geostats.org/help_ai-geostats.htm )

* To unsubscribe to ai-geostats, send the following in the subject or in the 
body (plain text format) of an email message to [EMAIL PROTECTED]

Signoff ai-geostats

[ai-geostats] Kriging thermal data...

2005-03-04 Thread Ranjan S. Muttiah

Typically, thermal bands in remote sensors have coarse resolution. I'm
working with ETM+ 60 m thermal data for an urban area and would like to
derive fine-resolution estimates from the coarse data, perhaps by using
spatial continuity information from the coarse scale. This work has
probably been done before, and I would appreciate help/references.

[ai-geostats] Re: question about kriging with skewed distribution

2005-03-04 Thread Isobel Clark
Marek

Although theoretically non-point support has no reason
to be lognormal, in practice it very often is. We have
had good results in estimating areas and volumes,
although we have limited experience with non-point
support of any significance.

You can test the persistency of lognormality by
aggregating your (point?) sample values into larger
units or by simulation. 

If you seek something theoretically sound you could
use a model based on your 'point' samples to simulate
aggregates and investigate the distributions which
result.
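
A rough numerical sketch of that check in Python (purely illustrative
parameters; the points are drawn i.i.d. here, whereas a real check would
simulate spatially correlated values): simulate lognormal point values on a
dense grid, average them into blocks, and see how far the block averages
depart from lognormality.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# hypothetical 'point' samples on a 100 x 100 grid, i.i.d. lognormal for simplicity
points = rng.lognormal(mean=0.0, sigma=1.0, size=(100, 100))

# aggregate into a 10 x 10 grid of block means (each block averages 10 x 10 points)
blocks = points.reshape(10, 10, 10, 10).mean(axis=(1, 3))

# how lognormal do the block averages still look? Shapiro-Wilk test on their logs
w, p = stats.shapiro(np.log(blocks.ravel()))
print(f"Shapiro-Wilk on log(block means): W={w:.3f}, p={p:.3f}")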

Noel Cressie has done quite a bit with aggregated
values. I do not have references to hand but I am sure
a search would turn up some interesting stuff.

Isobel
http://geoecosse.bizland.com/whatsnew.htm




[ai-geostats] question about kriging with skewed distribution

2005-03-04 Thread Ing. Marek Brabec PhD
Hello,
I have a question about what is/should be typically done when kriging is
used for spatial interpolation of a process X(z), where z gives the spatial
location (e.g. z=(x,y) with Cartesian coordinates x, y) and X(z) has a
skewed continuous distribution with nonnegative support, for instance
lognormal.
Now, if all data are in the form of point samples, the X(z)'s can obviously
be transformed by taking logs to Y(z)=log(X(z)), which are exactly (with
lognormal X's) or approximately Gaussian, so that kriging can be done
comfortably (and the result back-transformed, with an easy correction for
the fact that E f(X) is generally not equal to f(E X), based on the formula
for the lognormal expected value or a Taylor expansion).
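
For concreteness, here is a minimal sketch of that back-transform step in
Python, assuming kriging on the logs has already produced a predicted mean m
and kriging variance s2 at each target location. This is only the basic
exp(mu + sigma^2/2) correction alluded to above; full lognormal ordinary
kriging adds a further Lagrange-multiplier term.

import numpy as np

def backtransform_lognormal(m, s2):
    """Back-transform log-scale kriging output to the original scale.

    m  : kriged mean of Y = log(X) at the target locations
    s2 : kriging variance of Y at the target locations
    Uses E[X] = exp(mu + sigma^2 / 2) for lognormal X.
    """
    m, s2 = np.asarray(m, float), np.asarray(s2, float)
    return np.exp(m + 0.5 * s2)

# a naive exp(m) alone would be biased low; e.g. with m = 1.0, s2 = 0.5:
# exp(1.0) ~ 2.72 versus exp(1.0 + 0.25) ~ 3.49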
If at least some data are not point samples but instead correspond to
regional averages, then a problem occurs due to the facts that: i) a sum of
lognormals is not lognormal, and ii) the log of the sum (or average) of
lognormals is not normal.
Obviously, one can:
i) do the kriging on logs anyway with some hand-waving (effectively
replacing sums by products, based on the delta method);
ii) work (quite inefficiently) with the original data without the log
transformation and argue that at least method-of-moments estimators are
invoked (with proper weighting);
iii) use some kind of computationally intensive Monte Carlo approach to
compute the likelihood (or posterior) based on sums of lognormals.
At this point, I am not interested in any of the three. My question is
whether people have used some other parametric family (it cannot be
lognormal) of marginal distributions with positive support and positive
skew that is closed under convolution (or, more generally, under taking
weighted averages), so that the regional averages and the point values
have distributions of the same type, differing only in parameters (just as
in the normal, real-support case). One possibility would be the gamma
family; what about others?
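
As a quick numerical check of that gamma closure, the following Python sketch
(hypothetical shape, scale and block size) verifies that the average of n
i.i.d. Gamma(k, theta) variables is again gamma, namely Gamma(n*k, theta/n);
note the closure relies on a common scale parameter, so general weighted
averages are more delicate.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
k, theta, n = 2.0, 3.0, 8          # hypothetical shape, scale, block size

# 50,000 block averages, each the mean of n i.i.d. Gamma(k, theta) values
means = rng.gamma(shape=k, scale=theta, size=(50_000, n)).mean(axis=1)

# theory: the average follows Gamma(shape=n*k, scale=theta/n)
d, p = stats.kstest(means, stats.gamma(a=n * k, scale=theta / n).cdf)
print(f"KS distance={d:.4f}, p-value={p:.3f}")   # large p => consistent with theory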
Thanks in advance for any suggestions.
Best Regards
Ing. Marek Brabec, PhD

