

>> Yes, but given an expectation value and a variance, Maximum Entropy
>> methods
>> say that the only unbiased error distribution is Gaussian.
>
> That is really a different (and, for the purposes here, unrelated) point.

I don't think so.

> You
> are also using a non-traditional definition of 'statistical bias'. The
> Gauss-Markov theorem guarantees that if condition (1) above holds [even if
> (2) and (3) are violated], then the least-squares solution is unbiased.
> Here
> I am using the standard def of 'bias' in statistics:

I'm using the same definition - more or less ;o) Only from a Bayesian
viewpoint, what corresponds to a 'point estimate' is the expectation
value of the posterior distribution - for which you, of course, need the
whole distribution.

If you integrate over a different distribution, you'll often get a
different estimate. If you only _use_ the mean and the variance, you are
in effect committing yourself to a normal distribution.
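That last claim is just the maximum-entropy argument in action: among all
densities with a given mean and variance, the Gaussian has the largest
differential entropy, i.e. it is the least committal choice consistent with
those two numbers. A quick sketch, using only the closed-form entropies
(the unit variance and the particular comparison distributions are my own
choices for illustration):

```python
import math

# Among all densities with fixed mean and variance, the Gaussian
# maximises differential entropy.  Compare closed-form entropies of
# three unit-variance distributions.

sigma2 = 1.0  # common variance for all three

# Gaussian N(0, sigma2): h = 0.5 * ln(2*pi*e*sigma2)
h_gauss = 0.5 * math.log(2 * math.pi * math.e * sigma2)

# Laplace with the same variance: var = 2*b^2  ->  b = sqrt(sigma2/2);
# h = 1 + ln(2*b)
b = math.sqrt(sigma2 / 2)
h_laplace = 1 + math.log(2 * b)

# Uniform with the same variance: var = w^2/12  ->  width w = sqrt(12*sigma2);
# h = ln(w)
w = math.sqrt(12 * sigma2)
h_uniform = math.log(w)

print(f"Gaussian: {h_gauss:.4f}")   # ~1.4189
print(f"Laplace:  {h_laplace:.4f}") # ~1.3466
print(f"Uniform:  {h_uniform:.4f}") # ~1.2425
```

The Gaussian wins, so stating only a mean and a variance and then picking
the least-biased distribution compatible with them lands you on a normal
distribution whether you meant to or not.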

OK. We have to separate two issues. One is the set of assumptions laid out
by a traditional LS derivation; the other is the 'biased view' that
Bayesian theory is the only reasonable scientific inference machine. You
are right that LS does not explicitly make any assumptions about the error
distribution. However, if you analyse standard ad hoc traditional
techniques from the Information Theory and Bayesian point of view (which
can be done with some care), you often reveal the real hidden underlying
assumptions on which you base your inference.
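To be fair to the Gauss-Markov side of the argument: unbiasedness of LS
really does need only a zero-mean error, not normality. A small simulation
sketch (the toy model y = a + b*x and the centred-exponential noise are my
own choices, not anything from the thread):

```python
import random

# Gauss-Markov in miniature: least squares stays unbiased with
# heavily skewed, non-Gaussian errors, as long as E[error] = 0.
# Fit y = a + b*x by ordinary LS and average the slope estimate
# over many replicates.

random.seed(0)
a_true, b_true = 1.0, 2.0
xs = [i / 10 for i in range(20)]
n = len(xs)
x_mean = sum(xs) / n
sxx = sum((x - x_mean) ** 2 for x in xs)

b_hats = []
for _ in range(5000):
    # centred exponential noise: strongly skewed, but zero mean
    ys = [a_true + b_true * x + (random.expovariate(1.0) - 1.0)
          for x in xs]
    y_mean = sum(ys) / n
    sxy = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    b_hats.append(sxy / sxx)

print(sum(b_hats) / len(b_hats))  # hovers around the true slope 2.0
```

The average slope sits on the true value, so the point estimate is fine;
the hidden assumptions only start to bite once you attach error bars or
probabilities to it.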

Mean on average
Richard



