In article <[EMAIL PROTECTED]>,
Bob Hayden <[EMAIL PROTECTED]> wrote:



>Least squares methods are in some sense optimal when the "errors"
>estimated by the residuals are normally distributed.  They are
>questionable when the errors are multimodal, strongly skewed, or
>afflicted with outliers.

Least squares is not optimal without such conditions, but it remains
valid under much weaker assumptions; the Gauss-Markov theorem
does not care whether the errors are multimodal or strongly skewed.
In such cases, it is the so-called robust procedures, such as least
absolute value, that are likely to be invalid.  If the model is
linear in the parameters, the dependent variable is linear in the
"independent variables" (which need not be functionally independent),
and the errors are uncorrelated with the independent variables, then
least squares is valid; with more assumptions, one might do better.
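The Gauss-Markov point is easy to see in a small simulation.  The
sketch below is mine, not from the post, and the numbers are arbitrary:
it fits a straight line by least squares when the errors are strongly
skewed but uncorrelated with the regressor, and the estimates still
center on the true coefficients over repeated samples.

    # Illustration (hypothetical numbers): OLS with skewed, non-normal errors.
    import numpy as np

    rng = np.random.default_rng(0)
    n, reps = 200, 2000
    beta_true = np.array([1.0, 2.0])          # intercept and slope
    x = rng.uniform(0.0, 10.0, size=n)
    X = np.column_stack([np.ones(n), x])      # model linear in the parameters

    estimates = np.empty((reps, 2))
    for r in range(reps):
        # Strongly skewed (exponential) errors, centered so E[error] = 0
        # and independent of x.
        err = rng.exponential(scale=2.0, size=n) - 2.0
        y = X @ beta_true + err
        estimates[r], *_ = np.linalg.lstsq(X, y, rcond=None)

    print("true coefficients: ", beta_true)
    print("mean OLS estimates:", estimates.mean(axis=0))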

As for outliers, the appropriate meaning is that they are
observations which are incorrect, or for which the assumptions
of the model do not hold.  Those should be removed, as should
any other observations of that type.
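A minimal sketch of that practice (the data and flag are hypothetical,
not from the post): observations known from an outside check to be
erroneous are removed before fitting, rather than being downweighted
automatically.

    # Hypothetical example: drop records known to be bad, then fit by OLS.
    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 8.0, 55.0])         # last value: known recording error
    known_bad = np.array([False, False, False, False, True])

    x_clean, y_clean = x[~known_bad], y[~known_bad]  # remove the invalid observation
    X = np.column_stack([np.ones(x_clean.size), x_clean])
    coef, *_ = np.linalg.lstsq(X, y_clean, rcond=None)
    print("intercept, slope:", coef)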
-- 
This address is for information only.  I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette, IN 47907-1399
[EMAIL PROTECTED]         Phone: (765)494-6054   FAX: (765)494-0558

