In article <9v89fi$7e7$[EMAIL PROTECTED]>, Michael London <[EMAIL PROTECTED]> wrote:

>An Introduction to Mathematical Statistics by Bain and Engelhart deals with
>this topic
>ML
>
>"Clay S. Turner" <[EMAIL PROTECTED]> wrote in message
>news:[EMAIL PROTECTED]...
>> You have probably thought of this, but the age old standard is the Chi
>> Square test.

For testing goodness of fit to a distribution, the chi-squared test has very low power. If there are parameters to be estimated, the asymptotic distribution is not chi-squared, but it can be calculated numerically fairly easily by using the moment generating function as an analytic function.

>> One thing about empirical distributions is that they may not be one of
>> the standard forms. This is why the Jackknife method and then later the
>> Bootstrapping methods were developed. Thus you can extract the
>> distribution for your data set.

Tests like the chi-squared test, the Kolmogorov-Smirnov test, and the Cramer-von Mises test can test fit to a parametric family, or test whether two samples come from the same distribution. If the forms are not "standard", one can compare the observed deviation with the deviations obtained from samples drawn from the assumed distribution; this is far better than anything like the jackknife or the bootstrap, and it does not require asymptotics. There is no way that anyone can extract the analytic form of a distribution from a finite sample.

--
This address is for information only. I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN 47907-1399
[EMAIL PROTECTED]  Phone: (765) 494-6054  FAX: (765) 494-0558

=================================================================
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
http://jse.stat.ncsu.edu/
=================================================================
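[A follow-up illustration, not part of the original exchange. The comparison Rubin describes -- measure the deviation of the data from the fitted distribution, then calibrate that deviation against samples drawn from the assumed distribution itself, re-estimating the parameters each time -- can be sketched in Python. The choice of a normal family, the sample sizes, and all function names here are assumptions for the example, not anything from the post.]

```python
# Sketch: Monte Carlo calibration of a Kolmogorov-Smirnov-type deviation
# when the parameters of the assumed family are estimated from the data.
# Here the assumed family is normal; this is an illustrative assumption.
from math import erf, sqrt
import numpy as np

rng = np.random.default_rng(0)

def ks_stat_normal(x):
    """KS distance between the empirical CDF of x and the normal CDF
    whose mean and std are estimated from x itself."""
    n = len(x)
    mu, sigma = x.mean(), x.std(ddof=1)
    z = np.sort((x - mu) / sigma)
    # standard normal CDF via the error function
    cdf = np.array([0.5 * (1.0 + erf(v / sqrt(2.0))) for v in z])
    ecdf_hi = np.arange(1, n + 1) / n
    ecdf_lo = np.arange(0, n) / n
    return max(np.max(ecdf_hi - cdf), np.max(cdf - ecdf_lo))

def mc_pvalue(x, n_sim=500):
    """Compare the observed deviation with deviations from samples drawn
    from the fitted distribution, re-estimating parameters each draw.
    Returns the observed statistic and a Monte Carlo p-value; no
    asymptotic (chi-squared or Lilliefors) approximation is used."""
    d_obs = ks_stat_normal(x)
    mu, sigma = x.mean(), x.std(ddof=1)
    d_sim = np.array([ks_stat_normal(rng.normal(mu, sigma, len(x)))
                      for _ in range(n_sim)])
    return d_obs, float((d_sim >= d_obs).mean())

x = rng.normal(10.0, 2.0, 100)   # data that really is normal
d, p = mc_pvalue(x)
```

Because the simulated samples come from the fitted member of the family, with parameters re-estimated on each draw, the reference distribution of the statistic accounts for the estimation step, which is exactly the point at which the naive chi-squared calibration fails.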