Hi, I am having some difficulties with the interpretation of particular modes of 
convergence. I was wondering if someone could clarify what convergence in distribution 
and in probability really mean in English (if this is possible). 

 For example, with convergence in distribution I understand that although we say a
sequence of random variables X1, X2, ..., Xn, ... converges in distribution to another
random variable X (with cdf F(x)), it is really the cdfs F1, F2, ..., Fn, ... that
converge to F(x) (at every point where F is continuous). In the case of the arithmetic
mean, for example, the sequence of random variables would consist of X-bar for sample
sizes n = 1, 2, 3, .... Each of these X-bars has a sampling distribution and hence a
cdf (and, when it exists, a pdf). Does convergence in distribution then mean that as n
increases towards infinity, the cdf of X-bar's sampling distribution will more and more
closely approximate the cdf of that other random variable? I know that this is probably
a stupid question whose answer is common sense to many people, but I am having some
difficulty with the concept.
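
 To make my question concrete, here is a small Python sketch I put together (the
Exponential(1) data, the sample sizes, and the numpy/scipy details are just my own
choices for illustration, so please take it as a sketch of what I mean rather than
anything definitive). It standardizes X-bar the way the central limit theorem does
and checks how far its empirical cdf sits from the standard normal cdf as n grows:

    # Simulation sketch: does the cdf of the standardized sample mean
    # sqrt(n)*(X-bar - mu)/sigma approach the standard normal cdf as n grows?
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    mu, sigma = 1.0, 1.0      # mean and sd of an Exponential(1) variable
    reps = 20000              # Monte Carlo replications per sample size

    for n in (5, 30, 200):
        xbar = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)
        z = np.sqrt(n) * (xbar - mu) / sigma      # standardized sample mean
        z.sort()
        ecdf = np.arange(1, reps + 1) / reps      # empirical cdf of z at the sorted values
        gap = np.max(np.abs(ecdf - norm.cdf(z)))  # largest distance to the N(0,1) cdf
        print(f"n = {n:4d}: max |F_n(z) - Phi(z)| is about {gap:.3f}")

 If my reading is right, the printed gap should shrink toward 0 as n grows, and that
shrinking of the cdfs' distance is what I take convergence in distribution to mean.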

  Secondly, I would like to understand the English analogue (if it exists) of
convergence in probability. For example, if X-bar converges in probability to mu as n
approaches infinity, does this mean that although X-bar may not be close to mu for a
particular sample, over all possible samples it will be close to mu?
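
 Again, just to pin down what I am asking, here is a companion sketch in the same
spirit (the same Exponential(1) setup as above, and the tolerance eps = 0.1 is an
arbitrary choice of mine). It estimates the chance that X-bar lands farther than eps
from mu for a few sample sizes:

    # Simulation sketch: does P(|X-bar - mu| > eps) shrink toward 0 as n grows?
    import numpy as np

    rng = np.random.default_rng(1)
    mu, eps, reps = 1.0, 0.1, 10000

    for n in (10, 100, 1000):
        xbar = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)
        p_far = np.mean(np.abs(xbar - mu) > eps)  # fraction of samples far from mu
        print(f"n = {n:4d}: estimated P(|X-bar - mu| > {eps}) is about {p_far:.4f}")

 My understanding is that convergence in probability would mean this estimated
probability goes to 0 for every fixed eps > 0, however small, but I would appreciate
confirmation that this is the right way to read it.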

  Thank you for your help. 

  



