Re: multivariate normal with zero covariances?

2001-04-16 Thread Charles Metz

Henry wrote:
 
 > On Fri, 13 Apr 2001 15:50:55 -0500, Charles Metz <[EMAIL PROTECTED]>
 > wrote:
 > > This follows directly
 > > from the fact that uncorrelated *normal* random variables
 > > are independent (which can be proven by examining the form
 > > of the general multivariate normal density function when
 > > its covariance matrix is diagonal).
 > 
 > You may need to be more careful with your language here:
 > "uncorrelated *normal* random variables" need not come from
 > a multivariate normal distribution or be independent, though
 > it is true that if they are jointly normal, then they are
 > independent.

I apologize for my carelessness.  Henry is correct, of course.
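
The standard counterexample behind Henry's caveat is easy to check
numerically.  A minimal sketch, assuming numpy: let X be standard
normal and Y = S*X, where S is an independent random sign.  Then X
and Y are each standard normal and uncorrelated, yet clearly
dependent, because (X, Y) is not bivariate normal.

import numpy as np

# X ~ N(0,1); S = +/-1 with probability 1/2 each; Y = S*X.
# X and Y are each standard normal and uncorrelated, but they are
# not independent: |Y| = |X| always.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)
s = rng.choice([-1.0, 1.0], size=n)
y = s * x

print("corr(X, Y)     ~", np.corrcoef(x, y)[0, 1])        # near 0
print("corr(X^2, Y^2) ~", np.corrcoef(x**2, y**2)[0, 1])  # near 1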

  Charles Metz





testing HWE in case-control studies

2001-04-16 Thread James Cui

I have heard that Marc Lathrop proposed a method for testing
Hardy-Weinberg equilibrium (HWE) in the cases of a case-control
study, given that the controls are under HWE, but I cannot find the
appropriate reference.  Could anyone help with this reference, or
with any other relevant reference you know of?

Thanks,
James.
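
For context while the reference is tracked down: the ordinary
one-sample chi-square test of HWE at a biallelic locus (not the
case-control method being asked about) takes only a few lines.  A
sketch assuming numpy and scipy; the helper name is made up here:

import numpy as np
from scipy.stats import chi2

def hwe_chisq(n_aa, n_ab, n_bb):
    """Chi-square test of HWE from genotype counts at a biallelic locus."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)  # estimated frequency of allele A
    expected = np.array([n * p**2, 2 * n * p * (1 - p), n * (1 - p)**2])
    observed = np.array([n_aa, n_ab, n_bb], dtype=float)
    stat = ((observed - expected) ** 2 / expected).sum()
    # 1 df: 3 genotype cells - 1 constraint - 1 estimated parameter
    return stat, chi2.sf(stat, df=1)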








Simple ? on standardized regression coeff.

2001-04-16 Thread d.u.

Hi everyone. Do standardized regression coefficients (betas) have a
range like a correlation coefficient's? In other words, must they lie
within (-1, +1)? And if so, why? Thanks!
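
A quick numerical check, assuming numpy, suggests the answer: in
simple regression the standardized coefficient equals the correlation
and so lies in (-1, +1), but with correlated predictors a
standardized coefficient can fall outside that range (so-called
suppression).

import numpy as np

# Two highly correlated predictors; the standardized coefficients of
# y on (x1, x2) land well outside (-1, +1).
rng = np.random.default_rng(1)
n = 100_000
x1 = rng.standard_normal(n)
x2 = 0.95 * x1 + np.sqrt(1 - 0.95**2) * rng.standard_normal(n)
y = 2.0 * x1 - 1.5 * x2 + rng.standard_normal(n)

z = lambda a: (a - a.mean()) / a.std()  # standardize a variable
X = np.column_stack([z(x1), z(x2)])
beta, *_ = np.linalg.lstsq(X, z(y), rcond=None)
print(beta)  # roughly [ 1.6, -1.2 ]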







adaptive covariance

2001-04-16 Thread James Lennox

Hi,

Can anybody tell me why the eigenvalues of an adaptively updated
covariance matrix are not simply shrunk relative to those of the
data, but are also spread out?  I can see that the filtering
processes involved perhaps tend to smooth out the non-correlations
between variables and hence reinforce the correlations, but I have no
idea how to prove this.  The update occurs every nk samples, where
nk >= the number of variables.

Mean update:
mx(k+nk) = mu*mx(k) + (1-mu)*sum(x(k+1:k+nk))/nk
Adaptively scaled data window:
xw = x(k+1:k+nk) - mx(k+nk)
Change in mean:
dmx(k+nk) = mx(k+nk)-mx(k)
Adaptive covariance update:
R(k+nk) = mu*( R(k) + dmx(k+nk)'*dmx(k+nk) ) + (1-mu)*xw'*xw/nk
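
A direct numpy transcription of these updates, as a sketch (the
function name and argument layout are mine, not from any standard
library):

import numpy as np

def adaptive_update(mx, R, x_new, mu):
    """One adaptive update over a block of nk new samples.

    mx    : current mean estimate, shape (p,)
    R     : current covariance estimate, shape (p, p)
    x_new : block of nk new observations, shape (nk, p), nk >= p
    mu    : forgetting factor in (0, 1)
    """
    nk = x_new.shape[0]
    mx_new = mu * mx + (1 - mu) * x_new.mean(axis=0)  # mean update
    xw = x_new - mx_new                               # scaled data window
    dmx = mx_new - mx                                 # change in mean
    R_new = mu * (R + np.outer(dmx, dmx)) + (1 - mu) * (xw.T @ xw) / nk
    return mx_new, R_new

Running this on blocks of stationary cross-correlated data and
comparing np.linalg.eigvalsh(R) with the eigenvalues of the sample
covariance is an easy way to reproduce the spreading effect described
below.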

I had thought that the eigenvalues of R(k) could be related to those
of cov(X) (assuming now that X is stationary) by a fairly messy
equation.  But when I try the above on cross-correlated random
variables x, the eigenvalues seem related only on average: the
largest eigenvalue of R(k) is significantly larger than expected, and
the smallest is significantly smaller.

Thanks in advance for any suggestions,
James



--
Mr James Lennox
AWMC, Dept Chemical Engineering
The University of Queensland
St Lucia
QLD 4072
AUSTRALIA

[EMAIL PROTECTED]
Ph: +61 7 33469051
Fax: +61 7 33654726







Re: Student's t vs. z tests

2001-04-16 Thread Eric Bohlman

Mark W. Humphries <[EMAIL PROTECTED]> wrote:
> Hi,

> I am attempting to self-study basic multivariate statistics using Kachigan's
> "Statistical Analysis" (which I find excellent btw).

> Perhaps someone would be kind enough to clarify a point for me:

> If I understand correctly, the t test, since it takes degrees of
> freedom into account, is applicable whatever the sample size might
> be, and has no drawbacks that I could find compared with the z test.
> Have I misunderstood something?

You're running into a historical artifact: in pre-computer days, using the 
normal distribution rather than the t distribution reduced the size of the 
tables you had to work with.  Nowadays, a computer can compute a t 
probability just as easily as a z probability, so unless you're in the 
rare situation Karl mentioned, there's no reason not to use a t test.
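
For instance, a quick scipy check shows how fast t tail areas
approach normal ones as the degrees of freedom grow:

from scipy.stats import norm, t

# Upper tail area beyond 1.96 under t(df) versus the normal.
for df in (5, 30, 100, 1000):
    print(df, t.sf(1.96, df), norm.sf(1.96))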






Re: Correlation of complex data

2001-04-16 Thread Peter J. Wahle

Thank you very much for the response, but I'm not sure of the
details.  This is not my area, and I have not had much luck finding
the information at the library.

r = [n*sum(xi*yi) - sum(xi)*sum(yi)] / sqrt{[n*sum(xi^2) -
sum(xi)^2][n*sum(yi^2) - sum(yi)^2]}

Would the conjugates be used on just the square terms, or on all the
product terms?  What is the l_2 inner product?  And how should the
complex result be interpreted--what exactly is its meaning?


| >Can the linear correlation coefficient equation (Pearson's) be used
| >for calculating the correlation coefficient of complex variables by
| >substituting complex math?  If so, the resultant will be a complex
| >variable--what will be the meanings of the components?
|
| It does have meaning, if one is careful to use conjugates
| in the right places.  It is the l_2 inner product, divided
| by the product of the norms.
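
A sketch of the conjugate convention being described, assuming numpy:
put a conjugate on one factor of every product, so the "square" terms
become squared moduli and the denominator stays real.

import numpy as np

def complex_corr(x, y):
    """Pearson-style correlation for complex series: the l_2 inner
    product of the centered data divided by the product of their
    norms.  np.vdot conjugates its first argument."""
    xc = x - x.mean()
    yc = y - y.mean()
    num = np.vdot(yc, xc)                       # sum(conj(yc) * xc)
    den = np.sqrt(np.vdot(xc, xc).real * np.vdot(yc, yc).real)
    return num / den

By Cauchy-Schwarz the modulus of the result is at most 1, and it
measures the strength of the linear relation; the argument reflects a
constant phase rotation between the two series.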








RE: Student's t vs. z tests

2001-04-16 Thread Wuensch, Karl L.

If you knew the population SD (unlikely if you are estimating the
population mean), you would have more power with the z statistic,
which requires the population SD to be known rather than estimated
from the sample, than with t.
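
As a rough illustration of that power difference, assuming scipy (the
effect size and alpha below are made-up numbers), compare a one-sided
one-sample z test with the corresponding t test:

import numpy as np
from scipy.stats import norm, t, nct

d, alpha = 0.5, 0.05          # effect size in SD units; one-sided alpha
for n in (5, 10, 30):
    delta = d * np.sqrt(n)    # noncentrality under the alternative
    power_z = norm.sf(norm.isf(alpha) - delta)            # sigma known
    power_t = nct.sf(t.isf(alpha, n - 1), n - 1, delta)   # sigma estimated
    print(n, round(power_z, 3), round(power_t, 3))

The gap is noticeable at small n and shrinks quickly as n grows.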
 -Original Message-
If I understand correctly, the t test, since it takes degrees of
freedom into account, is applicable whatever the sample size might
be, and has no drawbacks that I could find compared with the z test.
Have I misunderstood something?
 Mark






Re: Correlation of complex data

2001-04-16 Thread Herman Rubin

In article ,
Peter J. Wahle <[EMAIL PROTECTED]> wrote:
>Can the linear correlation coefficient equation (Pearson's) be used for
>calculating the correlation coefficient of complex variables by substituting
>complex math?  If so, the resultant will be a complex variable--what will be
>the meanings of the components?

It does have meaning, if one is careful to use conjugates
in the right places.  It is the l_2 inner product, divided
by the product of the norms.


-- 
This address is for information only.  I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
[EMAIL PROTECTED] Phone: (765)494-6054   FAX: (765)494-0558





Student's t vs. z tests

2001-04-16 Thread Mark W. Humphries

Hi,

I am attempting to self-study basic multivariate statistics using Kachigan's
"Statistical Analysis" (which I find excellent btw).

Perhaps someone would be kind enough to clarify a point for me:

If I understand correctly, the t test, since it takes degrees of
freedom into account, is applicable whatever the sample size might
be, and has no drawbacks that I could find compared with the z test.
Have I misunderstood something?

Thanks in advance for your help.

Cheers,
 Mark



=
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
  http://jse.stat.ncsu.edu/
=