In sci.stat.consult Chia C Chong <[EMAIL PROTECTED]> wrote:
: down!!) fit very accurately to the data. The only bit that does not fit is
: the height: the estimated gamma PDF is not high enough. Does this mean that
Are the areas the same?
===
On Fri, 11 Jan 2002, Dennis Roberts wrote:
>
> finally ... i think we are making a mountain out of a molehill in this ...
> to me ... the most important "fact" from the video was that (regardless of
> change and how you define it) ... whites approved of the president to a FAR
> greater extent
On Fri, 11 Jan 2002, Dennis Roberts wrote:
> if the polls used similar ns in the samples ... i disagree
>
> now, if the white sample was say 600 and the black sample was 100 ... i
> MIGHT be more likely to agree with the comment below
consider: whites go from 10% to 15%, which is up 50% in relative terms but
only 5 percentage points
EugeneGall <[EMAIL PROTECTED]> wrote:
: The Gallup organization posted a video to explain why the increase in
: blacks' job approval for Bush is 'proportionate' to the increase among whites.
It makes no sense to talk of "proportionate" increases in percentages.
Suppose you start at zero or 9
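The relative-versus-percentage-point distinction the thread keeps circling can be made concrete. A minimal sketch with hypothetical approval figures (not the actual poll numbers):

```python
# Hypothetical approval figures, not the actual poll numbers.
old_a, new_a = 10, 15      # group A: 10% -> 15%
old_b, new_b = 60, 90      # group B: 60% -> 90%

points_a = new_a - old_a                # 5 percentage points
relative_a = (new_a - old_a) / old_a    # 0.5, i.e. "up 50%"

points_b = new_b - old_b                # 30 percentage points
relative_b = (new_b - old_b) / old_b    # also "up 50%"
```

Both groups are "up 50%" in relative terms while moving very different numbers of points, and a group starting at 0% has no defined relative increase at all, which is the point about starting at zero.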
In sci.stat.consult janne <[EMAIL PROTECTED]> wrote:
: I have a correlation formula I don't get to work. And we must use this
: formula on the test. Let me give you an example: Let's say X and Y
If you don't know what x̄ (x with a line above) MEANS, you need to STUDY your
text. Also ask your instructor.
Bruce Weaver <[EMAIL PROTECTED]> wrote:
: Paul's post reminded me of something I read in Keppel's Design and
: Analysis. Here's an excerpt from my notes on ANCOVA:
: the analysis of covariance is more precise with correlations greater than
: .6. Since we rarely obtain correlations of this la
Paul R. Swank <[EMAIL PROTECTED]> wrote:
: Some years ago I did a simulation on the pretest-posttest control group
: design looking at three methods of analysis, ANCOVA, repeated measures
: ANOVA, and treatment by block factorial ANOVA (blocking on the pretest using
: a median split).
I found tha
Morelli Paolo <[EMAIL PROTECTED]> wrote:
: HI all,
: I have to analyse some clinical data. In particular the analysis is a
: comparison between two groups of the mean change baseline to endpoint of a
: score.
i hope that your study is randomized; if not, it's not worth worrying
about. If so his
In sci.stat.consult Doug Federman <[EMAIL PROTECTED]> wrote:
: Unfortunately, the preceptors rarely agree on the grades. One preceptor
: is biased towards the middle of the 1-9 Likert scale and another may be
: biased towards the upper end. Rarely does a given preceptor use the 1-9
: range c
Donald Burrill <[EMAIL PROTECTED]> wrote:
: "The story is about six students who ... The instructor ... tells them
: "The one question was, "Which tire?" I remember that the likelihood of
: all four picking the same tire was quite small, but I forgot how to
: calculate it explicitly."
: Ass
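The quoted probability is recoverable: assuming each of the four students independently names one of the four tires uniformly at random, the other three must match whichever tire the first one names, so the chance is (1/4)^3 = 1/64. A quick check:

```python
from fractions import Fraction

# Chance that students 2, 3, and 4 all name the same tire as student 1,
# each picking one of four tires uniformly at random.
p_all_same = Fraction(1, 4) ** 3
```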
Peter Løvgreen <[EMAIL PROTECTED]> wrote:
: important, when comparing the observed distribution to the expected
: distribution.
the expected values are obtained from the marginals. There are a+b-1
independent marginals and ab table entries, so there are (a-1)(b-1) numbers
free to vary (the differences).
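The counting argument can be checked numerically: the expected counts are the outer product of the marginals divided by n, and the degrees of freedom are (a-1)(b-1). A sketch with a made-up 2x2 table:

```python
import numpy as np

obs = np.array([[14, 5],
                [11, 30]])              # made-up 2x2 table
row = obs.sum(axis=1)                   # row marginals
col = obs.sum(axis=0)                   # column marginals
n = obs.sum()

expected = np.outer(row, col) / n       # expected counts under independence
a, b = obs.shape
df = (a - 1) * (b - 1)                  # numbers free to vary
```

The expected table reproduces the observed marginals exactly, which is why only (a-1)(b-1) cells are free.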
Sima <[EMAIL PROTECTED]> wrote:
: Dear List Members,
: I have missed some lectures on statistics due to heavy illness
: and now i got an assignment which i cannot solve.
We all feel sorry for you Sima, but perhaps you should talk to your
instructor about it. He undoubtedly has office hours.
Dennis Roberts <[EMAIL PROTECTED]> wrote:
: given a simple effect size calculation ... some mean difference compared to
: that is ... can we not get both NS or sig results ... when calculated
: effect sizes are small, medium, or large?
: if that is true ... then what benefit is there to look a
Ivan Balducci <[EMAIL PROTECTED]> wrote:
: Magnetic resonance versusFacial Pain Total
:Yes No
: Yes 14 5
: No 11 3041
In sci.stat.consult ASGHAR AKBARI <[EMAIL PROTECTED]> wrote:
: I need to do weighted Gram-Schmidt orthogonalization for my work in
: credibility.
In weighted least squares with diagonal weight matrix D you orthogonalize
D^{1/2}X, which probably amounts to what you are doing
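A sketch of that equivalence, assuming weights w on the diagonal of D: orthogonalizing D^{1/2}X with a QR decomposition reproduces the weighted least squares solution from the weighted normal equations.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 3
X = rng.standard_normal((n, p))
w = rng.uniform(0.5, 2.0, n)            # diagonal of the weight matrix D
y = X @ np.array([1.0, -2.0, 0.5]) + rng.standard_normal(n)

sw = np.sqrt(w)
Q, R = np.linalg.qr(sw[:, None] * X)    # orthogonalize D^{1/2} X
b_qr = np.linalg.solve(R, Q.T @ (sw * y))

# Same answer from the weighted normal equations X'DXb = X'Dy
b_ne = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
```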
===
In sci.stat.consult Atul <[EMAIL PROTECTED]> wrote:
: I have a doubt regarding adjusted r-square:
: how do we calculate the adjusted r-square when the error degrees of
: freedom are zero?
You don't; you will have perfect prediction even for random numbers.
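A sketch of why: with as many parameters as observations the fit is exact, so R^2 = 1 even for pure noise, and the adjusted formula 1 - (1 - R^2)(n - 1)/df_error divides by zero.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
# Intercept plus n-1 random predictors: zero error degrees of freedom.
X = np.column_stack([np.ones(n), rng.standard_normal((n, n - 1))])
y = rng.standard_normal(n)              # pure noise

b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b
r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
# r2 is 1 up to rounding, despite y being random
```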
=
Rich Ulrich <[EMAIL PROTECTED]> wrote:
: I can believe someone would call it that, but I can believe other
: tests would be called that, too. You state this is, in reference
: to what test (what computer program, or what textbook)?
see my other post
: MANOVA is a canonical correlation problem.
re previous discussion
My old computer program MANOVA has a built-in test of parallelism in
multivariate ANCOVA. It's really standard multivariate regression theory,
although it isn't widely known. (T.W. Anderson gave MANOVA and CanR as two
different eigenproblems.)
They are easily shown to be equivalent.
While cleaning my office I found a 1973 paper by Golub and Styan which
says
"the matrix X'X is greatly influenced by roundoff errors and is often
ill-conditioned ... An excellent way of solving (the LS equations) is
through an orthogonal triangular decomposition of X."
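The point of the quote can be illustrated: cond(X'X) is roughly cond(X) squared, so forming the normal equations squares the damage, while QR works on X directly. A sketch with a nearly collinear design:

```python
import numpy as np

t = np.linspace(0, 1, 50)
X = np.vander(t, 8)                     # nearly collinear polynomial columns
cond_X = np.linalg.cond(X)
cond_XtX = np.linalg.cond(X.T @ X)      # roughly cond_X ** 2

rng = np.random.default_rng(2)
y = rng.standard_normal(50)

Q, R = np.linalg.qr(X)                  # orthogonal-triangular decomposition
b = np.linalg.solve(R, Q.T @ y)         # LS solution without forming X'X
```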
At a training session, (w
Gardburyb <[EMAIL PROTECTED]> wrote:
: Hi all,
: I'm new to the group. I'm doing my dissertation, and I am doing a canonical
: correlation analysis. My question is, what is the best way to compare canonical
The test of parallelism in mancova is an equivalent test
=
In sci.stat.consult Gordon D. Pusch <[EMAIL PROTECTED]> wrote:
: Don't do it that way either --- it's notoriously ill-conditioned.
: It's better and more numerically stable to use the singular-value
: decomposition of 'A' to solve this problem.
It's NOT ill-conditioned unless the X'X matrix is
In sci.stat.consult Michael Stembera <[EMAIL PROTECTED]> wrote:
: the least squares solution is
: AT * A * W = AT * Y (T means Transpose here)
: W = (AT * A)^-1 * AT * Y
yes, but don't do it that way; solve the simultaneous equations
A'Aw = A'y
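In code that advice is the difference between a linear solve and an explicit inverse; a minimal sketch assuming a well-conditioned A:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((30, 4))
y = rng.standard_normal(30)

# Solve the simultaneous equations A'Aw = A'y ...
w = np.linalg.solve(A.T @ A, A.T @ y)

# ... rather than computing the matrix inverse explicitly
w_inv = np.linalg.inv(A.T @ A) @ (A.T @ y)
```

Both give the same answer here, but the solve is cheaper and numerically better behaved.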
In sci.stat.consult Greg Heath <[EMAIL PROTECTED]> wrote:
: On Sat, 4 Aug 2001, Michael Stembera wrote:
: Augmenting Xi with xi(n+1) = 1 and w with
: w(n+1), yields the linear matrix equation
when you do this it looks like least squares to me,
minimizing
(y-Xb)'(y-Xb) where b are your unknowns
in general, for numbers in the range a to b:
subtract a and divide by (b-a)
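That is min-max rescaling onto [0, 1]; as a function:

```python
def rescale(x, a, b):
    """Map x from the range [a, b] onto [0, 1]: subtract a, divide by (b - a)."""
    return (x - a) / (b - a)
```

For example, `rescale(5, 0, 10)` returns 0.5.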
=
Instructions for joining and leaving this list and remarks about
the problem of INAPPROPRIATE MESSAGES are available at
http://jse.stat.ncsu.edu/
Alexander Tsyplakov <[EMAIL PROTECTED]> wrote:
: No, regression models can have stochastic and/or discrete
: regressors. I can only agree that regression models have
There are no constraints whatsoever on the x variables for the
significance tests and estimates to be valid. Power is another matter.
Dale Berger <[EMAIL PROTECTED]> wrote:
: Dear Colleagues,
: A student is evaluating a summer program for junior high students. One of
: the goals was to raise 'self esteem.' Measures were taken before the
there is no good answer. You might look at "Problems in Measuring
Change", edited by Chester W. Harris.
Werner Wittmann <[EMAIL PROTECTED]> wrote:
: inverting the
: correlation matrix to get the effects was too complicated to compute by
: hand, so Sir Ronald developed the ANOVA shortcut.
hardly. They do have some mathematics in common (through use of dummy
variables which some of us think is for d
J. Williams <[EMAIL PROTECTED]> wrote:
Would this not
: be the same as the offspring of either the very tall or the very short
: among us moving toward an arithmetic average? Is it inconceivable
: that a pair of dullards could produce a Beethoven or a Fermi for
: example? Frankly, I believe old
On Wed, 17 Jan 2001, Bob Wheeler wrote:
> I've heard this before -- probably read it in stat
> books. It isn't true. Galton worried over the
> problem until he understood the statistical
> mechanism.
you may be right; that's why I said apparently
===
On Wed, 17 Jan 2001, dennis roberts wrote:
> ... if the instructor is going to base my test grade on my percentile rank
> ... i could easily get a lower grade on test 2 than test 1 ...
>
> somehow, i think this is penalizing me for the fact that there is not and
> cannot be a perfect correla
There seems to be some confusion about what regression to the mean
means. No one is penalized (or advantaged) because of regression to the
mean. You ALWAYS have RTM in a population, whether everyone improves or
gets worse. It is a property of standardized scores only for a
population. The simpl
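A simulation makes the "you ALWAYS have RTM" point concrete: with test-retest correlation r, the group selected for extreme first scores averages closer to the mean on the second test, even though nothing was done to anyone. A sketch with made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(4)
n, r = 100_000, 0.7                      # made-up test-retest correlation

t1 = rng.standard_normal(n)              # standardized test 1 scores
t2 = r * t1 + np.sqrt(1 - r**2) * rng.standard_normal(n)  # correlated retest

top = t1 > 1.5                           # select extreme scorers on test 1
mean_t1 = t1[top].mean()
mean_t2 = t2[top].mean()                 # closer to 0: regression to the mean
```

For standardized scores, E[t2 | t1] = r * t1, so the selected group's retest mean shrinks toward zero by the factor r.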
In sci.stat.edu Rich Ulrich <[EMAIL PROTECTED]> wrote:
: If it is some other data... When you have multiple replications,
: sometimes you don't want the *mean* -- for "best single performance"
: you might select maximum or minimum. Or you might consider a trimmed
or in a situation where you ex
In sci.stat.edu Jim Kroger <[EMAIL PROTECTED]> wrote:
: Hello, I've received some expert help here on a couple previous occasions
: (thanks). I have an issue bothering me, which I'd like to present to you.
: I'm doing a two-way, 2X2 ANOVA. Suppose I have 20 subjects, and each has
: 25 observation
In sci.stat.consult Clark Dickin <[EMAIL PROTECTED]> wrote:
: I have a significant main effect for both of my DV's and also
: have a significant Interaction among the DV's but I am unable to
: determine where the interaction comes from. More specifically, is there
the multivariate test assures y
Jeff E. Houlahan <[EMAIL PROTECTED]> wrote:
: Is it ever appropriate to do a 2-factor unreplicated ANOVA with
: empty cells if you aren't sure there is no interaction between the
you can test the part of the interaction that is testable, but of course
you can never know about the rest.
G?khan <[EMAIL PROTECTED]> wrote:
: Hi!
: I wonder how the public is evaluating the normal distribution function
I presume that you want the density of a multivariate normal distribution.
You don't calculate the inverse; you just need the quadratic form. I think
that Searle's matrix algebra book gives
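A sketch of that advice (my own helper, not from the thread): get the quadratic form (x-mu)' Sigma^{-1} (x-mu) from a Cholesky factor and a triangular solve, never forming the inverse.

```python
import numpy as np

def mvn_logpdf(x, mu, sigma):
    """Log density of a multivariate normal, without inverting sigma."""
    d = x - mu
    L = np.linalg.cholesky(sigma)        # sigma = L L'
    z = np.linalg.solve(L, d)            # so z @ z is the quadratic form
    quad = z @ z
    logdet = 2.0 * np.log(np.diag(L)).sum()
    k = len(mu)
    return -0.5 * (k * np.log(2.0 * np.pi) + logdet + quad)
```

The Cholesky factor also gives the log determinant for free, so nothing beyond the triangular solve is needed.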
on the other hand, it is not desirable at all; it is dumb. You would have
no evidence as to the linearity of the regression function
=
In sci.stat.consult Dr. Hans-Christian Waldmann <[EMAIL PROTECTED]>
wrote:
: Hello everybody,
you haven't really given enough information, but here is a suggestion. You
have three separate groups. If they are not the treatment groups with
random assignment, anything else you do will be VERY dubious
Wuensch, Karl L. <[EMAIL PROTECTED]> wrote:
: I'm sure many of you have been asked a question like that posed today
: by one of my students, and I would be interested in hearing how you respond
Sounds good to me, but I think they may not be giving you enough to do
over there at ECU.
==
EAKIN MARK E <[EMAIL PROTECTED]> wrote:
: Does anyone have a reference for regression model-building with sparse
: data?
ME: Don't do it.
=
Simon, Steve, PhD <[EMAIL PROTECTED]> wrote:
: The instructor claims that the part correlation is usually better (more
: interpretable?) but that SPSS and other software will not compute such a
: correlation.
: Does any of this make sense? Why would you ever want to use a part
: correlation?
She
Marco Antonio Chamon <[EMAIL PROTECTED]> wrote:
: Hi,
: Does anyone have a program (in Excel) to perform multiple linear
: regression? If possible, I'd appreciate a copy. Thank you in
: advance.
It's an option in Chart, under linear trendline.
==
Rich Ulrich <[EMAIL PROTECTED]> wrote:
: I think you have been exposed to the prejudices of a few
: Experimentalists in psychology and education, who were
: overly-impressed--for a little while, about 30 or 40 years ago--with
: the miracles of "nonparametric analyses which don't need any
: assumptions"
[EMAIL PROTECTED] wrote:
: I am told that I can solve for these three unknowns (B1, B2 and B3) by
: doing simple linear regression to obtain "residuals"; from the
: residuals come the unknowns. For example, I know that with just two
: unknowns (B1 and B2) in:
why would you want to?
===
In sci.stat.consult sofyan2000 <[EMAIL PROTECTED]> wrote:
: I have conducted a repeated measure mixed two-factor ANOVA on one sample
you shouldn't have
: 1. What statistical ANOVA test can reveal an outlier in my data?
none
: 2. If my test failed the 'homogeneity of variance/ covariance' test,
Grover Proctor <[EMAIL PROTECTED]> wrote:
: In scoring the Semantic Differential, does one treat MISSING data (i.e., a
: scale to which a subject failed to give a response) as "null" or do you
: assign it the "middle value" of your scale (i.e, 4 on a 1-to-7 scale, or 0
: in a -3-to-+3 scale)?
so
Jan de Leeuw <[EMAIL PROTECTED]> wrote:
: I am glad this one is back after a short absence. Likert scales (or
: any other data) "are" not ordinal or interval. Actually, they have no
a good defense is that anova and regression are not very sensitive to
moderate rescaling of a five point scale so l
I'm sure they print it out along with the correlation matrix
In sci.stat.consult haytham siala <[EMAIL PROTECTED]> wrote:
: Can someone please tell me how to calculate the SMC (Squared Multiple
: Correlation) in a factor analysis (SPSS)? I am not sure but could it be the
: diagonal of a factor t
In sci.stat.consult Jason Osborne, Ph.D. <[EMAIL PROTECTED]> wrote:
: I am testing for partial mediation. I need to know whether the
: unstandardized regression coefficient for variable X predicting Y in one
: regression equation is significantly different from the unstandardized
: regression co
ELN/fisackson <[EMAIL PROTECTED]> wrote:
: calculate a correlation coefficient between two such variables, or if the
: old Pearson correlation coefficient does not assume inerrancy in one
: variable and is thus a sound measure. If the Pearson moment is
the pearson correlation is perfectly sound
I don't understand why you people are making this so complicated; all he
needs to do is draw a Venn diagram.
In sci.stat.consult Lloyd I. Richardson <[EMAIL PROTECTED]> wrote:
: of dummy variables for categorical variable model testing? Maybe it was
: Kelly, Beggs & McNeil that first suggested this technique.
actually it was R.A. Fisher thru use of orthogonal polynomials
In sci.stat.consult "Rui Jorge Gonçalves" <[EMAIL PROTECTED]> wrote:
...
this is the same as eta squared which I mentioned before
sounds as though you analyzed it wrong. did you treat it as a factorial
design?
: In article <[EMAIL PROTECTED]>,
: Christopher <[EMAIL PROTECTED]> wrote:
:>But how to measure, non-linear relation for 2 continuous variables.
you could use eta squared, which in ANOVA terms is
1 - SSe/SSt
It is not symmetric in x and y, and it equals 1 when there is no
within-group variability.
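For a one-way layout that formula can be computed directly (hypothetical data):

```python
import numpy as np

def eta_squared(groups):
    """Eta squared for a one-way layout: 1 - SSe/SSt."""
    all_y = np.concatenate(groups)
    ss_total = ((all_y - all_y.mean()) ** 2).sum()
    ss_error = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return 1 - ss_error / ss_total

groups = [np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0])]
eta2 = eta_squared(groups)
```

Here SSt = 17.5 and SSe = 4, so eta squared is 13.5/17.5, about 0.77; if every group had zero internal spread, SSe would be 0 and eta squared would be exactly 1.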