I have a mixed balanced ANOVA design with a between-subject factor
(Grp) and a within-subject factor (Rsp). When I tried the following
two commands, which I thought were equivalent,
fit.lme <- lme(Beta ~ Grp*Rsp, random = ~1|Subj, Model);
fit.aov <- aov(Beta ~ Rsp*Grp+Error(Subj/Rsp)+Grp,
Thanks for the response!
It is indeed a balanced design. The results are different in the
sense that the F tests for the main effects do not agree. Do you mean
that a random interaction is modeled in the aov command? If so, what
would be an aov command equivalent to the one with lme?
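For a random-intercept-only lme fit, the matching aov error term is
Error(Subj) rather than Error(Subj/Rsp). A minimal sketch with made-up
balanced data (the frame Model and the column names mirror those in the
thread; the data themselves are invented):

```r
library(nlme)  # ships with R as a recommended package

## Hypothetical balanced data: 3 groups x 13 subjects x 2 Rsp levels
set.seed(1)
Model <- data.frame(
  Subj = factor(rep(1:39, each = 2)),
  Grp  = factor(rep(1:3,  each = 26)),
  Rsp  = factor(rep(1:2,  times = 39))
)
## Inject a real subject effect so the variance component is positive
Model$Beta <- 2 * rnorm(39)[as.integer(Model$Subj)] + rnorm(78, sd = 0.5)

## A subject random intercept in lme ...
fit.lme <- lme(Beta ~ Grp * Rsp, random = ~ 1 | Subj, data = Model)

## ... corresponds to aov with Subj alone as the error stratum
fit.aov <- aov(Beta ~ Grp * Rsp + Error(Subj), data = Model)

anova(fit.lme)    # F tests match summary(fit.aov) in the balanced case
summary(fit.aov)
```

In the balanced case the between-subject F test (Grp) and the
within-subject tests (Rsp, Grp:Rsp) from the two fits coincide.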
-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Gang Chen
Sent: Friday, August 03, 2007 4:01 PM
To: Peter Dalgaard
Cc: r-help@stat.math.ethz.ch
Subject: Re: [R] lme and aov
Thanks for the response
Gang Chen wrote:
Thanks a lot for the clarification! I have only been learning R for a
week, and wanted to try a simple mixed balanced
ANOVA design with a between-subject factor
(Grp) and a within-subject factor (Rsp), but I'm not sure whether I'm
modeling the data correctly
This looks odd. It is a standard split-plot layout, right? 3
groups of 13 subjects, each measured with two kinds of Rsp: 3 x 13 x 2
= 78 observations.
Yes, that is right.
In that case you shouldn't see the same effect allocated to
multiple error strata. I suspect you forgot to declare
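The truncated suggestion is presumably about Subj being stored as a
number rather than a factor. A sketch of the symptom, with hypothetical
data (Subj, Grp, Rsp, Beta as named in the thread):

```r
## Hypothetical split-plot data: 3 groups x 13 subjects x 2 Rsp levels
set.seed(1)
d <- data.frame(
  Subj = rep(1:39, each = 2),          # numeric -- the likely mistake
  Grp  = factor(rep(1:3,  each = 26)),
  Rsp  = factor(rep(1:2,  times = 39))
)
d$Beta <- rnorm(nrow(d))

## With numeric Subj, Error(Subj/Rsp) treats Subj as a covariate and
## the effects no longer fall into the intended strata:
bad <- aov(Beta ~ Grp * Rsp + Error(Subj/Rsp), data = d)

## Declaring Subj as a factor recovers the usual split-plot table:
d$Subj <- factor(d$Subj)
good <- aov(Beta ~ Grp * Rsp + Error(Subj/Rsp), data = d)
summary(good)
```

With the factor declaration, Grp is tested in the Subj stratum and Rsp
and Grp:Rsp in the within-subject stratum, each effect appearing once.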
I am trying to better understand an analysis of mean RT in various
conditions in a within-subjects design, with the overall mean RT per
subject as one of the factors. lme seems to be the right way to do
this, using something like m <- lme(rt ~ a*b*subjectRT, random =
~1|subject) and then anova(m, type
Do you want to make inference about the specific subjects in your
study? If so, the subjects are a fixed effect. If instead you want to
make inference about the societal processes that will generate the
subjects you will get in the future, then subject is a random effect.
The function lme
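In R terms the distinction looks like this; a sketch with made-up data
(the names rt, a, and subject follow the question):

```r
library(nlme)

## Hypothetical data: 10 subjects, factor a with 2 levels, 2 replicates
set.seed(3)
d <- data.frame(
  subject = factor(rep(1:10, each = 4)),
  a       = factor(rep(1:2, times = 20))
)
d$rt <- rnorm(10)[as.integer(d$subject)] + rnorm(40)

## Subject as a fixed effect: conclusions apply to these 10 subjects only
fit.fixed <- lm(rt ~ a + subject, data = d)

## Subject as a random effect: conclusions extend to the population
## the subjects were drawn from
fit.rand <- lme(rt ~ a, random = ~ 1 | subject, data = d)
```

The fixed-effect fit spends nine degrees of freedom on subject
contrasts; the random-effect fit summarizes subjects by a single
variance component.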
It's not so much that I wasn't getting the difference between fixed and
random effects (although I do like the way you put the comment below).
For my purposes subject is a random effect. It was more about the
correct notation in lme for repeated-measures designs (my a and b are
repeated while the
Hi,
I have a question about using lme and aov for the
following dataset. If I understand correctly, using
aov with an Error term in the formula is equivalent
to using lme with default settings, i.e. both assume
compound symmetry correlation structure. And I have
found that equivalence in the past.
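The compound-symmetry point can be checked directly: a gls fit with an
explicit corCompSymm structure should reproduce the random-intercept
lme fit. A sketch with made-up balanced data (3 groups x 13 subjects x
2 Rsp levels; the equality holds when the estimated within-subject
correlation is non-negative):

```r
library(nlme)

## Hypothetical balanced data, names as in the thread
set.seed(1)
Model <- data.frame(
  Subj = factor(rep(1:39, each = 2)),
  Grp  = factor(rep(1:3,  each = 26)),
  Rsp  = factor(rep(1:2,  times = 39))
)
Model$Beta <- 2 * rnorm(39)[as.integer(Model$Subj)] + rnorm(78, sd = 0.5)

## Random intercept per subject ...
fit.lme <- lme(Beta ~ Grp * Rsp, random = ~ 1 | Subj, data = Model)

## ... is the same marginal model as compound symmetry within subject
fit.gls <- gls(Beta ~ Grp * Rsp,
               correlation = corCompSymm(form = ~ 1 | Subj),
               data = Model)

c(logLik(fit.lme), logLik(fit.gls))  # should agree
```

Both fits maximize the same REML criterion here, so their
log-likelihoods (and fixed-effect estimates) coincide.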
I've posted the following to R-help on May 15.
It has reproducible R code for real data -- and a real
(academic, i.e. unpaid) consultation background.
I'd be glad for some insight here, mainly not for myself.
In the meantime, we've learned that it is to be expected for
anova(*, marginal) to be
MM == Martin Maechler [EMAIL PROTECTED]
on Tue, 17 Jun 2003 10:13:44 +0200 writes: