Re: [R] problem with se.contrast()
On Tue, 22 Feb 2005, Christoph Buser wrote:

> Dear Prof Ripley, dear Prof Dalgaard
>
> Thank you both for your help. I tried it with Helmert contrasts and got a
> result that is consistent with lme. I didn't realize that the
> parameterization of the model has an influence on the contrasts that I
> tried to test. It seems that I should read a little bit more about this
> issue for my better understanding.
>
> I have one last point to propose: you could include (as an interim
> solution) a warning in se.contrast() that there might be an efficiency
> loss if one uses non-orthogonal contrasts and a multi-stratum model.

If you meant on the help page, I have already done that. We are still
thinking about how to detect the problem well enough, and possibly even to
see if we can avoid it. It's trickier than I remembered, although I think I
probably did once know.

--
Brian D. Ripley, [EMAIL PROTECTED]
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595
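[Note: a minimal sketch of the fix Christoph describes, using the testdata
frame defined further down in the thread; the only change from the original
call is switching to orthogonal (Helmert) contrasts before fitting. This
code is an illustration, not part of the original message.]

    ## Orthogonal contrasts keep the Material comparison out of the
    ## Intercept stratum, so se.contrast() agrees with lme
    ## (about 0.122 instead of 0.143).
    options(contrasts = c("contr.helmert", "contr.poly"))

    test.aov <- aov(Measurement ~ Material + Error(Lab/Material),
                    data = testdata)
    se.contrast(test.aov,
                list(Material == "A", Material == "B",
                     Material == "C", Material == "D"),
                coef = c(0.5, 0.5, -0.5, -0.5), data = testdata)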
Re: [R] problem with se.contrast()
Dear Prof Ripley, dear Prof Dalgaard

Thank you both for your help. I tried it with Helmert contrasts and got a
result that is consistent with lme. I didn't realize that the
parameterization of the model has an influence on the contrasts that I
tried to test. It seems that I should read a little bit more about this
issue for my better understanding.

I have one last point to propose: you could include (as an interim
solution) a warning in se.contrast() that there might be an efficiency loss
if one uses non-orthogonal contrasts and a multi-stratum model.

Best regards,

Christoph Buser

--
Christoph Buser <[EMAIL PROTECTED]>
Seminar fuer Statistik, LEO C11
ETH (Federal Inst. Technology)  8092 Zurich  SWITZERLAND
phone: x-41-1-632-5414     fax: 632-1228
http://stat.ethz.ch/~buser/

Prof Brian Ripley writes:
 > On Mon, 21 Feb 2005, Peter Dalgaard wrote:
 >
 > > Prof Brian Ripley <[EMAIL PROTECTED]> writes:
 > >
 > >>> test.aov <- with(testdata,
 > >>>                  aov(Measurement ~ Material + Error(Lab/Material)))
 > >>> se.contrast(test.aov,
 > >>>    list(Material=="A",Material=="B",Material=="C",Material=="D"),
 > >>>    coef=c(0.5,0.5,-0.5,-0.5), data=testdata)
 > >>> [1] 0.1432572
 > >>>
 > >>> I got a different result and I have to admit that I didn't understand
 > >>> why there is a difference between the lme model and this one. There
 > >>> are some comments in the help pages but I'm not sure if this is the
 > >>> answer.
 > >>
 > >> It is.  You used the default `contrasts', which are not actually
 > >> contrasts.  With
 > >>
 > >>    options(contrasts=c("contr.helmert", "contr.poly"))
 > >>
 > >> it gives the same answer as the other two.  Because you used
 > >> non-contrasts there was an efficiency loss (to the Intercept stratum).
 > >
 > > Brian,
 > >
 > > I'm not sure how useful that contrasts-that-are-not-contrasts line is.
 >
 > I agree, it was not precise enough (too late at night for me).  Try
 > `non-orthogonal contrasts'.  The issue was correct, though: it is the
 > choice of contrasts, and as I would automatically use orthogonal
 > contrasts with aov() I had not encountered it, and it took me a while to
 > pick up on what Christoph had done differently from me (I had run the
 > example and got the same result as the randomized-block analysis before
 > my original post).
 >
 > There's a comment in the code for aov:
 >
 >   ## helmert contrasts can be helpful: do we want to force them?
 >   ## this version does for the Error model.
 >
 > and perhaps we should make them the default for the per-stratum fits.
 >
 > > It certainly depends on your definition of "contrasts".  Contrast
 > > matrices having zero column sums was not part of the definition I was
 > > taught.  I have contrasts as "representations of the group mean
 > > structure that are invariant to changes of the overall level", so
 > > treatment contrasts are perfectly good contrasts in my book.
 >
 > I don't think Yates thought in those terms, and the whole idea of
 > dividing into strata (and the generalized Yates algorithm) is based on
 > contrasts being orthogonal to the overall mean (and to things in other
 > strata).
 >
 > It was that, not zero-sum, that I was taught, but in balanced cases such
 > as here it is the same thing.
 >
 > > The zero-sum condition strikes me as a bit arbitrary: after all there
 > > are perfectly nice orthogonal designs where some levels of a factor
 > > occur more frequently than others.
 >
 > Balance and orthogonality are not quite the same thing, though.
 >
 > > This in turn makes me a bit wary of what is going on inside
 > > se.contrast, but it's gotten too late for me to actually study the code
 > > tonight.
 > >
 > > Could you elaborate on where precisely the condition on the contrast
 > > matrices comes into play?
 >
 > In finding the projection lengths in eff.aovlist, here
 >
 >     proj.len <-
 >         lapply(aovlist, function(x)
 >            {
 >                asgn <- x$assign[x$qr$pivot[1:x$rank]]
 >                sp <- split(seq(along=asgn),
 >                            attr(terms(x), "term.labels")[asgn])
 >                sapply(sp, function(x, y) sum(y[x]), y=diag(x$qr$qr)^2)
 >            })
 >
 > using only the diagonal requires orthogonality of the coding.
 >
 > It is many years since I looked at this, and it is not immediately clear
 > to me how best to calculate efficiencies without this assumption (which
 > could be tested inside eff.aovlist, of course).
 >
 > [People wondering why this was ever useful given lme should be aware that
 >  1) It predates the availability of lme.
 >  2) When I first used this in 1998, lme appeared very slow.
 >  3) People with a classical aov training (not me, but e.g. Bill Venables)
 >     think in these terms.]
Re: [R] problem with se.contrast()
On Mon, 21 Feb 2005, Peter Dalgaard wrote:

> Prof Brian Ripley <[EMAIL PROTECTED]> writes:
>
>>> test.aov <- with(testdata,
>>>                  aov(Measurement ~ Material + Error(Lab/Material)))
>>> se.contrast(test.aov,
>>>    list(Material=="A",Material=="B",Material=="C",Material=="D"),
>>>    coef=c(0.5,0.5,-0.5,-0.5), data=testdata)
>>> [1] 0.1432572
>>>
>>> I got a different result and I have to admit that I didn't understand
>>> why there is a difference between the lme model and this one. There are
>>> some comments in the help pages but I'm not sure if this is the answer.
>>
>> It is.  You used the default `contrasts', which are not actually
>> contrasts.  With
>>
>>    options(contrasts=c("contr.helmert", "contr.poly"))
>>
>> it gives the same answer as the other two.  Because you used
>> non-contrasts there was an efficiency loss (to the Intercept stratum).
>
> Brian,
>
> I'm not sure how useful that contrasts-that-are-not-contrasts line is.

I agree, it was not precise enough (too late at night for me).  Try
`non-orthogonal contrasts'.  The issue was correct, though: it is the
choice of contrasts, and as I would automatically use orthogonal contrasts
with aov() I had not encountered it, and it took me a while to pick up on
what Christoph had done differently from me (I had run the example and got
the same result as the randomized-block analysis before my original post).

There's a comment in the code for aov:

  ## helmert contrasts can be helpful: do we want to force them?
  ## this version does for the Error model.

and perhaps we should make them the default for the per-stratum fits.

> It certainly depends on your definition of "contrasts".  Contrast
> matrices having zero column sums was not part of the definition I was
> taught.  I have contrasts as "representations of the group mean
> structure that are invariant to changes of the overall level", so
> treatment contrasts are perfectly good contrasts in my book.

I don't think Yates thought in those terms, and the whole idea of dividing
into strata (and the generalized Yates algorithm) is based on contrasts
being orthogonal to the overall mean (and to things in other strata).

It was that, not zero-sum, that I was taught, but in balanced cases such as
here it is the same thing.

> The zero-sum condition strikes me as a bit arbitrary: after all there
> are perfectly nice orthogonal designs where some levels of a factor
> occur more frequently than others.

Balance and orthogonality are not quite the same thing, though.

> This in turn makes me a bit wary of what is going on inside se.contrast,
> but it's gotten too late for me to actually study the code tonight.
>
> Could you elaborate on where precisely the condition on the contrast
> matrices comes into play?

In finding the projection lengths in eff.aovlist, here

    proj.len <-
        lapply(aovlist, function(x)
           {
               asgn <- x$assign[x$qr$pivot[1:x$rank]]
               sp <- split(seq(along=asgn),
                           attr(terms(x), "term.labels")[asgn])
               sapply(sp, function(x, y) sum(y[x]), y=diag(x$qr$qr)^2)
           })

using only the diagonal requires orthogonality of the coding.

It is many years since I looked at this, and it is not immediately clear to
me how best to calculate efficiencies without this assumption (which could
be tested inside eff.aovlist, of course).

[People wondering why this was ever useful given lme should be aware that
 1) It predates the availability of lme.
 2) When I first used this in 1998, lme appeared very slow.
 3) People with a classical aov training (not me, but e.g. Bill Venables)
    think in these terms.]

--
Brian D. Ripley, [EMAIL PROTECTED]
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595
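[Note: the check that Prof. Ripley says could be added inside eff.aovlist
might look something like the sketch below. This is an illustration only;
the helper name is made up and the code is not part of R.]

    ## Hypothetical helper: TRUE if every factor's contrast matrix has
    ## columns orthogonal to the constant column (zero column sums), which
    ## is what the diagonal-only projection lengths above rely on in a
    ## balanced design.
    has_orthogonal_contrasts <- function(data, tol = 1e-8) {
      facs <- Filter(is.factor, data)
      all(vapply(facs,
                 function(f) all(abs(colSums(contrasts(f))) < tol),
                 logical(1)))
    }

    ## With the default contr.treatment coding this is FALSE; after
    ## options(contrasts = c("contr.helmert", "contr.poly")) it is TRUE.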
Re: [R] problem with se.contrast()
Prof Brian Ripley <[EMAIL PROTECTED]> writes:

>> test.aov <- with(testdata,
>>                  aov(Measurement ~ Material + Error(Lab/Material)))
>> se.contrast(test.aov,
>>    list(Material=="A",Material=="B",Material=="C",Material=="D"),
>>    coef=c(0.5,0.5,-0.5,-0.5), data=testdata)
>> [1] 0.1432572
>>
>> I got a different result and I have to admit that I didn't understand
>> why there is a difference between the lme model and this one. There are
>> some comments in the help pages but I'm not sure if this is the answer.
>
> It is.  You used the default `contrasts', which are not actually
> contrasts.  With
>
>    options(contrasts=c("contr.helmert", "contr.poly"))
>
> it gives the same answer as the other two.  Because you used
> non-contrasts there was an efficiency loss (to the Intercept stratum).

Brian,

I'm not sure how useful that contrasts-that-are-not-contrasts line is.

It certainly depends on your definition of "contrasts". Contrast matrices
having zero column sums was not part of the definition I was taught. I have
contrasts as "representations of the group mean structure that are
invariant to changes of the overall level", so treatment contrasts are
perfectly good contrasts in my book.

The zero-sum condition strikes me as a bit arbitrary: after all there are
perfectly nice orthogonal designs where some levels of a factor occur more
frequently than others.

This in turn makes me a bit wary of what is going on inside se.contrast,
but it's gotten too late for me to actually study the code tonight.

Could you elaborate on where precisely the condition on the contrast
matrices comes into play?

--
   O__  ---- Peter Dalgaard             Blegdamsvej 3
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N
 (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
~~~~~~~~~~ - ([EMAIL PROTECTED])                     FAX: (+45) 35327907
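[Note: for concreteness, the two codings under discussion look like this
for a four-level factor such as Material. The output below is standard R
output added for illustration; it is not part of the original exchange.]

    ## Treatment contrasts: column sums are 1, so the columns are not
    ## orthogonal to the intercept; this is R's default coding.
    contr.treatment(4)
    #   2 3 4
    # 1 0 0 0
    # 2 1 0 0
    # 3 0 1 0
    # 4 0 0 1

    ## Helmert contrasts: zero column sums and mutually orthogonal columns,
    ## the property the per-stratum aov() projections rely on.
    contr.helmert(4)
    #   [,1] [,2] [,3]
    # 1   -1   -1   -1
    # 2    1   -1   -1
    # 3    0    2   -1
    # 4    0    0    3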
Re: [R] problem with se.contrast()
On Thu, 17 Feb 2005, Christoph Buser wrote:

> Dear Jamie
>
> As Prof. Ripley explained, your analysis is equivalent to the
> fixed-effects model for the means, so you can calculate it by (if this is
> your design):
>
> Lab <- factor(rep(c("1","2","3"), each = 12))
> Material <- factor(rep(c("A","B","C","D"), each = 3, times = 3))
> Measurement <- c(12.20,12.28,12.16,15.51,15.02,15.29,18.14,18.08,18.21,
>                  18.54,18.36,18.45,12.59,12.30,12.67,14.98,15.46,15.22,
>                  18.54,18.31,18.60,19.21,18.77,18.69,12.72,12.78,12.66,
>                  15.33,15.19,15.24,18.00,18.15,17.93,18.88,18.12,18.03)
> testdata <- data.frame(Lab, Material, Measurement)
> rm(list = c("Lab","Material","Measurement"))
>
> You can aggregate your data:
>
> dat.mean <- aggregate(testdata$Measurement,
>                       by = list(Material = testdata$Material,
>                                 Lab = testdata$Lab),
>                       FUN = mean)
> names(dat.mean)[3] <- "Measurement"
> test.red.aov1 <- aov(Measurement ~ Lab + Material, data = dat.mean)
> se.contrast(test.red.aov1,
>             list(Material=="A", Material=="B", Material=="C",
>                  Material=="D"),
>             coef = c(0.5, 0.5, -0.5, -0.5), dat.mean)
> [1] 0.1220339
>
> By aggregating the data you bypass the problem in se.contrast() and you
> do not need R-devel.

Note that this is not the same contrast, and you need to double the value
for comparability with the original problem.

> The second way to get the same is to set your contrast for the factor
> "Material", fit your model with this contrast, and use summary.lm():
>
> dat.mean$Material <- C(dat.mean$Material, c(-0.5,-0.5,0.5,0.5),
>                        how.many = 3)
> test.red.aov2 <- aov(Measurement ~ Lab + Material, data = dat.mean)
> summary.lm(test.red.aov2)
>
> Coefficients:
>             Estimate Std. Error t value Pr(>|t|)
> (Intercept) 16.02000    0.10568 151.583 5.56e-12 ***
> Lab2         0.25833    0.14946   1.728    0.135
> Lab3         0.06583    0.14946   0.440    0.675
> Material1    4.52278    0.12203  37.062 2.58e-08 ***
> Material2    1.21056    0.12203   9.920 6.06e-05 ***
> Material3    1.55389    0.12203  12.733 1.44e-05 ***
> ---
> Signif. codes:  0 `***' 0.001 `**' 0.01 `*' 0.05 `.' 0.1 ` ' 1
>
> Material1 is now the contrast you are interested in, and besides the
> Std. Error you get the Estimate, too. Material2 and Material3 are just
> orthogonal contrasts to complete the set.
>
> The third way (which I prefer) is using lme from the package nlme:
>
> library(nlme)
>
> Here I use the original data set (not aggregated) and set the desired
> contrast, too:
>
> testdata$Material <- C(testdata$Material, c(-0.5,-0.5,0.5,0.5),
>                        how.many = 3)
> test.lme <- lme(fixed = Measurement ~ Material, data = testdata,
>                 random = ~ 1 | Lab/Material)
>
> With anova() you get the F test for the fixed factor "Material":
>
> anova(test.lme)
>             numDF denDF  F-value p-value
> (Intercept)     1    24 43301.14  <.0001
> Material        3     6   544.71  <.0001
>
> and with the summary you get your contrast with its standard error:
>
> summary(test.lme)
> Fixed effects: Measurement ~ Material
>                 Value  Std.Error DF   t-value p-value
> (Intercept) 16.128056 0.07750547 24 208.08925   0e+00
> Material1    4.522778 0.12203325  6  37.06185   0e+00
> Material2    1.210556 0.12203325  6   9.91988   1e-04
> Material3    1.553889 0.12203325  6  12.73332   0e+00
>
> Last but not least, I tried it with R-devel and the original data frame.
> First I reset the contrast to the default value:
>
> testdata$Material <- C(testdata$Material, "contr.treatment", how.many = 3)
>
> I used your syntax and se.contrast():
>
> test.aov <- with(testdata,
>                  aov(Measurement ~ Material + Error(Lab/Material)))
> se.contrast(test.aov,
>             list(Material=="A", Material=="B", Material=="C",
>                  Material=="D"),
>             coef = c(0.5, 0.5, -0.5, -0.5), data = testdata)
> [1] 0.1432572
>
> I got a different result and I have to admit that I didn't understand why
> there is a difference between the lme model and this one. There are some
> comments in the help pages but I'm not sure if this is the answer.

It is.  You used the default `contrasts', which are not actually contrasts.
With

    options(contrasts=c("contr.helmert", "contr.poly"))

it gives the same answer as the other two.  Because you used non-contrasts
there was an efficiency loss (to the Intercept stratum).

--
Brian D. Ripley, [EMAIL PROTECTED]
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595
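[Note: the contrast used in the thread has coefficients +/-0.5, whereas the
original post used coef = c(1, 1, -1, -1); as noted in the reply above, the
value must be doubled for comparability. A quick worked sketch of that
rescaling, added for illustration:]

    ## c(1, 1, -1, -1) is exactly twice the c(0.5, 0.5, -0.5, -0.5)
    ## contrast, so its standard error is doubled as well.
    2 * 0.1220339   # aggregated-data / lme value        -> 0.2440678
    2 * 0.1432572   # default-contrasts multistratum value -> 0.2865144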
Re: [R] problem with se.contrast()
Dear Jamie

I already thought that your data structure could be more complicated than
in the example. I would be careful in any case. Since there is a difference
between the results of se.contrast() in R-devel and the results from lme
(and the aggregated-data results, which are consistent with lme) in this
nice balanced design, I am suspicious, especially if your real design is
more complicated. Even if you get no error message for your data, I'd
calculate the analysis a second time, using for example lme, to check the
results.

I wish you all the best for your analysis.

Christoph Buser

--
Christoph Buser <[EMAIL PROTECTED]>
Seminar fuer Statistik, LEO C11
ETH (Federal Inst. Technology)  8092 Zurich  SWITZERLAND
phone: x-41-1-632-5414     fax: 632-1228
http://stat.ethz.ch/~buser/

Jamie Jarabek writes:
 > Christoph,
 >
 > Thank you for your advice. My actual design is indeed more complicated
 > than what I have indicated here. I was just using this as a toy example
 > to illustrate my particular problem. As suggested by Prof. Ripley I will
 > download R-devel and see if the fixes included within alleviate my
 > problems.
 >
 > Jamie Jarabek
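[Note: a sketch of the kind of cross-check Christoph recommends, assuming
the test.aov, test.lme, and testdata objects from earlier in the thread are
in the workspace; the comparison itself is an added illustration.]

    ## Compare the multistratum se.contrast() value with the lme standard
    ## error for the same +/-0.5 contrast; per the thread, they agree once
    ## orthogonal contrasts are used for the aov() fit.
    se.aov <- se.contrast(test.aov,
                          list(Material == "A", Material == "B",
                               Material == "C", Material == "D"),
                          coef = c(0.5, 0.5, -0.5, -0.5), data = testdata)
    se.lme <- summary(test.lme)$tTable["Material1", "Std.Error"]
    c(aov = unname(se.aov), lme = unname(se.lme))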
Re: [R] problem with se.contrast()
Christoph,

Thank you for your advice. My actual design is indeed more complicated than
what I have indicated here. I was just using this as a toy example to
illustrate my particular problem. As suggested by Prof. Ripley I will
download R-devel and see if the fixes included within alleviate my
problems.

Jamie Jarabek

Christoph Buser wrote:

Dear Jamie

As Prof. Ripley explained, your analysis is equivalent to the fixed-effects
model for the means, so you can calculate it by (if this is your design):

Lab <- factor(rep(c("1","2","3"), each = 12))
Material <- factor(rep(c("A","B","C","D"), each = 3, times = 3))
Measurement <- c(12.20,12.28,12.16,15.51,15.02,15.29,18.14,18.08,18.21,
                 18.54,18.36,18.45,12.59,12.30,12.67,14.98,15.46,15.22,
                 18.54,18.31,18.60,19.21,18.77,18.69,12.72,12.78,12.66,
                 15.33,15.19,15.24,18.00,18.15,17.93,18.88,18.12,18.03)
testdata <- data.frame(Lab, Material, Measurement)
rm(list = c("Lab","Material","Measurement"))

You can aggregate your data:

dat.mean <- aggregate(testdata$Measurement,
                      by = list(Material = testdata$Material,
                                Lab = testdata$Lab),
                      FUN = mean)
names(dat.mean)[3] <- "Measurement"
test.red.aov1 <- aov(Measurement ~ Lab + Material, data = dat.mean)
se.contrast(test.red.aov1,
            list(Material=="A", Material=="B", Material=="C", Material=="D"),
            coef = c(0.5, 0.5, -0.5, -0.5), dat.mean)
[1] 0.1220339

By aggregating the data you bypass the problem in se.contrast() and you do
not need R-devel.

-

The second way to get the same is to set your contrast for the factor
"Material", fit your model with this contrast, and use summary.lm():

dat.mean$Material <- C(dat.mean$Material, c(-0.5,-0.5,0.5,0.5), how.many = 3)
test.red.aov2 <- aov(Measurement ~ Lab + Material, data = dat.mean)
summary.lm(test.red.aov2)

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept) 16.02000    0.10568 151.583 5.56e-12 ***
Lab2         0.25833    0.14946   1.728    0.135
Lab3         0.06583    0.14946   0.440    0.675
Material1    4.52278    0.12203  37.062 2.58e-08 ***
Material2    1.21056    0.12203   9.920 6.06e-05 ***
Material3    1.55389    0.12203  12.733 1.44e-05 ***
---
Signif. codes:  0 `***' 0.001 `**' 0.01 `*' 0.05 `.' 0.1 ` ' 1

Material1 is now the contrast you are interested in, and besides the
Std. Error you get the Estimate, too. Material2 and Material3 are just
orthogonal contrasts to complete the set.

-

The third way (which I prefer) is using lme from the package nlme:

library(nlme)

Here I use the original data set (not aggregated) and set the desired
contrast, too:

testdata$Material <- C(testdata$Material, c(-0.5,-0.5,0.5,0.5), how.many = 3)
test.lme <- lme(fixed = Measurement ~ Material, data = testdata,
                random = ~ 1 | Lab/Material)

With anova() you get the F test for the fixed factor "Material":

anova(test.lme)
            numDF denDF  F-value p-value
(Intercept)     1    24 43301.14  <.0001
Material        3     6   544.71  <.0001

and with the summary you get your contrast with its standard error:

summary(test.lme)
Fixed effects: Measurement ~ Material
                Value  Std.Error DF   t-value p-value
(Intercept) 16.128056 0.07750547 24 208.08925   0e+00
Material1    4.522778 0.12203325  6  37.06185   0e+00
Material2    1.210556 0.12203325  6   9.91988   1e-04
Material3    1.553889 0.12203325  6  12.73332   0e+00

-

Last but not least, I tried it with R-devel and the original data frame.
First I reset the contrast to the default value:

testdata$Material <- C(testdata$Material, "contr.treatment", how.many = 3)

I used your syntax and se.contrast():

test.aov <- with(testdata, aov(Measurement ~ Material + Error(Lab/Material)))
se.contrast(test.aov,
            list(Material=="A", Material=="B", Material=="C", Material=="D"),
            coef = c(0.5, 0.5, -0.5, -0.5), data = testdata)
[1] 0.1432572

I got a different result and I have to admit that I didn't understand why
there is a difference between the lme model and this one. There are some
comments in the help pages but I'm not sure if this is the answer.

-

I hope some of the code above can help to analyse your data. Maybe
Prof. Ripley was right and you have another design. Then you can just
ignore this and your life is much easier :-)

Best regards,

Christoph Buser

--
Christoph Buser <[EMAIL PROTECTED]>
Seminar fuer Statistik, LEO C11
ETH (Federal Inst. Technology)  8092 Zurich  SWITZERLAND
phone: x-41-1-632-5414     fax: 632-1228
http://stat.ethz.ch/~buser/
Re: [R] problem with se.contrast()
Dear Jamie

As Prof. Ripley explained, your analysis is equivalent to the fixed-effects
model for the means, so you can calculate it by (if this is your design):

> Lab <- factor(rep(c("1","2","3"), each = 12))
> Material <- factor(rep(c("A","B","C","D"), each = 3, times = 3))
> Measurement <- c(12.20,12.28,12.16,15.51,15.02,15.29,18.14,18.08,18.21,
                   18.54,18.36,18.45,12.59,12.30,12.67,14.98,15.46,15.22,
                   18.54,18.31,18.60,19.21,18.77,18.69,12.72,12.78,12.66,
                   15.33,15.19,15.24,18.00,18.15,17.93,18.88,18.12,18.03)
> testdata <- data.frame(Lab, Material, Measurement)
> rm(list = c("Lab","Material","Measurement"))

You can aggregate your data:

> dat.mean <- aggregate(testdata$Measurement,
                        by = list(Material = testdata$Material,
                                  Lab = testdata$Lab),
                        FUN = mean)
> names(dat.mean)[3] <- "Measurement"
> test.red.aov1 <- aov(Measurement ~ Lab + Material, data = dat.mean)
> se.contrast(test.red.aov1,
              list(Material=="A", Material=="B", Material=="C",
                   Material=="D"),
              coef = c(0.5, 0.5, -0.5, -0.5), dat.mean)
[1] 0.1220339

By aggregating the data you bypass the problem in se.contrast() and you do
not need R-devel.

-

The second way to get the same is to set your contrast for the factor
"Material", fit your model with this contrast, and use summary.lm():

> dat.mean$Material <- C(dat.mean$Material, c(-0.5,-0.5,0.5,0.5),
                         how.many = 3)
> test.red.aov2 <- aov(Measurement ~ Lab + Material, data = dat.mean)
> summary.lm(test.red.aov2)

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept) 16.02000    0.10568 151.583 5.56e-12 ***
Lab2         0.25833    0.14946   1.728    0.135
Lab3         0.06583    0.14946   0.440    0.675
Material1    4.52278    0.12203  37.062 2.58e-08 ***
Material2    1.21056    0.12203   9.920 6.06e-05 ***
Material3    1.55389    0.12203  12.733 1.44e-05 ***
---
Signif. codes:  0 `***' 0.001 `**' 0.01 `*' 0.05 `.' 0.1 ` ' 1

Material1 is now the contrast you are interested in, and besides the
Std. Error you get the Estimate, too. Material2 and Material3 are just
orthogonal contrasts to complete the set.

-

The third way (which I prefer) is using lme from the package nlme:

> library(nlme)

Here I use the original data set (not aggregated) and set the desired
contrast, too:

> testdata$Material <- C(testdata$Material, c(-0.5,-0.5,0.5,0.5),
                         how.many = 3)
> test.lme <- lme(fixed = Measurement ~ Material, data = testdata,
                  random = ~ 1 | Lab/Material)

With anova() you get the F test for the fixed factor "Material":

> anova(test.lme)
            numDF denDF  F-value p-value
(Intercept)     1    24 43301.14  <.0001
Material        3     6   544.71  <.0001

and with the summary you get your contrast with its standard error:

> summary(test.lme)
Fixed effects: Measurement ~ Material
                Value  Std.Error DF   t-value p-value
(Intercept) 16.128056 0.07750547 24 208.08925   0e+00
Material1    4.522778 0.12203325  6  37.06185   0e+00
Material2    1.210556 0.12203325  6   9.91988   1e-04
Material3    1.553889 0.12203325  6  12.73332   0e+00

-

Last but not least, I tried it with R-devel and the original data frame.
First I reset the contrast to the default value:

> testdata$Material <- C(testdata$Material, "contr.treatment", how.many = 3)

I used your syntax and se.contrast():

> test.aov <- with(testdata,
                   aov(Measurement ~ Material + Error(Lab/Material)))
> se.contrast(test.aov,
              list(Material=="A", Material=="B", Material=="C",
                   Material=="D"),
              coef = c(0.5, 0.5, -0.5, -0.5), data = testdata)
[1] 0.1432572

I got a different result and I have to admit that I didn't understand why
there is a difference between the lme model and this one. There are some
comments in the help pages but I'm not sure if this is the answer.

-

I hope some of the code above can help to analyse your data. Maybe
Prof. Ripley was right and you have another design. Then you can just
ignore this and your life is much easier :-)

Best regards,

Christoph Buser

--
Christoph Buser <[EMAIL PROTECTED]>
Seminar fuer Statistik, LEO C11
ETH (Federal Inst. Technology)  8092 Zurich  SWITZERLAND
phone: x-41-1-632-5414     fax: 632-1228
http://stat.ethz.ch/~buser/

Jamie Jarabek writes:
 > I am having trouble getting standard errors for contrasts using
 > se.contrast() in what appears to be a simple case to me. The following
 > test example illustrates my problem:
 >
 > Lab
Re: [R] problem with se.contrast()
The first problem is that Material is undefined when you called
se.contrast, as you removed it.  This may work, but it certainly is not
intentional, and which variable gets picked up may not be predictable.

The second is that I think you need R-devel, which has some fixes.

However, I think this is an inappropriate analysis (which is why the older
code trips up).  Your analysis is equivalent to using the fixed-effects
model Lab + Material on the means of each Material block.  If there is
something different about the replicates of the Material, this was a very
poor design, and so I suspect that is not what was done.  It looks like a
randomized block design with replication in the blocks, for which

    aov(Measurement ~ Lab + Material, data = testdata)

is the appropriate analysis.  In any case, there is no need of a
multi-stratum analysis.

On Wed, 16 Feb 2005, Jamie Jarabek wrote:

> I am having trouble getting standard errors for contrasts using
> se.contrast() in what appears to be a simple case to me. The following
> test example illustrates my problem:
>
> Lab <- factor(rep(c("1","2","3"), each = 12))
> Material <- factor(rep(c("A","B","C","D"), each = 3, times = 3))
> Measurement <- c(12.20,12.28,12.16,15.51,15.02,15.29,18.14,18.08,18.21,
>                  18.54,18.36,18.45,12.59,12.30,12.67,14.98,15.46,15.22,
>                  18.54,18.31,18.60,19.21,18.77,18.69,12.72,12.78,12.66,
>                  15.33,15.19,15.24,18.00,18.15,17.93,18.88,18.12,18.03)
> testdata <- data.frame(Lab, Material, Measurement)
> rm(list = c("Lab","Material","Measurement"))
>
> test.aov <- with(testdata,
>                  aov(Measurement ~ Material + Error(Lab/Material)))
>
> This gives me the desired ANOVA table. I next want to get the standard
> errors for certain contrasts, and following the help page for
> se.contrast() I tried the following, but I get an error:
>
> se.contrast(test.aov,
>             list(Material=="A", Material=="B", Material=="C",
>                  Material=="D"),
>             coef = c(1, 1, -1, -1), data = testdata)
> Error in matrix(0, length(asgn), ncol(effects), dimnames = list(nm[1 + :
>         length of dimnames [1] not equal to array extent
>
> I have tested this on R 2.0.1 on Windows XP and Solaris and get the same
> error on both systems. I am unsure as to what I am doing wrong here.
> Thanks for any help.
>
> Jamie Jarabek

--
Brian D. Ripley, [EMAIL PROTECTED]
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595
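[Note: a short sketch of the simpler analysis Prof. Ripley suggests,
together with the corresponding contrast. This is an added illustration
of the suggestion, assuming the testdata frame from the quoted post; the
object name test.rb.aov is made up.]

    ## Randomized block analysis of the full data (no Error() strata),
    ## then the standard error of the (A+B)/2 - (C+D)/2 contrast.
    test.rb.aov <- aov(Measurement ~ Lab + Material, data = testdata)
    summary(test.rb.aov)
    se.contrast(test.rb.aov,
                list(Material == "A", Material == "B",
                     Material == "C", Material == "D"),
                coef = c(0.5, 0.5, -0.5, -0.5), data = testdata)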
[R] problem with se.contrast()
I am having trouble getting standard errors for contrasts using
se.contrast() in what appears to be a simple case to me. The following test
example illustrates my problem:

Lab <- factor(rep(c("1","2","3"), each = 12))
Material <- factor(rep(c("A","B","C","D"), each = 3, times = 3))
Measurement <- c(12.20,12.28,12.16,15.51,15.02,15.29,18.14,18.08,18.21,
                 18.54,18.36,18.45,12.59,12.30,12.67,14.98,15.46,15.22,
                 18.54,18.31,18.60,19.21,18.77,18.69,12.72,12.78,12.66,
                 15.33,15.19,15.24,18.00,18.15,17.93,18.88,18.12,18.03)
testdata <- data.frame(Lab, Material, Measurement)
rm(list = c("Lab","Material","Measurement"))

test.aov <- with(testdata, aov(Measurement ~ Material + Error(Lab/Material)))

This gives me the desired ANOVA table. I next want to get the standard
errors for certain contrasts, and following the help page for se.contrast()
I tried the following, but I get an error:

se.contrast(test.aov,
            list(Material=="A", Material=="B", Material=="C", Material=="D"),
            coef = c(1, 1, -1, -1), data = testdata)
Error in matrix(0, length(asgn), ncol(effects), dimnames = list(nm[1 + :
        length of dimnames [1] not equal to array extent

I have tested this on R 2.0.1 on Windows XP and Solaris and get the same
error on both systems. I am unsure as to what I am doing wrong here.
Thanks for any help.

Jamie Jarabek
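[Note: the multistratum ANOVA table mentioned above is printed with
summary(); a small added sketch, assuming the test.aov object from the
post. The stratum names follow from the Error(Lab/Material) term.]

    ## Per-stratum ANOVA tables: "Error: Lab", "Error: Lab:Material",
    ## and "Error: Within".
    summary(test.aov)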