Re: [R] Weird SEs with effect()

2008-02-18 Thread Prof Brian Ripley
On Mon, 18 Feb 2008, Gustaf Granath wrote:

> Dear John & Brian,
>
> Ok. Now I start to understand. So basically I cannot use the given SEs for
> my purposes. They only make sense on the scale of log-counts. However, in a
> paper, the log-count scale is not very informative (the reader wants to see
> the effect on the scale you measure). If I understand you right, the
> confidence intervals are fine though, and maybe I will use them to illustrate
> the reliability of the estimate. The problem is that showing SEs of the
> adjusted means has become sort of the standard way to illustrate things in my
> field (too many SAS users?) and I might run into trouble with the reviewers.
> I have several data sets with similar data and my plan was to use effect().
> That is why I want to figure this out. I hope I haven't been too annoying ;).
>
> Finally, is there a way to get correct SEs on the count scale (with adjusted 
> means)??
> I guess not, judging by your answers.

Yes, there is, at least approximately.  If X has mean mu and se s,
exp(X) has approximate mean exp(mu) and se s*exp(mu).  As in

> X <- rnorm(1000, 10, 0.1)
> sd(X)
[1] 0.09811928
> sd(exp(X))
[1] 2172.124
> exp(10)
[1] 22026.47

You will find slightly more accurate formulae on the help page ?plnorm
(they are exact if you assume normality of the estimated effects).
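A minimal sketch of both approximations, reusing the simulated values above
(mu and s stand for an estimate and its SE on the log scale; the seed is
added here only so the numbers reproduce):

set.seed(1)                            # seed added for reproducibility only
X  <- rnorm(1000, 10, 0.1)             # "estimates" on the log scale
mu <- 10; s <- 0.1

## Delta-method approximation: se(exp(X)) ~= s * exp(mu)
s * exp(mu)                            # about 2202.6
sd(exp(X))                             # observed value, for comparison

## Exact lognormal formulae (cf. ?plnorm), valid if X is exactly normal
exp(mu + s^2/2)                        # mean of exp(X)
sqrt(exp(s^2) - 1) * exp(mu + s^2/2)   # sd of exp(X), about 2219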


>
> Thanks again for your help,
>
> Gustaf
>
>
> John Fox wrote:
>> Dear Gustaf,
>>
>> 
>>> -Original Message-
>>> From: Gustaf Granath [mailto:[EMAIL PROTECTED]
>>> Sent: February-17-08 4:18 PM
>>> To: John Fox
>>> Cc: 'Prof Brian Ripley'; r-help@r-project.org
>>> Subject: RE: [R] Weird SEs with effect()
>>> 
>>> Dear John and Brian,
>>> Thank you for your help. I get the feeling that it is something
>>> fundamental that I do not understand here. Furthermore, a day of
>>> reading did not really help so maybe we have reached a dead end here.
>>> Nevertheless, here comes one last try.
>>> 
>>> I thought that the values produced by effect() were logs (e.g. in
>>> $fit). And then they were transformed (antilogged) with summary(). Was
>>> I wrong?
>>> 
>> 
>> I'm sorry that you're continuing to have problems with this.
>> 
>> Yes, there is a point that you don't understand: The SEs are on the scale of
>> the log-counts, but you can't get correct SEs on the scale of the counts by
>> exponentiating the SEs on the scale of the log-counts. What summary(), etc.,
>> do (and you can do) to produce confidence intervals on the count scale is
>> first to compute the intervals on the log-count scale and then to transform
>> the end-points.
>> 
>> I'm afraid that I can't make the point more clearly than that.
>> 
>> I hope this helps,
>>  John
>>
>> 
>>> What I want:
>>> I am trying to make a barplot with adjusted means with SEs (error
>>> bars), with the y axis labeled on the response scale.
>>> 
>>> #One of my GLM models (inf.level & def.level=factors, initial.size =
>>> #covariate), used as an example.
>>> #I was not able to make a reproducible example though. Sorry.
>>> 
>>> model <-
>>> glm(tot.fruit~initial.size+inf.level+def.level,family=quasipoisson)
>>> summary(model)
>>> Coefficients:
>>>                Estimate Std. Error t value Pr(>|t|)
>>> (Intercept)    1.9368528  0.1057948  18.308  < 2e-16 ***
>>> initial.size   0.0015245  0.0001134  13.443  < 2e-16 ***
>>> inf.level50   -0.3142688  0.0908063  -3.461 0.000612 ***
>>> def.level12.5 -0.2329221  0.1236992  -1.883 0.060620 .
>>> def.level25   -0.1722354  0.1181993  -1.457 0.146062
>>> def.level50   -0.3543826  0.1212906  -2.922 0.003731 **
>>> 
>>> (Dispersion parameter for quasipoisson family taken to be 6.431139)
>>>     Null deviance: 2951.5  on 322  degrees of freedom
>>> Residual deviance: 1917.2  on 317  degrees of freedom
>>> 
>>> library(effects)
>>> def <- effect("def.level",model,se=TRUE)
>>> summary(def)
>>> $effect
>>> def.level
>>>         0      12.5        25        50
>>> 11.145382  8.829541  9.381970  7.819672
>>> $lower
>>> def.level
>>> 0 12.5   25   50
>>> 9.495220 7.334297 7.867209 6.467627
>>> $upper
>>> def.level
>>> 0 12.5   25   50
>>> 13.08232 10.62962 11.18838  9.45436

Re: [R] Weird SEs with effect()

2008-02-18 Thread Gustaf Granath
Dear John & Brian,

Ok. Now I start to understand. So basically I cannot use the given SEs
for my purposes. They only make sense on the scale of log-counts.
However, in a paper, the log-count scale is not very informative (the
reader wants to see the effect on the scale you measure). If I understand
you right, the confidence intervals are fine though, and maybe I will use
them to illustrate the reliability of the estimate. The problem is that
showing SEs of the adjusted means has become sort of the standard way to
illustrate things in my field (too many SAS users?) and I might run
into trouble with the reviewers. I have several data sets with similar
data and my plan was to use effect(). That is why I want to figure this
out. I hope I haven't been too annoying ;).

Finally, is there a way to get correct SEs on the count scale (with 
adjusted means)??
I guess not, judging by your answers.

Thanks again for your help,

Gustaf


John Fox wrote:
> Dear Gustaf,
>
>   
>> -Original Message-
>> From: Gustaf Granath [mailto:[EMAIL PROTECTED]
>> Sent: February-17-08 4:18 PM
>> To: John Fox
>> Cc: 'Prof Brian Ripley'; r-help@r-project.org
>> Subject: RE: [R] Weird SEs with effect()
>>
>> Dear John and Brian,
>> Thank you for your help. I get the feeling that it is something
>> fundamental that I do not understand here. Furthermore, a day of
>> reading did not really help so maybe we have reached a dead end here.
>> Nevertheless, here comes one last try.
>>
>> I thought that the values produced by effect() were logs (e.g. in
>> $fit). And then they were transformed (antilogged) with summary(). Was
>> I wrong?
>> 
>
> I'm sorry that you're continuing to have problems with this.
>
> Yes, there is a point that you don't understand: The SEs are on the scale of
> the log-counts, but you can't get correct SEs on the scale of the counts by
> exponentiating the SEs on the scale of the log-counts. What summary(), etc.,
> do (and you can do) to produce confidence intervals on the count scale is
> first to compute the intervals on the log-count scale and then to transform
> the end-points.
>
> I'm afraid that I can't make the point more clearly than that.
>
> I hope this helps,
>  John
>
>   
>> What I want:
>> I am trying to make a barplot with adjusted means with SEs (error
>> bars), with the y axis labeled on the response scale.
>>
>> #One of my GLM models (inf.level & def.level=factors, initial.size =
>> #covariate), used as an example.
>> #I was not able to make a reproducible example though. Sorry.
>>
>> model <-
>> glm(tot.fruit~initial.size+inf.level+def.level,family=quasipoisson)
>> summary(model)
>> Coefficients:
>>                Estimate Std. Error t value Pr(>|t|)
>> (Intercept)    1.9368528  0.1057948  18.308  < 2e-16 ***
>> initial.size   0.0015245  0.0001134  13.443  < 2e-16 ***
>> inf.level50   -0.3142688  0.0908063  -3.461 0.000612 ***
>> def.level12.5 -0.2329221  0.1236992  -1.883 0.060620 .
>> def.level25   -0.1722354  0.1181993  -1.457 0.146062
>> def.level50   -0.3543826  0.1212906  -2.922 0.003731 **
>>
>> (Dispersion parameter for quasipoisson family taken to be 6.431139)
>>     Null deviance: 2951.5  on 322  degrees of freedom
>> Residual deviance: 1917.2  on 317  degrees of freedom
>>
>> library(effects)
>> def <- effect("def.level",model,se=TRUE)
>> summary(def)
>> $effect
>> def.level
>>         0      12.5        25        50
>> 11.145382  8.829541  9.381970  7.819672
>> $lower
>> def.level
>> 0 12.5   25   50
>> 9.495220 7.334297 7.867209 6.467627
>> $upper
>> def.level
>> 0 12.5   25   50
>> 13.08232 10.62962 11.18838  9.45436
>> #Confidence intervals make sense and are in line with the glm model
>> #result. Now let's look at the standard errors. Btw, why aren't they
>> #given with summary?
>> def$se
>>        324        325        326        327
>> 0.08144281 0.09430438 0.08949864 0.09648573
>> # As you can see, the SEs are very very very small.
>> #In a graph it would look weird in combination with the glm result.
>> #I thought that these values were logs. That's why I used exp(), which
>> seems to be wrong.
>>
>> Regards,
>>
>> Gustaf
>>
>>
>> 
>>> Quoting John Fox <[EMAIL PROTECTED]>:
>>> Dear Brian and Gustaf,
>>>
>>> I too have a bit of trouble following what Gustaf is doing, but I
>>>   

Re: [R] Weird SEs with effect()

2008-02-17 Thread John Fox
Dear Gustaf,

> -Original Message-
> From: Gustaf Granath [mailto:[EMAIL PROTECTED]
> Sent: February-17-08 4:18 PM
> To: John Fox
> Cc: 'Prof Brian Ripley'; r-help@r-project.org
> Subject: RE: [R] Weird SEs with effect()
> 
> Dear John and Brian,
> Thank you for your help. I get the feeling that it is something
> fundamental that I do not understand here. Furthermore, a day of
> reading did not really help so maybe we have reached a dead end here.
> Nevertheless, here comes one last try.
> 
> I thought that the values produced by effect() were logs (e.g. in
> $fit). And then they were transformed (antilogged) with summary(). Was
> I wrong?

I'm sorry that you're continuing to have problems with this.

Yes, there is a point that you don't understand: The SEs are on the scale of
the log-counts, but you can't get correct SEs on the scale of the counts by
exponentiating the SEs on the scale of the log-counts. What summary(), etc.,
do (and you can do) to produce confidence intervals on the count scale is
first to compute the intervals on the log-count scale and then to transform
the end-points.

I'm afraid that I can't make the point more clearly than that.

I hope this helps,
 John
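With the quasipoisson fit quoted below, that recipe amounts to the following
sketch (eff is just a placeholder name; it assumes the effect object stores
$fit and $se on the linear-predictor scale, and it uses an approximate normal
95% multiplier rather than whatever quantile summary() applies internally):

library(effects)
eff <- effect("def.level", model, se = TRUE)   # 'model' as fitted below
z <- 1.96                                      # approximate 95% normal quantile
cbind(mean  = exp(eff$fit),                    # back-transform the estimates
      lower = exp(eff$fit - z * eff$se),       # ... and the interval end-points,
      upper = exp(eff$fit + z * eff$se))       # ... but never exp(eff$se) itself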

> 
> What I want:
> I am trying to make a barplot with adjusted means with SEs (error
> bars), with the y axis labeled on the response scale.
> 
> #One of my GLM models (inf.level & def.level=factors, initial.size =
> #covariate), used as an example.
> #I was not able to make a reproducible example though. Sorry.
> 
> model <-
> glm(tot.fruit~initial.size+inf.level+def.level,family=quasipoisson)
> summary(model)
> Coefficients:
>                Estimate Std. Error t value Pr(>|t|)
> (Intercept)    1.9368528  0.1057948  18.308  < 2e-16 ***
> initial.size   0.0015245  0.0001134  13.443  < 2e-16 ***
> inf.level50   -0.3142688  0.0908063  -3.461 0.000612 ***
> def.level12.5 -0.2329221  0.1236992  -1.883 0.060620 .
> def.level25   -0.1722354  0.1181993  -1.457 0.146062
> def.level50   -0.3543826  0.1212906  -2.922 0.003731 **
> 
> (Dispersion parameter for quasipoisson family taken to be 6.431139)
>     Null deviance: 2951.5  on 322  degrees of freedom
> Residual deviance: 1917.2  on 317  degrees of freedom
> 
> library(effects)
> def <- effect("def.level",model,se=TRUE)
> summary(def)
> $effect
> def.level
>         0      12.5        25        50
> 11.145382  8.829541  9.381970  7.819672
> $lower
> def.level
> 0 12.5   25   50
> 9.495220 7.334297 7.867209 6.467627
> $upper
> def.level
> 0 12.5   25   50
> 13.08232 10.62962 11.18838  9.45436
> #Confidence intervals make sense and are in line with the glm model
> #result. Now let's look at the standard errors. Btw, why aren't they
> #given with summary?
> def$se
>        324        325        326        327
> 0.08144281 0.09430438 0.08949864 0.09648573
> # As you can see, the SEs are very very very small.
> #In a graph it would look weird in combination with the glm result.
> #I thought that these values were logs. That's why I used exp(), which
> seems to be wrong.
> 
> Regards,
> 
> Gustaf
> 
> 
> > Quoting John Fox <[EMAIL PROTECTED]>:
> > Dear Brian and Gustaf,
> >
> > I too have a bit of trouble following what Gustaf is doing, but I think
> > that Brian's interpretation -- that Gustaf is trying to transform the
> > standard errors via the inverse link rather than transforming the ends of
> > the confidence intervals -- is probably correct. If this is the case, then
> > what Gustaf has done doesn't make sense.
> >
> > It is possible to get standard errors on the scale of the response (using,
> > e.g., the delta method), but it's probably better to work on the scale of
> > the linear predictor anyway. This is what the summary, print, and plot
> > methods in the effects package do (as is documented in the help files for
> > the package -- see the transformation argument under ?effect and the type
> > argument under ?summary.eff).
> >
> > Regards,
> >  John
> >
> > 
> > John Fox, Professor
> > Department of Sociology
> > McMaster University
> > Hamilton, Ontario, Canada L8S 4M4
> > 905-525-9140x23604
> > http://socserv.mcmaster.ca/jfox
> >
> >
> >> -Original Message-
> >> From: Prof Brian Ripley [mailto:[EMAIL PROTECTED]
> >> Sent: February-17-08 6:42 AM
> >> To: Gustaf Granath
> >> Cc: John Fox; r-help@r-project.org
> >> Subject: Re: [R] Weird SEs with effect()

Re: [R] Weird SEs with effect()

2008-02-17 Thread Gustaf Granath
Dear John and Brian,
Thank you for your help. I get the feeling that it is something  
fundamental that I do not understand here. Furthermore, a day of  
reading did not really help so maybe we have reached a dead end here.  
Nevertheless, here comes one last try.

I thought that the values produced by effect() were logs (e.g. in  
$fit). And then they were transformed (antilogged) with summary(). Was  
I wrong?

What I want:
I am trying to make a barplot with adjusted means with SEs (error  
bars), with the y axis labeled on the response scale.

#One of my GLM models (inf.level & def.level=factors, initial.size =
#covariate), used as an example.
#I was not able to make a reproducible example though. Sorry.

model <- glm(tot.fruit~initial.size+inf.level+def.level,family=quasipoisson)
summary(model)
Coefficients:
              Estimate Std. Error t value Pr(>|t|)
(Intercept)    1.9368528  0.1057948  18.308  < 2e-16 ***
initial.size   0.0015245  0.0001134  13.443  < 2e-16 ***
inf.level50   -0.3142688  0.0908063  -3.461 0.000612 ***
def.level12.5 -0.2329221  0.1236992  -1.883 0.060620 .
def.level25   -0.1722354  0.1181993  -1.457 0.146062
def.level50   -0.3543826  0.1212906  -2.922 0.003731 **

(Dispersion parameter for quasipoisson family taken to be 6.431139)
    Null deviance: 2951.5  on 322  degrees of freedom
Residual deviance: 1917.2  on 317  degrees of freedom

library(effects)
def <- effect("def.level",model,se=TRUE)
summary(def)
$effect
def.level
        0      12.5        25        50
11.145382  8.829541  9.381970  7.819672
$lower
def.level
0 12.5   25   50
9.495220 7.334297 7.867209 6.467627
$upper
def.level
0 12.5   25   50
13.08232 10.62962 11.18838  9.45436
#Confidence intervals make sense and are in line with the glm model
#result. Now let's look at the standard errors. Btw, why aren't they
#given with summary?
def$se
       324        325        326        327
0.08144281 0.09430438 0.08949864 0.09648573
# As you can see, the SEs are very very very small.
#In a graph it would look weird in combination with the glm result.
#I thought that these values were logs. That's why I used exp(), which
seems to be wrong.
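A sketch of the barplot described above, drawing back-transformed interval
end-points instead of exponentiated SEs (it assumes def$fit and def$se are on
the log-count scale; the 1.96 gives an approximate 95% interval, and the axis
labels are only illustrative):

fit <- as.vector(def$fit)                  # effects on the log-count scale
se  <- as.vector(def$se)
means <- exp(fit)                          # adjusted means on the count scale
lo <- exp(fit - 1.96 * se)                 # back-transformed lower limits
hi <- exp(fit + 1.96 * se)                 # back-transformed upper limits
bp <- barplot(means, names.arg = c("0", "12.5", "25", "50"),
              ylim = c(0, max(hi) * 1.1),
              xlab = "def.level", ylab = "tot.fruit (adjusted mean)")
arrows(bp, lo, bp, hi, angle = 90, code = 3, length = 0.05)   # error bars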

Regards,

Gustaf


> Quoting John Fox <[EMAIL PROTECTED]>:
> Dear Brian and Gustaf,
>
> I too have a bit of trouble following what Gustaf is doing, but I think that
> Brian's interpretation -- that Gustaf is trying to transform the standard
> errors via the inverse link rather than transforming the ends of the
> confidence intervals -- is probably correct. If this is the case, then what
> Gustaf has done doesn't make sense.
>
> It is possible to get standard errors on the scale of the response (using,
> e.g., the delta method), but it's probably better to work on the scale of
> the linear predictor anyway. This is what the summary, print, and plot
> methods in the effects package do (as is documented in the help files for
> the package -- see the transformation argument under ?effect and the type
> argument under ?summary.eff).
>
> Regards,
>  John
>
> 
> John Fox, Professor
> Department of Sociology
> McMaster University
> Hamilton, Ontario, Canada L8S 4M4
> 905-525-9140x23604
> http://socserv.mcmaster.ca/jfox
>
>
>> -Original Message-
>> From: Prof Brian Ripley [mailto:[EMAIL PROTECTED]
>> Sent: February-17-08 6:42 AM
>> To: Gustaf Granath
>> Cc: John Fox; r-help@r-project.org
>> Subject: Re: [R] Weird SEs with effect()
>>
>> On Sun, 17 Feb 2008, Gustaf Granath wrote:
>>
>> > Hi John,
>> >
>> > In fact I am still a little bit confused because I had read the
>> > ?effect help and the archives.
>> >
>> > ?effect says that the confidence intervals are on the linear predictor
>> > scale as well. Using exp() on the untransformed confidence intervals
>> > gives me the same values as summary(eff). My confidence intervals
>> > seem to be correct and reflect the results from my glm models.
>> >
>> > But when I use exp() to get the correct SEs on the response scale I
>> > get SEs that sometimes do not make sense at all. Interestingly I have
>>
>> What exactly are you doing here?  I suspect you are not using the correct
>> formula to transform the SEs (you do not just exponentiate them), but
>> without the reproducible example asked for we cannot tell.
>>
>> > found a trend. For my model with adjusted means ~ 0.5-1.5 I get huge
>> > SEs (SEs > 1, but my glm model shows significant differences between
>> > level 1 = 0.55 and level 2 = 1.15). For models with means around 10-20 my
>> > SEs are fine with exp(). For models with means around 75-125 my SEs get
>> > way too small with exp().

Re: [R] Weird SEs with effect()

2008-02-17 Thread John Fox
Dear Brian and Gustaf,

I too have a bit of trouble following what Gustaf is doing, but I think that
Brian's interpretation -- that Gustaf is trying to transform the standard
errors via the inverse link rather than transforming the ends of the
confidence intervals -- is probably correct. If this is the case, then what
Gustaf has done doesn't make sense. 

It is possible to get standard errors on the scale of the response (using,
e.g., the delta method), but it's probably better to work on the scale of
the linear predictor anyway. This is what the summary, print, and plot
methods in the effects package do (as is documented in the help files for
the package -- see the transformation argument under ?effect and the type
argument under ?summary.eff).
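If response-scale SEs are wanted anyway, the delta method for a log link just
multiplies the linear-predictor SE by the fitted mean (the s*exp(mu) rule given
elsewhere in this thread); a sketch, assuming an effect object eff whose $fit
and $se are on the link scale:

mu.hat <- exp(as.vector(eff$fit))      # adjusted means on the count scale
se.hat <- mu.hat * as.vector(eff$se)   # approximate SEs on the count scale
cbind(mean = mu.hat, se = se.hat)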

Regards,
 John


John Fox, Professor
Department of Sociology
McMaster University
Hamilton, Ontario, Canada L8S 4M4
905-525-9140x23604
http://socserv.mcmaster.ca/jfox


> -Original Message-
> From: Prof Brian Ripley [mailto:[EMAIL PROTECTED]
> Sent: February-17-08 6:42 AM
> To: Gustaf Granath
> Cc: John Fox; r-help@r-project.org
> Subject: Re: [R] Weird SEs with effect()
> 
> On Sun, 17 Feb 2008, Gustaf Granath wrote:
> 
> > Hi John,
> >
> > In fact I am still a little bit confused because I had read the
> > ?effect help and the archives.
> >
> > ?effect says that the confidence intervals are on the linear predictor
> > scale as well. Using exp() on the untransformed confidence intervals
> > gives me the same values as summary(eff). My confidence intervals
> > seem to be correct and reflect the results from my glm models.
> >
> > But when I use exp() to get the correct SEs on the response scale I
> > get SEs that sometimes do not make sense at all. Interestingly I have
> 
> What exactly are you doing here?  I suspect you are not using the correct
> formula to transform the SEs (you do not just exponentiate them), but
> without the reproducible example asked for we cannot tell.
> 
> > found a trend. For my model with adjusted means ~ 0.5-1.5 I get huge
> > SEs (SEs > 1, but my glm model shows significant differences between
> > level 1 = 0.55 and level 2 = 1.15). For models with means around 10-20 my
> > SEs are fine with exp(). For models with means around 75-125 my SEs get
> > way too small with exp().
> >
> > Something is not right here (or maybe they are, but I do not
> > understand it), so I think my best option will be to use the confidence
> > intervals instead of SEs in my plot.
> 
> If you want confidence intervals, you are better off computing those on a
> reasonable scale and transforming them.  Or using a profile likelihood to
> compute them (which will be equivariant under monotone scale
> transformations).
> 
> > Regards,
> >
> > Gustaf
> >
> >
> >> Quoting John Fox <[EMAIL PROTECTED]>:
> >>
> >> Dear Gustaf,
> >>
> >> From ?effect, "se: a vector of standard errors for the effect, on
> the scale
> >> of the linear predictor." Does that help?
> >>
> >> Regards,
> >>  John
> >>
> >> 
> >> John Fox, Professor
> >> Department of Sociology
> >> McMaster University
> >> Hamilton, Ontario, Canada L8S 4M4
> >> 905-525-9140x23604
> >> http://socserv.mcmaster.ca/jfox
> >>
> >>
> >>> -Original Message-
> >>> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
> >>> project.org] On Behalf Of Gustaf Granath
> >>> Sent: February-16-08 11:43 AM
> >>> To: r-help@r-project.org
> >>> Subject: [R] Weird SEs with effect()
> >>>
> >>> Hi all,
> >>>
> >>> I'm a little bit confused concerning the effect() command, effects
> >>> package.
> >>> I have done several glm models with family=quasipoisson:
> >>>
> >>> model <-glm(Y~X+Q+Z,family=quasipoisson)
> >>>
> >>> and then used
> >>>
> >>> results.effects <-effect("X",model,se=TRUE)
> >>>
> >>> to get the "adjusted means". I am aware of the debate concerning
> >>> adjusted means, but you guys just have to trust me - it makes sense
> >>> for me.
> >>> Now I want standard errors for these means.
> >>>
> >>> results.effects$se
> >>>
> >>> gives me standard errors, but it is now that it starts to get confusing.
> >>> The given standard errors are very very very small

Re: [R] Weird SEs with effect()

2008-02-17 Thread Prof Brian Ripley
On Sun, 17 Feb 2008, Gustaf Granath wrote:

> Hi John,
>
> In fact I am still a little bit confused because I had read the
> ?effect help and the archives.
>
> ?effect says that the confidence intervals are on the linear predictor
> scale as well. Using exp() on the untransformed confidence intervals
> gives me the same values as summary(eff). My confidence intervals
> seem to be correct and reflect the results from my glm models.
>
> But when I use exp() to get the correct SEs on the response scale I
> get SEs that sometimes do not make sense at all. Interestingly I have

What exactly are you doing here?  I suspect you are not using the correct 
formula to transform the SEs (you do not just exponentiate them), but
without the reproducible example asked for we cannot tell.

> found a trend. For my model with adjusted means ~ 0.5-1.5 I get huge
> SEs (SEs > 1, but my glm model shows significant differences between
> level 1 = 0.55 and level 2 = 1.15). For models with means around 10-20 my
> SEs are fine with exp(). For models with means around 75-125 my SEs get
> way too small with exp().
>
> Something is not right here (or maybe they are, but I do not
> understand it), so I think my best option will be to use the confidence
> intervals instead of SEs in my plot.

If you want confidence intervals, you are better off computing those on a 
reasonable scale and transforming them.  Or using a profile likelihood to
compute them (which will be equivariant under monotone scale 
transformations).
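As a sketch of both suggestions for the quasipoisson example in this thread
(confint() on a glm profiles the likelihood, historically via MASS; the second
block is the plain compute-then-transform route for fitted values, with 1.96
as an approximate normal quantile):

library(MASS)                      # supplies the profile()/confint() glm methods
ci.link <- confint(model)          # profile-likelihood intervals, link scale
exp(ci.link)                       # end-points transformed to the count scale

p <- predict(model, type = "link", se.fit = TRUE)
exp(cbind(fit = p$fit,             # compute on the link scale ...
          lwr = p$fit - 1.96 * p$se.fit,
          upr = p$fit + 1.96 * p$se.fit))   # ... then transform the end-points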

> Regards,
>
> Gustaf
>
>
>> Quoting John Fox <[EMAIL PROTECTED]>:
>>
>> Dear Gustaf,
>>
>> From ?effect, "se: a vector of standard errors for the effect, on the scale
>> of the linear predictor." Does that help?
>>
>> Regards,
>>  John
>>
>> 
>> John Fox, Professor
>> Department of Sociology
>> McMaster University
>> Hamilton, Ontario, Canada L8S 4M4
>> 905-525-9140x23604
>> http://socserv.mcmaster.ca/jfox
>>
>>
>>> -Original Message-
>>> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
>>> project.org] On Behalf Of Gustaf Granath
>>> Sent: February-16-08 11:43 AM
>>> To: r-help@r-project.org
>>> Subject: [R] Weird SEs with effect()
>>>
>>> Hi all,
>>>
>>> I'm a little bit confused concerning the effect() command, effects
>>> package.
>>> I have done several glm models with family=quasipoisson:
>>>
>>> model <-glm(Y~X+Q+Z,family=quasipoisson)
>>>
>>> and then used
>>>
>>> results.effects <-effect("X",model,se=TRUE)
>>>
>>> to get the "adjusted means". I am aware of the debate concerning
>>> adjusted means, but you guys just have to trust me - it makes sense
>>> for me.
>>> Now I want standard errors for these means.
>>>
>>> results.effects$se
>>>
>>> gives me standard errors, but it is now that it starts to get confusing. The
>>> given standard errors are very very very small - not realistic. I
>>> thought that maybe these standard errors are not back-transformed, so I
>>> used exp(), and then the standard errors became realistic. However, for
>>> one of my glm models with quasipoisson the standard errors make kind
>>> of sense without using exp() and get way too big if I use exp(). To be
>>> honest, I get the feeling that I'm on the wrong track here.
>>>
>>> Basically, I want to know how SE is calculated in effect() (all I know
>>> is that the reported standard errors are for the fitted values) and if
>>> anyone knows what is going on here.
>>>
>>> Regards,
>>>
>>> Gustaf Granath
>>>
>>> __
>>> R-help@r-project.org mailing list
>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>> PLEASE do read the posting guide http://www.R-project.org/posting-
>>> guide.html
>>> and provide commented, minimal, self-contained, reproducible code.
>>
>>
>
> __
> R-help@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Weird SEs with effect()

2008-02-17 Thread Gustaf Granath
Hi John,

In fact I am still a little bit confused because I had read the  
?effect help and the archives.

?effect says that the confidence intervals are on the linear predictor  
scale as well. Using exp() on the untransformed confidence intervals  
gives me the same values as summary(eff). My confidence intervals  
seem to be correct and reflect the results from my glm models.

But when I use exp() to get the correct SEs on the response scale I  
get SEs that sometimes do not make sense at all. Interestingly I have  
found a trend. For my model with adjusted means ~ 0.5-1.5 I get huge  
SEs (SEs > 1, but my glm model shows significant differences between  
level 1 = 0.55 and level 2 = 1.15). For models with means around 10-20 my
SEs are fine with exp(). For models with means around 75-125 my SEs get
way too small with exp().

Something is not right here (or maybe they are, but I do not
understand it), so I think my best option will be to use the confidence
intervals instead of SEs in my plot.
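That trend is exactly what exponentiating an SE directly produces: exp(se) is
always close to 1 whatever the mean, while the delta-method SE on the count
scale grows with the mean (roughly the mean times the log-scale SE). A small
illustration with made-up numbers in the ranges mentioned above:

se.log <- 0.09               # a typical SE on the log scale
means  <- c(0.5, 15, 100)    # small, medium and large adjusted means
exp(se.log)                  # about 1.09 whatever the mean
means * se.log               # delta-method SEs: 0.045, 1.35, 9.0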

Regards,

Gustaf


> Quoting John Fox <[EMAIL PROTECTED]>:
>
> Dear Gustaf,
>
> From ?effect, "se: a vector of standard errors for the effect, on the scale
> of the linear predictor." Does that help?
>
> Regards,
>  John
>
> 
> John Fox, Professor
> Department of Sociology
> McMaster University
> Hamilton, Ontario, Canada L8S 4M4
> 905-525-9140x23604
> http://socserv.mcmaster.ca/jfox
>
>
>> -Original Message-
>> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
>> project.org] On Behalf Of Gustaf Granath
>> Sent: February-16-08 11:43 AM
>> To: r-help@r-project.org
>> Subject: [R] Weird SEs with effect()
>>
>> Hi all,
>>
>> I'm a little bit confused concerning the effect() command, effects
>> package.
>> I have done several glm models with family=quasipoisson:
>>
>> model <-glm(Y~X+Q+Z,family=quasipoisson)
>>
>> and then used
>>
>> results.effects <-effect("X",model,se=TRUE)
>>
>> to get the "adjusted means". I am aware of the debate concerning
>> adjusted means, but you guys just have to trust me - it makes sense
>> for me.
>> Now I want standard errors for these means.
>>
>> results.effects$se
>>
>> gives me standard errors, but it is now that it starts to get confusing. The
>> given standard errors are very very very small - not realistic. I
>> thought that maybe these standard errors are not back-transformed, so I
>> used exp(), and then the standard errors became realistic. However, for
>> one of my glm models with quasipoisson the standard errors make kind
>> of sense without using exp() and get way too big if I use exp(). To be
>> honest, I get the feeling that I'm on the wrong track here.
>>
>> Basically, I want to know how SE is calculated in effect() (all I know
>> is that the reported standard errors are for the fitted values) and if
>> anyone knows what is going on here.
>>
>> Regards,
>>
>> Gustaf Granath
>>
>> __
>> R-help@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide http://www.R-project.org/posting-
>> guide.html
>> and provide commented, minimal, self-contained, reproducible code.
>
>

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Weird SEs with effect()

2008-02-16 Thread John Fox
Dear Gustaf,

From ?effect, "se: a vector of standard errors for the effect, on the scale
of the linear predictor." Does that help?

Regards,
 John


John Fox, Professor
Department of Sociology
McMaster University
Hamilton, Ontario, Canada L8S 4M4
905-525-9140x23604
http://socserv.mcmaster.ca/jfox


> -Original Message-
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
> project.org] On Behalf Of Gustaf Granath
> Sent: February-16-08 11:43 AM
> To: r-help@r-project.org
> Subject: [R] Weird SEs with effect()
> 
> Hi all,
> 
> I'm a little bit confused concerning the effect() command, effects
> package.
> I have done several glm models with family=quasipoisson:
> 
> model <-glm(Y~X+Q+Z,family=quasipoisson)
> 
> and then used
> 
> results.effects <-effect("X",model,se=TRUE)
> 
> to get the "adjusted means". I am aware of the debate concerning
> adjusted means, but you guys just have to trust me - it makes sense
> for me.
> Now I want standard errors for these means.
> 
> results.effects$se
> 
> gives me standard errors, but it is now that it starts to get confusing. The
> given standard errors are very very very small - not realistic. I
> thought that maybe these standard errors are not back-transformed, so I
> used exp(), and then the standard errors became realistic. However, for
> one of my glm models with quasipoisson the standard errors make kind
> of sense without using exp() and get way too big if I use exp(). To be
> honest, I get the feeling that I'm on the wrong track here.
> 
> Basically, I want to know how SE is calculated in effect() (all I know
> is that the reported standard errors are for the fitted values) and if
> anyone knows what is going on here.
> 
> Regards,
> 
> Gustaf Granath
> 
> __
> R-help@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-
> guide.html
> and provide commented, minimal, self-contained, reproducible code.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Weird SEs with effect()

2008-02-16 Thread Gustaf Granath
Hi all,

I'm a little bit confused concerning the effect() command, effects package.
I have done several glm models with family=quasipoisson:

model <-glm(Y~X+Q+Z,family=quasipoisson)

and then used

results.effects <-effect("X",model,se=TRUE)

to get the "adjusted means". I am aware of the debate concerning
adjusted means, but you guys just have to trust me - it makes sense
for me.
Now I want standard errors for these means.

results.effects$se

gives me standard errors, but it is now that it starts to get confusing. The
given standard errors are very very very small - not realistic. I
thought that maybe these standard errors are not back-transformed, so I
used exp(), and then the standard errors became realistic. However, for
one of my glm models with quasipoisson the standard errors make kind
of sense without using exp() and get way too big if I use exp(). To be
honest, I get the feeling that I'm on the wrong track here.

Basically, I want to know how SE is calculated in effect() (all I know  
is that the reported standard errors are for the fitted values) and if  
anyone knows what is going on here.
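The replies above boil down to this: the SE of a fitted value on the
linear-predictor scale is sqrt(x' V x) with V = vcov(model), and effect()
reports the analogous quantity for its averaged predictor combinations. A
sketch for one row of the data (not the package's internal code):

X0 <- model.matrix(model)[1, , drop = FALSE]   # one design-matrix row
V  <- vcov(model)                              # includes the quasipoisson dispersion
sqrt(X0 %*% V %*% t(X0))                       # SE of the fitted value, link scale
predict(model, se.fit = TRUE)$se.fit[1]        # same number from predict()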

Regards,

Gustaf Granath

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.