Hi all,

Trying to understand the logistic regression performed by glm (i.e. when 
family='binomial'), and I'm curious to know how it treats perfect success 
and perfect failure. That is, let's say I have the following summary data

        x=c(1,2,3,4,5,6)
        y=c(0,.04,.26,.76,.94,1)
        w=c(100,100,100,100,100,100)

where y is the observed proportion of successes at each value of x, 
calculated across w observations. When I use glm

        my.glm.obj=glm(y~x,family='binomial',weights=w)

the regression comes out fine, but if I try what I understand to be the 
equivalent lm procedure (i.e. fitting a straight line to the 
logit-transformed y values):

        my.lm.obj=lm(qlogis(y)~x,weights=w)

I get an error because, of course, logit(p) = log(p/(1-p)), so 
logit(1) = log(1/0) = Inf and, similarly, logit(0) = log(0/1) = -Inf.
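
(The only workaround I've found so far is the "empirical logit": offset 
the success and failure counts by 0.5 before transforming, so the 
extremes stay finite. The 0.5 offset is just a correction I've seen 
suggested; I'm not sure it's the standard choice.)

        succ = y * w                                # success counts at each x
        fail = (1 - y) * w                          # failure counts at each x
        emp.logit = log((succ + 0.5)/(fail + 0.5))  # finite even at y=0 and y=1
        my.emplm.obj = lm(emp.logit ~ x, weights = w)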

I'd be very interested to see how glm deals with these extremes.
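
My guess (happy to be corrected) is that glm never logit-transforms y 
itself: it maximizes the binomial likelihood of the raw proportions by 
iteratively reweighted least squares, so the logit only ever gets 
applied to the fitted probabilities, which stay strictly between 0 and 
1. A quick check along those lines:

        fitted(my.glm.obj)            # all strictly inside (0,1)
        qlogis(fitted(my.glm.obj))    # hence finite on the logit scale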

Cheers,

Mike


-- 
Mike Lawrence
http://artsweb.uwaterloo.ca/~m4lawren

"The road to wisdom? Well, it's plain and simple to express:
Err and err and err again, but less and less and less."
- Piet Hein
