[R] Loop for in R to generate several variables

2008-04-07 Thread arpino

Hi everybody,
I have to create several variables of this form:

Yind = L0 + L1*X1 + L2*X2 + L3*X3 + K*Cind + n

where ind varies in {1, ..., 10}.

I tried this for loop, but it does not work:

for (ind in 1:10) {
  # The name "Yind" is taken literally: "ind" is not substituted into it,
  # so every iteration overwrites the same variable.
  Yind = L0 + L1*X1 + L2*X2 + L3*X3 + K*Cind + n
}

Any suggestions?
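For what it is worth, a minimal base-R sketch of two working approaches. All inputs below (L0 to L3, K, the X and C columns, and the noise term) are made-up placeholders for illustration, not values from the original problem. Inside a loop the name Yind is taken literally, so one either builds the names with paste0() and assign(), or, more idiomatically, stores the ten variables in a list:

```r
set.seed(1)

# Hypothetical coefficients and data (placeholders, not from the original post)
L0 <- 1; L1 <- 0.5; L2 <- -0.2; L3 <- 0.1; K <- 2
nobs <- 5
X1 <- rnorm(nobs); X2 <- rnorm(nobs); X3 <- rnorm(nobs)
C <- matrix(rnorm(nobs * 10), nrow = nobs)  # column C[, ind] plays the role of Cind
noise <- rnorm(nobs)                        # the "n" term

# Idiomatic: keep the ten variables together in a list
Y <- vector("list", 10)
for (ind in 1:10) {
  Y[[ind]] <- L0 + L1*X1 + L2*X2 + L3*X3 + K*C[, ind] + noise
}

# Alternative: create Y1, ..., Y10 as separate variables with assign()
for (ind in 1:10) {
  assign(paste0("Y", ind), L0 + L1*X1 + L2*X2 + L3*X3 + K*C[, ind] + noise)
}

# Both routes produce the same numbers
stopifnot(identical(Y[[3]], Y3))
```

Keeping the variables in a list avoids cluttering the workspace and makes later operations (e.g. sapply() over all ten) much easier than juggling Y1 ... Y10 by name.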

Thank you.


-- 
View this message in context: 
http://www.nabble.com/Loop-for-in-R-to-generate-several-variables-tp16536683p16536683.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] How to predict probabilities after using lmer

2008-11-25 Thread arpino

Dear R-users,
I'm using lmer to fit two-level logistic models, and I'm interested in the
predicted probabilities, which I obtain as follows (using fitted()):

glm1 <- lmer(XY$T1 ~ X1 + X2 + X3 + (1 | Cind), family = binomial)  # estimate a two-level logit model

fit1 <- fitted(glm1)  # I get the fitted linear predictor

ilog <- function(x) 1 / (1 + exp(-x))  # inverse logit

ps1 <- ilog(fit1)  # in order to get the estimated probabilities
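One small aside, independent of the main question: base R already provides the inverse logit as plogis(), so the hand-rolled ilog() is redundant. It is also worth checking on the actual fit whether fitted() returns the linear predictor at all: in current lme4 a binomial model is fitted with glmer(), and fitted() may already return values on the response (probability) scale, in which case applying the inverse logit a second time would be wrong; comparing predict(model, type = "link") with type = "response" on one's own model settles it. A quick base-R check of the helper itself:

```r
# Hand-rolled inverse logit, as in the post
ilog <- function(x) 1 / (1 + exp(-x))

x <- seq(-5, 5, by = 0.5)

# plogis() is the logistic CDF, i.e. the inverse logit, in base R (stats package)
stopifnot(isTRUE(all.equal(ilog(x), plogis(x))))
```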


Is this procedure correct? Am I right that this gives the "conditional
probabilities"? Is there a function I can use to obtain the "empirical Bayes
(EB) probabilities" instead? Any suggestions?
More generally, can you suggest a paper, textbook, or notes clarifying when
one kind of probability is more suitable than the other?

Here are the formulas for what I have labelled the conditional and the EB probability:

The model is: logit(P(Y = 1)) = a + b*X + u

conditional: P(Y = 1 | u = u^) = 1 / (1 + exp(-(a^ + b^*X + u^)))

EB: ∫ [1 / (1 + exp(-(a^ + b^*X + u)))] * p(u | Y, X) du

(u is the random effect; ^ denotes an estimate; p(u | Y, X) is the posterior density of u)
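As a sketch only: if the posterior of u is approximated as Gaussian (an assumption made here for illustration, not something the model guarantees), the EB integral above can be evaluated numerically in base R. All numbers below (a^, b^, X, and the posterior mean and SD of u) are placeholders, not output from a real fit:

```r
# Hypothetical estimates (placeholders for illustration)
a_hat <- -0.5; b_hat <- 0.8; x <- 1.2
u_mean <- 0.3; u_sd <- 0.6   # assumed Gaussian posterior of u given (Y, X)

inv_logit <- function(z) 1 / (1 + exp(-z))

# Conditional probability: plug the posterior mean of u into the linear predictor
p_cond <- inv_logit(a_hat + b_hat * x + u_mean)

# EB probability: average the conditional probability over the posterior of u
integrand <- function(u) inv_logit(a_hat + b_hat * x + u) * dnorm(u, u_mean, u_sd)
p_eb <- integrate(integrand, lower = -Inf, upper = Inf)$value

# The two generally differ, because the inverse logit is nonlinear
c(conditional = p_cond, EB = p_eb)
```

The gap between p_cond and p_eb grows with the posterior SD of u, which is one practical way to see why the two kinds of probability need to be distinguished at all.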

Many thanks
-- 
View this message in context: 
http://www.nabble.com/How-to-predict-probabilities-after-using-lmer-tp20678825p20678825.html
Sent from the R help mailing list archive at Nabble.com.



[R] predicted probabilities after lmer

2008-12-02 Thread arpino

-- 
View this message in context: 
http://www.nabble.com/predicted-probabilities-after-lmer-tp20796391p20796391.html
Sent from the R help mailing list archive at Nabble.com.
