I tried thinning the MCMC run of 500,000 iterations. It looks like a
thinning interval of 100 or 200 is enough to remove the autocorrelation of a1 and tau.
Is that too much thinning?
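For reference, a self-contained sketch of how the coda package can check a thinning interval. A toy AR(1) series stands in for the real a1 draws here (the simulated chain is an assumption; `effectiveSize`, `window`, and `autocorr.diag` are real coda/stats functions):

```r
# Sketch: judging a thinning interval with coda, using a simulated
# highly autocorrelated chain in place of the real a1 samples.
library(coda)
set.seed(1)
x <- as.numeric(arima.sim(list(ar = 0.99), n = 100000))  # toy AR(1) draws
chain <- mcmc(x)

effectiveSize(chain)               # effective sample size of the full chain
thinned <- window(chain, thin = 200)
autocorr.diag(thinned)             # lag correlations should now be near zero
```

A possibly more useful summary than "how much thinning is too much" is the effective sample size of the unthinned chain: thinning mainly saves storage, and inference can be based on the full run as long as the effective size is adequate.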

--- On Tue, 3/16/10, ping chen <chen1984...@yahoo.com.cn> wrote:

> From: ping chen <chen1984...@yahoo.com.cn>
> Subject: an ordinal regression MCMC run high correlation
> To: r-help@r-project.org
> Cc: chen1984...@yahoo.com.cn
> Date: Tuesday, March 16, 2010, 12:19 PM
> I am trying to model a clustered ordinal response (either 1, 2
> or 3) called topbox; the corresponding physician of each patient
> is also in the data.
>  
> Since it is ordinal, I used the ordinal logit model
>  
> topbox[i] ~ discrete with probabilities p[j,1], p[j,2], p[j,3],
> where j is the physician of the ith patient.
>  
> c[j] is the physician effect, and a1 and a1 + theta are the
> common cutpoints for all physicians.
> 
> I generated 10,000 iterations and there is still high
> autocorrelation in a1 and tau. I thought 10,000 was a pretty
> large number, so the chain seems to converge really slowly. I am
> a new MCMC user and don't know other ways to solve this problem.
> Could someone please give suggestions that apply to this
> specific model?
> 
> model {
>   for (i in 1:N) {
>     response[i] ~ dcat(p[physician[i], ])   # ordinal response, 3 categories
>   }
>
>   for (j in 1:Nt) {
>     p[j,1] <- 1 - Q[j,1]
>     p[j,2] <- Q[j,1] - Q[j,2]
>     p[j,3] <- Q[j,2]
>     logit(Q[j,1]) <- -c[j]                  # first cutpoint a1, via c[j]
>     logit(Q[j,2]) <- -(c[j] + theta)       # second cutpoint a1 + theta
>     score[j] <- 0.5*p[j,2] + p[j,3]
>     c[j] ~ dnorm(a1, tau)                  # physician effect
>   }
>
>   a1 ~ dnorm(0, 1.0E-06)
>   theta ~ dnorm(0, 1.0E-06)I(0,)           # truncated to be positive
>   tau ~ dgamma(0.001, 0.001)
> }
> 
> list(N=667,Nt=50)
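> For completeness, a hedged sketch of how the model above could be run
> with thinning from R via rjags. The file name "ordinal.bug" and the
> data vectors `response` and `physician` are assumptions (not given in
> full here); `jags.model`, `update`, and `coda.samples` are the real
> rjags functions:
>
> ```r
> # Sketch (assumes the BUGS code is saved as "ordinal.bug" and that
> # response/physician vectors of length N exist in the workspace).
> library(rjags)
> dat <- list(N = 667, Nt = 50,
>             response = response, physician = physician)
> jm <- jags.model("ordinal.bug", data = dat, n.chains = 2)
> update(jm, 5000)                            # burn-in
> samples <- coda.samples(jm, c("a1", "theta", "tau"),
>                         n.iter = 500000, thin = 200)
> ```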
> 
>  
> Thanks, Ping
> 

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
