Hi,

I am trying to fit lognormal mixture models with EM.
I was wondering how one computes the log-likelihood.

My current implementation is as follows;
it performs very poorly at learning the mixture parameters.

__BEGIN__
 # weighted component densities of the lognormal mixture:
 # returns an n x k matrix whose (i, j) entry is
 # lambda[j] * dlnorm(x[i], meanlog = meanl[j], sdlog = sdl[j]);
 # x is the data vector, taken from the calling environment
 dens <- function(lambda, theta, k) {
   meanl <- theta[1:k]              # component meanlog parameters
   sdl   <- theta[(k + 1):(2 * k)]  # component sdlog parameters

   temp <- NULL
   for (j in 1:k) {
     # each component is a lognormal density
     temp <- cbind(temp, dlnorm(x, meanlog = meanl[j], sdlog = sdl[j]))
   }

   # scale column j by its mixing proportion lambda[j]
   t(lambda * t(temp))
 }

 # observed-data log-likelihood: the log of the mixture density
 # (row sums of the weighted component densities), summed over x
 old.obs.ll <- sum(log(rowSums(dens(lambda, theta, k))))

 # negative expected complete-data log-likelihood (the M-step
 # objective to be minimised), with z the matrix of posterior
 # membership probabilities from the E step
 lognorm.ll <- function(theta, z, lambda, k)
   -sum(z * log(dens(lambda, theta, k)))
__END__
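
For a self-contained example, here is a minimal sketch of the EM loop
I run around these two functions, on synthetic data. The starting
values and the optim() call (L-BFGS-B, with a lower bound to keep the
sdlog entries positive) are placeholders of my own choosing, not the
driver from the Gamma code; lognorm.ll is minimised in the M step,
while the observed-data log-likelihood is what I watch for
convergence (it should not decrease):

__BEGIN__
 # synthetic two-component lognormal sample
 set.seed(1)
 x <- c(rlnorm(200, meanlog = 0, sdlog = 0.5),
        rlnorm(300, meanlog = 2, sdlog = 0.3))
 k <- 2
 lambda <- rep(1 / k, k)    # mixing proportions
 theta  <- c(-1, 1, 1, 1)   # (meanlog_1, meanlog_2, sdlog_1, sdlog_2)

 for (iter in 1:50) {
   # E step: posterior membership probabilities
   d <- dens(lambda, theta, k)
   z <- d / rowSums(d)

   # M step: lambda has a closed form; theta is updated numerically
   lambda <- colMeans(z)
   theta  <- optim(theta, lognorm.ll, z = z, lambda = lambda, k = k,
                   method = "L-BFGS-B",
                   lower = c(-Inf, -Inf, 1e-6, 1e-6))$par

   # observed-data log-likelihood, to monitor convergence
   new.obs.ll <- sum(log(rowSums(dens(lambda, theta, k))))
   cat(iter, new.obs.ll, "\n")
 }
__END__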

It is based on a slight modification of our earlier
Gamma version, which works very well. The full code
of the Gamma version can be found here:

http://dpaste.com/65353/plain/

-G.V.
