On Mon, Jun 8, 2009 at 8:55 AM, David Cournapeau <da...@ar.media.kyoto-u.ac.jp> wrote:
> I think it depends on what you are doing - EM is used for 'real' work
> too, after all :)

Certainly, but EM is really just a mediocre gradient descent/hill climbing algorithm that happens to be relatively easy to implement.

> Thanks for the link, I was not aware of this work. What is the
> difference between the ECG method and the method proposed by Lange in
> [1] ? To avoid 'local trapping' of the parameter in EM methods,
> recursive EM [2] may also be a promising method, also it seems to me
> that it has not been used so much, but I may well be wrong (I have seen
> several people using a simplified version of it without much theoretical
> consideration in speech processing).

I hung out in the machine learning community from roughly 1999 to 2007, and the Salakhutdinov work was extremely refreshing to see after listening to no end of papers applying EM to whatever was the hot topic at the time. :)  I've certainly seen and heard about various fixes to EM, but I haven't seen a convincing reason to prefer it over proper gradient descent/hill climbing algorithms (besides its presentability and ease of implementation).

Cheers,

Jason

--
Jason Rennie
Research Scientist, ITA Software
617-714-2645
http://www.itasoftware.com/