On 10/18/2012 09:10 PM, Martin Fergie wrote:
> Hi Didier,
>
> I have found that the normal GMM learnt using EM is fine:
>
> http://scikit-learn.org/stable/modules/mixture.html#gmm-classifier
>
> It's the variational and Dirichlet versions that are incorrect!
>
Yes, I think this is the case.
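For what it's worth, the plain EM fit does behave sensibly on Aron's toy data. A minimal check (written against the current API, where the EM-based estimator is GaussianMixture rather than the 0.12-era mixture.GMM):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Aron's toy data: two well-separated 1-D clusters around 1.0 and 6.0
data = np.array([[1.1], [0.9], [1.0], [1.2], [1.0], [6.0], [6.1], [6.1]])

# Ordinary EM fit with the true number of components
gmm = GaussianMixture(n_components=2, random_state=0).fit(data)
labels = gmm.predict(data)

# Means land near the two cluster centres (about 1.04 and 6.07)
print(sorted(gmm.means_.ravel()))
print(labels)
```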
Joseph,
My understanding is that the n_components in DPGMM is an upper-bound. I've
gotten similarly bad results for n_components=2.
-Aron
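For reference, that upper-bound behaviour is what the Dirichlet-process machinery is supposed to deliver: n_components is only a truncation level, and the surplus components should end up with negligible weight. A sketch of the intended behaviour using the present-day BayesianGaussianMixture (which later replaced DPGMM; this illustrates the idea, not the 0.12 code path):

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

data = np.array([[1.1], [0.9], [1.0], [1.2], [1.0], [6.0], [6.1], [6.1]])

# n_components=5 is a truncation level for the Dirichlet process prior;
# the variational fit should concentrate mass on a few components and
# drive the surplus ones toward near-zero weight
bgm = BayesianGaussianMixture(
    n_components=5,
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500,
    random_state=0,
).fit(data)

labels = bgm.predict(data)
print(np.round(bgm.weights_, 3))  # surplus components carry very little weight
print(labels)                     # the two clusters get different component labels
```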
Date: Thursday, October 18, 2012 1:32 PM
Subject: Re: [Scikit-learn-general] sklearn.mixture.DPGMM: Unexpected
results
On Thu, Oct 18, 2012 at 1:57 PM, Aron Culotta wrote:
> The results I get from DPGMM are not what I expect. E.g.:
>
> >>> import sklearn.mixture
> >>> sklearn.__version__
> '0.12-git'
> >>> data = [[1.1],[0.9],[1.0],[1.2],[1.0], [6.0],[6.1],[6.1]]
> >>> m = sklearn.mixture.DPGMM(n_components=5, n_iter=1000, alpha=1)
To: scikit-learn-general@lists.sourceforge.net
Subject: Re: [Scikit-learn-general] sklearn.mixture.DPGMM: Unexpected results
Andy,
Thanks for your quick reply. Yes, throwing some type of warning would
probably be good while the code is being revamped.
Best,
Aron
> On 10/18/2012 06:57 PM, Andreas Mueller wrote:
Hi Aron.
I think this might be an instance of this bug:
https://github.com/scikit-learn/scikit-learn/issues/393
Unfortunately this part of the scikit is in a very bad state.
Sorry for making you wonder.
I have been thinking about putting in a user warning earlier today.
What do others think?
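If a warning goes in, it needn't be fancy; something along these lines would do (a hypothetical sketch of the idea, not actual scikit-learn code, and the helper name is made up):

```python
import warnings

def _warn_dpgmm_unreliable():
    # Hypothetical helper: flag the known-broken variational/DP code path
    warnings.warn(
        "DPGMM/VBGMM results may be unreliable (see scikit-learn issue #393); "
        "interpret the fitted components with caution.",
        UserWarning,
        stacklevel=2,
    )

# Demonstrate that the warning fires
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    _warn_dpgmm_unreliable()

print(caught[0].category.__name__)  # prints: UserWarning
```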
The results I get from DPGMM are not what I expect. E.g.:
>>> import sklearn.mixture
>>> sklearn.__version__
'0.12-git'
>>> data = [[1.1],[0.9],[1.0],[1.2],[1.0], [6.0],[6.1],[6.1]]
>>> m = sklearn.mixture.DPGMM(n_components=5, n_iter=1000, alpha=1)
>>> m.fit(data)
DPGMM(alpha=1, covariance_type='diag', ...)