I am not sure that K-SVD is any better, but one of the other people in my
group proposed using K-SVD, so I will likely have to implement it. It will
be worth a comparison - I will try and stick to the sklearn API.
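To make the comparison concrete, this is roughly the estimator skeleton I have
in mind. It is only a sketch: the class name (KSVDCoder) and the defaults are
made up, and the sparse coding step just reuses sklearn's OMP solver.

import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.linear_model import orthogonal_mp
from sklearn.utils import check_random_state


class KSVDCoder(BaseEstimator, TransformerMixin):
    """Sketch of a K-SVD dictionary learner following the fit/transform API."""

    def __init__(self, n_atoms=100, n_nonzero_coefs=5, n_iter=10,
                 random_state=None):
        self.n_atoms = n_atoms
        self.n_nonzero_coefs = n_nonzero_coefs
        self.n_iter = n_iter
        self.random_state = random_state

    def fit(self, X, y=None):
        X = np.asarray(X, dtype=float)
        rng = check_random_state(self.random_state)
        n_samples, n_features = X.shape
        # initialize the dictionary with random signals, unit-norm atoms
        # (assumes n_samples >= n_atoms)
        D = X[rng.permutation(n_samples)[:self.n_atoms]].T.copy()
        D /= np.linalg.norm(D, axis=0) + 1e-12
        for _ in range(self.n_iter):
            # sparse coding step: gamma has shape (n_atoms, n_samples)
            gamma = orthogonal_mp(D, X.T, n_nonzero_coefs=self.n_nonzero_coefs)
            # dictionary update: one atom at a time via a rank-1 SVD
            for k in range(self.n_atoms):
                used = np.flatnonzero(gamma[k])
                if used.size == 0:
                    # unused atom: re-seed it with a random signal
                    D[:, k] = X[rng.randint(n_samples)]
                    D[:, k] /= np.linalg.norm(D[:, k]) + 1e-12
                    continue
                gamma[k, used] = 0.0
                # residual of the signals that use atom k, without atom k
                E = X[used].T - np.dot(D, gamma[:, used])
                U, s, Vt = np.linalg.svd(E, full_matrices=False)
                D[:, k] = U[:, 0]
                gamma[k, used] = s[0] * Vt[0]
        self.components_ = D.T
        return self

    def transform(self, X):
        X = np.asarray(X, dtype=float)
        code = orthogonal_mp(self.components_.T, X.T,
                             n_nonzero_coefs=self.n_nonzero_coefs)
        return code.T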
On Dec 13, 2013 2:02 PM, "Vlad Niculae" wrote:
> We did not implement K-SVD because we did not find any motivation for
> having two competing dictionary learning implementations [...]
We did not implement K-SVD because we did not find any motivation for
having two competing dictionary learning implementations, so we stuck
with the Julien Mairal et al. solver. Do you think that K-SVD would do
better than it for this?
Vlad
On Fri, Dec 13, 2013 at 8:46 PM, Kyle Kastner wrote:
> I have two separate approaches I am considering for real-world testing. [...]
I have two separate approaches I am considering for real-world testing.
For the Kaggle cats and dogs competition, using a deep neural network trained
on ImageNet (DeCAF, http://arxiv.org/abs/1310.1531) for preprocessing, coupled
with any kind of classifier, has had excellent success for me so far (even
logistic regression).
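The second stage really is that simple. Assuming the DeCAF activations have
already been extracted and dumped to disk as numpy arrays (the file names
below are made up), it boils down to:

import numpy as np
from sklearn.linear_model import LogisticRegression

# hypothetical dumps of fc6 activations and labels for the two splits
X_train = np.load("decaf_fc6_train.npy")   # shape (n_images, 4096)
y_train = np.load("labels_train.npy")      # 0 = cat, 1 = dog
X_valid = np.load("decaf_fc6_valid.npy")
y_valid = np.load("labels_valid.npy")

clf = LogisticRegression(C=1.0)
clf.fit(X_train, y_train)
print("validation accuracy: %.3f" % clf.score(X_valid, y_valid))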
Great, thanks a lot!
I'm also curious about what you're running it on and about how the
performance is.
Vlad
On Fri, Dec 13, 2013 at 7:11 PM, Olivier Grisel wrote:
> Nice.
>
> Have you used it with success for real image classification tasks?
>
> I see you have been involved in the cats vs dogs Kaggle competition. [...]
Nice.
Have you used it with success for real image classification tasks?
I see you have been involved in the cats vs dogs Kaggle competition.
Is learning a linear model on top enough? If so we might consider including
such a KMeansCoder as part of the sklearn.feature_extraction.image module and
write an example.
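Something along these lines is what I am picturing, using only pieces that
already live in sklearn; the triangle encoding and the pooling are simplified,
and the random images and labels below are just stand-ins for real data:

import numpy as np
from sklearn.feature_extraction.image import extract_patches_2d
from sklearn.cluster import MiniBatchKMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
images = [rng.rand(32, 32) for _ in range(20)]   # stand-ins for grayscale images
labels = rng.randint(0, 2, size=20)              # stand-ins for class labels


def encode_image(image, kmeans, patch_size=(8, 8)):
    patches = extract_patches_2d(image, patch_size)
    patches = patches.reshape(len(patches), -1).astype(float)
    patches -= patches.mean(axis=1, keepdims=True)   # per-patch centering
    dists = kmeans.transform(patches)                # distances to the centroids
    # Coates & Ng style "triangle" activation, then average pooling
    acts = np.maximum(0, dists.mean(axis=1, keepdims=True) - dists)
    return acts.mean(axis=0)


# learn the dictionary of centroids on a subsample of patches
train_patches = np.vstack([
    extract_patches_2d(img, (8, 8), max_patches=50, random_state=0).reshape(-1, 64)
    for img in images]).astype(float)
train_patches -= train_patches.mean(axis=1, keepdims=True)
kmeans = MiniBatchKMeans(n_clusters=64, random_state=0).fit(train_patches)

# encode every image and learn a linear model on top
X = np.array([encode_image(img, kmeans) for img in images])
clf = LogisticRegression().fit(X, labels)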
As a part of some research on dictionary learning, I stumbled across @vene's
(and others') closed pull request from many moons ago for K-means dictionary
learning. I wanted to use it as a testing point, so I updated it for
sklearn 0.14 (mixins, some small tweaks to the reshaping of things in the
example, etc.).
Hi there,
I am working on Instance Reduction (noise removal) for sklearn as my first
contribution (hopefully), and I know a SMOTE implementation would be very
helpful.
Check out Garcia et al.'s Evolutionary-Based Instance Reduction for Imbalanced
Datasets. It concludes that SMOTE oversampling is very helpful for imbalanced
datasets.
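For reference, the core of SMOTE is tiny. A rough sketch (the function name
and defaults are mine, with no edge-case handling), built on sklearn's
NearestNeighbors:

import numpy as np
from sklearn.neighbors import NearestNeighbors


def smote(X_min, n_synthetic, k=5, random_state=0):
    """Generate n_synthetic samples by interpolating minority samples
    towards their k nearest minority neighbours (Chawla et al. 2002)."""
    rng = np.random.RandomState(random_state)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_min)
    # the first neighbour is the point itself, so drop column 0
    neighbors = nn.kneighbors(X_min, return_distance=False)[:, 1:]
    samples = []
    for _ in range(n_synthetic):
        i = rng.randint(len(X_min))          # pick a minority sample
        j = neighbors[i, rng.randint(k)]     # and one of its neighbours
        gap = rng.rand()                     # interpolation factor in [0, 1)
        samples.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.asarray(samples)


# usage sketch: oversample class 1 until it matches class 0
# X_new = smote(X[y == 1], n_synthetic=(y == 0).sum() - (y == 1).sum())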
2013/12/13 Eustache DIEMERT :
> Mmmm... if calibration seems to be a good fit for sklearn, I'll try to
> review the different existing approaches and see if it's difficult to
> implement the most useful/popular one(s).
>
> Any hint on that? Is isotonic regression the most used form, or should we
> have a look at other well-known techniques?
Do you have any pointers to the relevant literature? The areas where SMOTE
is deemed useful?
E/
2013/12/13 abhishek
> Hi all,
>
> Is there any PR available on SMOTE?
> I think it would be good to add SMOTE and ADASYN to scikit-learn and I
> would like to implement them. If there is any PR available, I would like to
> start from that. [...]
Mmmm... if calibration seems to be a good fit for sklearn, I'll try to
review the different existing approaches and see if it's difficult to
implement the most useful/popular one(s).
Any hint on that? Is isotonic regression the most used form, or should we
have a look at other well-known techniques?
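For the isotonic variant I was picturing something like the sketch below:
fit IsotonicRegression (already in sklearn) on held-out decision scores; the
classifier, data and split are arbitrary placeholders:

from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC
from sklearn.isotonic import IsotonicRegression

X, y = make_classification(n_samples=2000, random_state=0)
X_fit, y_fit = X[:1000], y[:1000]      # train the classifier here
X_cal, y_cal = X[1000:], y[1000:]      # fit the calibration map here

clf = LinearSVC(C=1.0).fit(X_fit, y_fit)
scores_cal = clf.decision_function(X_cal)

# monotone map from raw decision scores to probabilities
iso = IsotonicRegression()
iso.fit(scores_cal, y_cal)
proba = iso.predict(scores_cal)        # calibrated estimates of P(y=1|x)
# (for scores outside the calibration range, newer sklearn versions
# provide out_of_bounds="clip")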
Hi all,
Is there any PR available on SMOTE?
I think it would be good to add SMOTE and ADASYN to scikit-learn and I
would like to implement them. If there is any PR available, I would like to
start from that. Also, I want to get the opinion of the developers on
whether it would be a nice idea to include them.
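For ADASYN I would follow He et al. 2008: minority samples with more majority
neighbours get proportionally more synthetic points, each generated by the
same kind of interpolation as in SMOTE. A rough sketch, with names and
defaults of my own:

import numpy as np
from sklearn.neighbors import NearestNeighbors


def adasyn(X, y, minority_class=1, n_synthetic=100, k=5, random_state=0):
    rng = np.random.RandomState(random_state)
    X_min = X[y == minority_class]

    # difficulty of each minority sample = fraction of majority points
    # among its k nearest neighbours in the whole data set
    nn_all = NearestNeighbors(n_neighbors=k + 1).fit(X)
    idx = nn_all.kneighbors(X_min, return_distance=False)[:, 1:]
    difficulty = (y[idx] != minority_class).mean(axis=1)
    if difficulty.sum() == 0:
        difficulty = np.ones_like(difficulty)
    counts = rng.multinomial(n_synthetic, difficulty / difficulty.sum())

    # neighbours restricted to the minority class for the interpolation step
    n_min_neighbors = min(k, len(X_min) - 1)
    nn_min = NearestNeighbors(n_neighbors=n_min_neighbors + 1).fit(X_min)
    min_idx = nn_min.kneighbors(X_min, return_distance=False)[:, 1:]

    samples = []
    for i, c in enumerate(counts):
        for _ in range(c):
            j = min_idx[i, rng.randint(n_min_neighbors)]
            gap = rng.rand()
            samples.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.asarray(samples)


# usage sketch:
# X_syn = adasyn(X, y, minority_class=1, n_synthetic=200)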
> I don't know if Alex will have time to work on it in the near future.
unlikely... sadly...
any help very welcome.
Alex
As far as I know this PR has stalled a bit. I don't know if Alex will
have time to work on it in the near future.
--
Olivier
Hi List,
I need to test calibration for a given problem but didn't find anything in
sklearn.
I found a pending PR for isotonic calibration [1] but it seems inactive at
the moment.
Is anyone working on it? Does anyone have a gist for one method or another?
Is there interest?
Eustache
[1] https:/
On Thu, Dec 12, 2013 at 8:19 PM, Olivier Grisel wrote:
>
> Also for those who were not aware, Gilles, Andreas and Gael attended
> NIPS last week and here is the notebook of the presentation Gilles
> gave at the MLOSS workshop:
>
>
> http://nbviewer.ipython.org/github/glouppe/talk-sklearn-mloss-nip