Ronnie Ghose writes:
>
>
> do you think it isn't saving correctly or it isn't loading correctly?
>
> On Thu, Jan 24, 2013 at 8:14 PM, Ark wrote:
> Gael Varoquaux ...> writes:
> >
> > On Wed, Jan 23, 2013 at 12:16:32AM +, Afik Cohen wrote:
Gael Varoquaux writes:
>
> On Tue, Jan 22, 2013 at 10:30:01PM +, Ark wrote:
> > /home/n7/newenv/lib/python2.6/site-packages/sklearn/externals/joblib/numpy_pickle.pyc in read_zfile(file_handle)
> >      69     assert len(data) == length, (
> >      70         "Incorrect data length whil
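The assertion in that traceback fires inside joblib while it is reading back a compressed pickle, which is why Ronnie asks whether the save or the load is at fault. As a point of reference, a minimal dump/load round trip (synthetic data and a temporary path, not the thread's actual files; note the thread's 2013-era scikit-learn vendored joblib as sklearn.externals.joblib, while a standalone `joblib` import is used here) looks like:

```python
# Hedged sketch: a plain joblib dump/load round trip.  The model, data,
# and file location are illustrative, not the thread's actual setup.
import os
import tempfile

import joblib
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

X, y = make_classification(n_samples=100, n_features=10, random_state=0)
clf = SGDClassifier(random_state=0).fit(X, y)

path = os.path.join(tempfile.mkdtemp(), "model.joblib")
joblib.dump(clf, path, compress=3)  # write a compressed pickle
restored = joblib.load(path)        # loading is where the assertion fired

match = (restored.predict(X) == clf.predict(X)).all()
print(match)
```

If the file written by `dump` round-trips cleanly like this, the error in the traceback points at a truncated or corrupted file on disk rather than at the load path itself.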
> > Will this let us run SGDClassifier and show us per-class probability outputs?
> > Again, that's the only reason we've been using OneVsRestClassifier. Let me
> > explain what I mean by per-class probability, just in case it isn't clear:
> >
> > SGDClassifier's predict_proba() returns probabilit
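The pattern the thread keeps returning to can be sketched roughly as follows: wrap SGDClassifier in OneVsRestClassifier, take the per-class decision values, and squash them through a soft-max. This is an illustration on synthetic data, not the thread's code, and with the default hinge loss the resulting scores are probability-like but not calibrated probabilities — which is exactly the concern being discussed:

```python
# Illustrative sketch: per-class scores from SGDClassifier wrapped in
# OneVsRestClassifier, normalized with a soft-max.  Synthetic data; the
# outputs are NOT calibrated probabilities.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.multiclass import OneVsRestClassifier

X, y = make_classification(n_samples=300, n_features=20, n_informative=10,
                           n_classes=4, random_state=0)

clf = OneVsRestClassifier(SGDClassifier(random_state=0))
clf.fit(X, y)

scores = clf.decision_function(X)          # shape (n_samples, n_classes)
e = np.exp(scores - scores.max(axis=1, keepdims=True))
proba = e / e.sum(axis=1, keepdims=True)   # each row now sums to 1

print(proba.shape)
```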
Andreas Mueller writes:
>
> On 12/03/2012 09:39 PM, Afik Cohen wrote:
> > No, we aren't doing multi-label classification, just multiclass. He was saying
> > we could just use SGDClassifier directly, which is true, but AFAIK there is no
> > way to get good predic
Andreas Mueller writes:
>
> Am 29.11.2012 23:45, schrieb Afik Cohen:
> >
> >
> > Hey Mathieu!
> >
> > Pretty much the only reason we wrap SGDClassifier in a OneVsRestClassifier is so
> > we can get predict_proba results on a per class basis. This
Mathieu Blondel writes:
>
>
> On Thu, Nov 29, 2012 at 10:39 AM, Afik Cohen wrote:
>
> It's easy to see how with some slight modifications (wrapping that in a joblib
> Parallel() call) we could enable n_jobs for OneVsRestClassifier. This almost
> seems too simple, so there must be a good reason why this isn't done; could
> you give your opinions on this?
Thanks,
Afik Cohen
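The modification Afik describes — wrapping the per-class fits in a joblib Parallel() call — might look roughly like the sketch below. The helper `fit_one` and the synthetic data are made up for illustration; this is not OneVsRestClassifier's actual implementation:

```python
# Rough sketch of the n_jobs idea: fit the binary one-vs-rest problems
# in parallel with joblib.  fit_one is a hypothetical helper, not
# scikit-learn's code.
import numpy as np
from joblib import Parallel, delayed
from sklearn.base import clone
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

def fit_one(estimator, X, y, cls):
    """Fit one binary classifier: class `cls` vs. the rest."""
    est = clone(estimator)
    est.fit(X, (y == cls).astype(int))
    return est

X, y = make_classification(n_samples=300, n_features=20, n_informative=10,
                           n_classes=4, random_state=0)

base = SGDClassifier(random_state=0)
classes = np.unique(y)
estimators = Parallel(n_jobs=2)(
    delayed(fit_one)(base, X, y, c) for c in classes)

print(len(estimators))
```

Since the per-class problems are independent, this parallelizes trivially — which is presumably why the question above is whether some subtlety prevented it from being done in the library.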
Whoops, my previous reply got munged up, so I'm resubmitting it. Please
ignore my previous messed-up email.
Hi Lars,
Thanks for your reply.
>
> 2012/10/30 Afik Cohen :
> >> Do you know what they are doing? I would expect they just do a soft-max.
> > I don't. :) But according to the LIBLINEAR FAQ: "If you really would like to
> > have probability outputs for SVM
or something, not classifying inputs into discrete classes,
right? We're classifying emails into ~1200 distinct classes, so Logistic
Regression is meaningless for us (in fact, when we tried it, it achieved a
hilarious 48% cross-validated (k=3) accuracy; LinearSVC achieves 95% accuracy).
Thanks again.
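The k=3 cross-validated accuracy mentioned above is the kind of number `cross_val_score` produces. A minimal sketch on synthetic data follows — the ~1200-class email corpus isn't available here, so these scores are not comparable to the 48%/95% figures, and the modern `sklearn.model_selection` import path differs from the 2013-era `sklearn.cross_validation`:

```python
# Hedged sketch: 3-fold cross-validated accuracy for LinearSVC on
# synthetic data (not the thread's email corpus).
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=300, n_features=50, n_informative=20,
                           n_classes=5, random_state=0)

scores = cross_val_score(LinearSVC(), X, y, cv=3)  # one accuracy per fold
print(len(scores))
```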
to continue doing so with
current and future versions of scikit!
Thanks,
Afik Cohen
Abhijeet Kolhe
LinearSVC prediction probability patch follows:
diff --git a/sklearn/svm/classes.py b/sklearn/svm/classes.py
index 79cb76d..d432792 100644
--- a/sklearn/svm/classes.py
+++ b/sklearn/svm/classes.py
@@ -1