Thanks all for your constructive comments,
I think you're right. In fact, it was also my impression that a Q&A dedicated
to sklearn is a little bit overkill, but I convinced myself that there can
never be too many communication channels. That said, I think you're right,
sklearn will have better visibility if
2012/2/14 Alexandre Passos :
> On Mon, Feb 13, 2012 at 17:30, Gael Varoquaux
> wrote:
>> Hi Martin,
>>
>> Thanks for your enthusiasm, and thanks for the Crowdbase instance. The
>> Q&A sites are definitely very useful. I am myself an occasional user of
>> a few of them. Scikit-learn is actually v
Hi folks,
[ I'm broadcasting this widely for maximum reach, but I'd appreciate
it if replies can be kept to the *numpy* list, which is sort of the
'base' list for scientific/numerical work. It will make it much
easier to organize a coherent set of notes later on. Apologies if
you're subscribed to
On Mon, Feb 13, 2012 at 17:30, Gael Varoquaux
wrote:
> Hi Martin,
>
> Thanks for your enthusiasm, and thanks for the Crowdbase instance. The
> Q&A sites are definitely very useful. I am myself an occasional user of
> a few of them. Scikit-learn is actually very well represented on the main
> Q&A
> At the moment I'm not sure when and how the "C" values
> are scaled by the number of examples.
in the fit method before calling the low level libsvm/liblinear bindings.
To check, you can try with scale_C=False.
Alex
--
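To make the scaling question above concrete, here is a minimal sketch of what "scaling C by the number of examples" would mean, assuming (based on this thread, not on any specific scikit-learn source) that an enabled `scale_C` divides the user-supplied C by `n_samples` before it is handed to the libsvm/liblinear bindings. The helper name `effective_C` is hypothetical, for illustration only:

```python
def effective_C(C, n_samples, scale_C=True):
    """Hypothetical helper: the penalty value actually passed to the
    low-level solver, assuming scale_C divides C by the sample count
    so regularization strength does not depend on dataset size."""
    return C / n_samples if scale_C else C

print(effective_C(1.0, 100))                 # scaled: 0.01
print(effective_C(1.0, 100, scale_C=False))  # unscaled: 1.0
```

Under this reading, setting `scale_C=False` (as Alex suggests) recovers the raw C that libsvm/liblinear documentation describes.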
Hi Martin,
Thanks for your enthusiasm, and thanks for the Crowdbase instance. The
Q&A sites are definitely very useful. I am myself an occasional user of
a few of them. Scikit-learn is actually very well represented on the main
Q&A site for machine learning, http://metaoptimize.com/qa.
With regard
On 02/13/2012 09:49 PM, Lars Buitinck wrote:
> Hi all,
>
> After reading some papers on (approximate) polynomial kernels for NLP
> applications, I got curious and decided to do some quick experiments.
> I modified the 20 newsgroups example to benchmark vanilla SVC instead
> of LinearSVC with linear
2012/2/13 Lars Buitinck :
> Hi all,
>
> After reading some papers on (approximate) polynomial kernels for NLP
> applications, I got curious and decided to do some quick experiments.
> I modified the 20 newsgroups example to benchmark vanilla SVC instead
> of LinearSVC with linear, quadratic and cub
Hi all,
After reading some papers on (approximate) polynomial kernels for NLP
applications, I got curious and decided to do some quick experiments.
I modified the 20 newsgroups example to benchmark vanilla SVC instead
of LinearSVC with linear, quadratic and cubic kernels. I was quite
surprised at
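The experiment described above can be sketched roughly as follows. This is not Lars's actual modified example; it substitutes a small synthetic dataset for 20 newsgroups so it runs quickly, but it compares the same estimators: LinearSVC against vanilla SVC with linear, quadratic, and cubic polynomial kernels.

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC, LinearSVC

# Small synthetic stand-in for the 20 newsgroups features.
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

models = {
    "LinearSVC":       LinearSVC(C=1.0),
    "SVC linear":      SVC(kernel="linear", C=1.0),
    "SVC quadratic":   SVC(kernel="poly", degree=2, C=1.0),
    "SVC cubic":       SVC(kernel="poly", degree=3, C=1.0),
}

scores = {}
for name, clf in models.items():
    clf.fit(X, y)
    scores[name] = clf.score(X, y)
    print(name, scores[name])
```

On real text data one would of course vectorize first (e.g. with a tf-idf vectorizer) and time the fits, since the kernelized SVC scales much worse with sample count than liblinear's LinearSVC.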
Hi All,
I read a lot of really interesting discussion on the sklearn mailing list
lately. Unfortunately, it is hard to find specific answers in the archive,
so I started a new Crowdbase instance (a Q&A tool ... a little bit like
Stackoverflow) specifically for the sklearn developers and users. Fee