The link to the survey doesn't work.
Mathieu
Great job to all of you :)
Gilles
On 22 January 2013 07:57, Peter Prettenhofer
wrote:
> Great work guys - especially Andy - thanks a lot for making this happen!
>
> best,
> Peter
>
> 2013/1/22 Gael Varoquaux :
>> On Tue, Jan 22, 2013 at 12:02:24AM +0100, Andreas Mueller wrote:
>>> I am very hap
Great work guys - especially Andy - thanks a lot for making this happen!
best,
Peter
2013/1/22 Gael Varoquaux :
> On Tue, Jan 22, 2013 at 12:02:24AM +0100, Andreas Mueller wrote:
>> I am very happy to announce the release of scikit-learn 0.13.
>
> Whoohoo! It's good to have a new version out.
>
On Tue, Jan 22, 2013 at 12:02:24AM +0100, Andreas Mueller wrote:
> I am very happy to announce the release of scikit-learn 0.13.
Whoohoo! It's good to have a new version out.
Thanks Andy for cutting the release, but also for all the coding
and management work during the last month. You rock!
Gael
Great Work and thank you Andreas!
Paolo
On Tue, Jan 22, 2013 at 12:02 AM, Andreas Mueller
wrote:
> Hi all.
> I am very happy to announce the release of scikit-learn 0.13.
> New features in this release include feature hashing for text processing,
> passive-aggressive classifiers, faster random f
Congrats to all and special thanks to Andy !
Best Regards,
Wei LI
On Tue, Jan 22, 2013 at 7:04 AM, Jake Vanderplas <
vanderp...@astro.washington.edu> wrote:
> Congrats!
> Thanks for the hard work, Andy
> Jake
>
>
> On 01/21/2013 03:02 PM, Andreas Mueller wrote:
>
> Hi all.
> I am very happy t
Congrats!
Thanks for the hard work, Andy
Jake
On 01/21/2013 03:02 PM, Andreas Mueller wrote:
Hi all.
I am very happy to announce the release of scikit-learn 0.13.
New features in this release include feature hashing for text processing,
passive-aggressive classifiers, faster random forests and
Congrats and thanks to Andreas and everyone involved in the release,
the website fixes and the online survey setup.
I posted Andreas' blog post on HN and reddit:
- http://news.ycombinator.com/item?id=5094319
-
http://www.reddit.com/r/programming/comments/170oty/scikitlearn_013_is_out_machine_lear
Congratulations! Nice work!
On Jan 21, 2013, at 6:02 PM, Andreas Mueller wrote:
> Hi all.
> I am very happy to announce the release of scikit-learn 0.13.
> New features in this release include feature hashing for text processing,
> passive-agressive classifiers, faster random forests and many mo
congrats. nice job on the release.
On Mon, Jan 21, 2013 at 6:13 PM, bthirion wrote:
> Congratulations, and thank you Andreas for the release management !
>
> Bertrand
>
>
> On 01/22/2013 12:02 AM, Andreas Mueller wrote:
>
> Hi all.
> I am very happy to announce the release of scikit-learn
Congratulations, and thank you Andreas for the release management !
Bertrand
On 01/22/2013 12:02 AM, Andreas Mueller wrote:
Hi all.
I am very happy to announce the release of scikit-learn 0.13.
New features in this release include feature hashing for text processing,
passive-aggressive class
woohoo! great work guys!
cheers,
satra
On Mon, Jan 21, 2013 at 6:09 PM, Robert Layton wrote:
> Congrats to all!
>
> On 22 January 2013 10:02, Andreas Mueller wrote:
>
>> Hi all.
>> I am very happy to announce the release of scikit-learn 0.13.
>> New features in this release include feature has
Hi all.
I am very happy to announce the release of scikit-learn 0.13.
New features in this release include feature hashing for text processing,
passive-aggressive classifiers, faster random forests and many more.
There have also been countless improvements in stability, consistency and
usability.
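(Not part of the announcement, just an illustrative sketch: the new text feature hashing can be used roughly like this, assuming the HashingVectorizer API in sklearn.feature_extraction.text.)

from sklearn.feature_extraction.text import HashingVectorizer

docs = ["the quick brown fox", "jumps over the lazy dog"]  # toy corpus

# Hash token counts into a fixed-width sparse matrix; no vocabulary is
# stored, so memory use stays bounded even for very large corpora.
vectorizer = HashingVectorizer(n_features=2 ** 20)
X = vectorizer.transform(docs)
print(X.shape)  # (2, 1048576), stored as a scipy.sparse matrix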
2013/1/21 Jake Vanderplas :
> On 01/21/2013 01:13 PM, Lars Buitinck wrote:
>> 2013/1/21 Thomas Dent :
>>> my question was whether the 'distance' function used for weighting returns
>>> the Minkowski distance as defined in Wikipedia, or instead the p-th power
>>> of it.
>> Good question. Judging f
On 01/21/2013 01:13 PM, Lars Buitinck wrote:
> 2013/1/21 Thomas Dent :
>> my question was whether the 'distance' function used for weighting returns
>> the Minkowski distance as defined in Wikipedia, or instead the p-th power of
>> it.
> Good question. Judging from the code, it would seem that on
On 01/21/2013 10:31 PM, Robert Layton wrote:
>
>
> I'm happy with the survey as it stands. Some small comments if there
> is still time:
> - "features" shouldn't be capitalised. It is arguable if "Academic
> Background" should be - but probably not.
> - The text should probably be no wider than t
On 22 January 2013 05:44, Jaques Grobler wrote:
> Here's the PR with online build :
> https://github.com/scikit-learn/scikit-learn/pull/1603
>
>
> 2013/1/21 Jaques Grobler
>
>> My github has been refusing to connect, but seems to be working now..
>> just slowly..
>>
>>
>> 2013/1/21 Andreas Muell
2013/1/21 Thomas Dent :
> my question was whether the 'distance' function used for weighting returns
> the Minkowski distance as defined in Wikipedia, or instead the p-th power of
> it.
Good question. Judging from the code, it would seem that only
Euclidean distance is specialized when the brute
Hi,
my question was whether the 'distance' function used for weighting returns the
Minkowski distance as defined in Wikipedia, or instead the p-th power of it.
For the case p=2, which is treated separately, I see in the neighbors/base.py source
file (class KNeighborsMixin)
elif self.p ==
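(An illustrative way to check this empirically, not taken from the thread: compare what kneighbors() returns against SciPy's Minkowski distance for a hand-picked p.)

import numpy as np
from scipy.spatial.distance import minkowski
from sklearn.neighbors import NearestNeighbors

rng = np.random.RandomState(0)
X = rng.rand(20, 3)
p = 3

nn = NearestNeighbors(n_neighbors=2, metric='minkowski', p=p).fit(X)
dist, ind = nn.kneighbors(X[:1])

# Distance from the first point to its nearest non-trivial neighbour,
# computed directly for comparison.
direct = minkowski(X[0], X[ind[0, 1]], p)
print(dist[0, 1], direct)
# equal               -> kneighbors returns the true Minkowski distance
# dist == direct ** p -> it returns the p-th power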
Hi all,
I just started using sklearn nearest-neighbors for classification & would like
to apply my own distance weighting function.
To do this I need to know exactly what the 'distance' that is fed to the
function represents. (Current documentation doesn't give me an immediate
answer.)
Fo
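(For reference, a minimal sketch of the API for plugging in a custom weighting function: a callable passed as weights= receives an array of neighbour distances and must return an array of the same shape. Whether that array holds the true Minkowski distance or its p-th power is exactly the question above, so treat this only as an illustration of the interface.)

import numpy as np
from sklearn.datasets import load_iris
from sklearn.neighbors import KNeighborsClassifier

def inverse_square_weights(dist):
    # `dist` has one row of neighbour distances per query point;
    # return an array of the same shape containing the weights.
    return 1.0 / (dist ** 2 + 1e-12)

iris = load_iris()
X, y = iris.data, iris.target

clf = KNeighborsClassifier(n_neighbors=5, weights=inverse_square_weights,
                           metric='minkowski', p=3)
clf.fit(X, y)
print(clf.predict(X[:3]))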
Here's the PR with online build :
https://github.com/scikit-learn/scikit-learn/pull/1603
2013/1/21 Jaques Grobler
> My github has been refusing to connect, but seems to be working now.. just
> slowly..
>
>
> 2013/1/21 Andreas Mueller
>
>> Thanks a lot and sorry for rushing you :-/
>>
>>
My github has been refusing to connect, but seems to be working now.. just
slowly..
2013/1/21 Andreas Mueller
> Thanks a lot and sorry for rushing you :-/
>
>
Thanks a lot and sorry for rushing you :-/
Hey Andy - on it now.. Sorry I had some immigration issues this morning..
Will make it shortly
J
2013/1/21 Andreas Mueller
> On 21.01.2013 13:51, xinfan meng wrote:
> > Should "in the contributors free time" be "in the contributors' free
> > time" ?
> >
> Yes, I think so. Thanks.
>
>
I think SVM does not directly model probability or optimize a quantity tied to
probability the way logistic regression does. How to get a probability depends
heavily on your application (whether you want high recall, high precision, or
some balance between them). Upstream, libsvm does not seem to
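(A rough sketch of the point above. The logistic squashing for the 1-class case is an ad-hoc assumption, not something scikit-learn provides: SVC can attach Platt-scaled probabilities, while OneClassSVM only exposes a signed decision score.)

import numpy as np
from sklearn.svm import SVC, OneClassSVM

rng = np.random.RandomState(0)
X = rng.randn(200, 2)
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# 2-class SVM: probability=True fits a Platt-scaling model internally.
clf = SVC(kernel='rbf', probability=True).fit(X, y)
print(clf.predict_proba(X[:3]))

# 1-class SVM: no predict_proba, only a signed decision score.
oc = OneClassSVM(kernel='rbf', nu=0.1, gamma=0.1).fit(X)
scores = np.ravel(oc.decision_function(X[:3]))
pseudo_proba = 1.0 / (1.0 + np.exp(-scores))  # heuristic, not calibrated
print(pseudo_proba)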
On 21.01.2013 13:51, xinfan meng wrote:
> Should "in the contributors free time" be "in the contributors' free
> time" ?
>
Yes, I think so. Thanks.
Should "in the contributors free time" be "in the contributors' free time" ?
On Mon, Jan 21, 2013 at 8:48 PM, Andreas Mueller
wrote:
> Ok, so I updated the form a bit:
>
> https://docs.google.com/spreadsheet/viewform?formkey=dFdyeGNhMzlCRWZUdldpMEZlZ1B1YkE6MQ#gid=0
> I removed the last question
Am 18.01.2013 15:14, schrieb Lars Buitinck:
> I'm not one for web design, but we usually spell scikit-learn
> all-lowercase, and certainly never with a capital L. Also, I prefer
> "scikit-learn" to "the scikit-learn", but I'm under the impression
> that the French devs on the team prefer the lat
Ok, so I updated the form a bit:
https://docs.google.com/spreadsheet/viewform?formkey=dFdyeGNhMzlCRWZUdldpMEZlZ1B1YkE6MQ#gid=0
I removed the last question and added the possible answers as a hint to
the "what would you like to see" question.
I want to release tonight and I would like to publish t
Hello, I am a new user of scikit-learn and I primarily use 2-class SVM and
1-class SVM. Now I have run into a problem and hope someone can give me a reply.
I would like probability output from the 1-class SVM, just like the 2-class SVM
provides. I know that the scikit-learn implementation is based