> Date: Wed, 29 Oct 2014 14:57:45 +0100
> From: Olivier Grisel
> Subject: Re: [Scikit-learn-general] Fast Johnson-Lindenstrauss Transform
> To: scikit-learn-general
> Message-ID:
>
> Content-Type: text/plain; charset=UTF-8
>
> Indeed this is quite a new method and we have a policy to wait a bit
> to see if it's actually practically useful before including an
> implementation in the code base.
>
> Michal, if you have replicated the results of the paper in Python it
> would be interesting to publish your code in a scikit-learn style.

Can you comment a bit on how they combine the random sign matrix and
the subsampled random Fourier basis?

Best regards,
Arnaud Joly

On 29 Oct 2014, at 14:24, Michal Romaniuk wrote:

> Hi everyone,
>
> I'm thinking of adding the Unrestricted Fast Johnson-Lindenstrauss
> Transform [1] to the random_projections module [...]
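
For context on that question: fast JL transforms of this family are
usually built by composing a diagonal matrix of random signs, a fast
Fourier (or Hadamard) transform, and a uniform random subsampling of
the output coordinates. The sketch below is only that generic
subsampled randomized Fourier recipe in NumPy, not necessarily the
exact construction of [1], and srft_sketch is not an existing
scikit-learn function.

    import numpy as np

    def srft_sketch(X, n_components, random_state=0):
        # Generic subsampled randomized Fourier transform:
        # D (random signs), F (unitary FFT), S (uniform subsampling).
        rng = np.random.RandomState(random_state)
        n_features = X.shape[1]

        # D: flip the sign of each feature with probability 1/2.
        signs = rng.choice([-1.0, 1.0], size=n_features)

        # F: FFT along the feature axis, O(d log d) per sample; divide
        # by sqrt(d) to make the transform unitary (norm preserving).
        Z = np.fft.fft(X * signs, axis=1) / np.sqrt(n_features)

        # S: keep a uniform random subset of k output coordinates,
        # rescaled by sqrt(d / k) so squared norms are preserved in
        # expectation. Note the result is complex; Hadamard or DCT
        # variants give real-valued output instead.
        idx = rng.choice(n_features, size=n_components, replace=False)
        return np.sqrt(n_features / n_components) * Z[:, idx]

For example, srft_sketch(np.random.randn(1000, 4096), n_components=64)
maps 4096-dimensional rows to 64 coordinates in O(n d log d) time,
versus O(n d k) for a dense Gaussian projection.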
It would be nice to have it implemented in a
sklearn.random_projections-compatible form, but is there reason to believe
it is stable/popular enough for inclusion in the repo?
On 30 October 2014 00:24, Michal Romaniuk wrote:

> Hi everyone,
>
> I'm thinking of adding the Unrestricted Fast Johnson-Lindenstrauss
> Transform [1] to the random_projections module [...]
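
To make "sklearn.random_projections-compatible form" concrete: the
existing estimators live in sklearn.random_projection and expose the
usual fit/transform interface, so a contributed transform would most
likely follow the same shape. The skeleton below is hypothetical;
FastJLTransform is not an existing scikit-learn class, and its
transform() body reuses the generic sign-flip/FFT/subsample placeholder
from the sketch above rather than the actual UFJLT of [1].

    import numpy as np
    from sklearn.base import BaseEstimator, TransformerMixin
    from sklearn.utils import check_array, check_random_state

    class FastJLTransform(BaseEstimator, TransformerMixin):
        # Hypothetical skeleton mirroring the fit/transform shape of
        # the estimators in sklearn.random_projection.

        def __init__(self, n_components=100, random_state=None):
            self.n_components = n_components
            self.random_state = random_state

        def fit(self, X, y=None):
            X = check_array(X)
            rng = check_random_state(self.random_state)
            n_features = X.shape[1]
            # Draw the random parts of the transform once and store
            # them, so transform() is deterministic afterwards.
            self.signs_ = rng.choice([-1.0, 1.0], size=n_features)
            self.indices_ = rng.choice(n_features,
                                       size=self.n_components,
                                       replace=False)
            return self

        def transform(self, X):
            X = check_array(X)
            n_features = X.shape[1]
            # Placeholder projection (complex-valued); the real UFJLT
            # maths from [1] would go here.
            Z = np.fft.fft(X * self.signs_, axis=1) / np.sqrt(n_features)
            return (np.sqrt(n_features / self.n_components)
                    * Z[:, self.indices_])

Usage would then mirror the existing estimators, e.g.
FastJLTransform(n_components=64, random_state=0).fit_transform(X).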
Hi everyone,
I'm thinking of adding the Unrestricted Fast Johnson-Lindenstrauss
Transform [1] to the random_projections module and would like to ask if
anyone is already working on this.
(If you know of a competing algorithm that would be worth looking at,
please let me know ;))
Thanks,
M