On Sat, Jun 30, 2018 at 12:14 PM Stephan Hoyer wrote:
> I’d love to see a generic way of doing random number generation, but I
> agree with Marten that I don’t see it fitting naturally into this NEP. An
> invasive change to add an array_reference argument to a bunch of functions
> might indeed
On Sat, Jun 30, 2018 at 11:59 AM Hameer Abbasi
wrote:
> Hi Marten,
>
> Still, I'm not sure whether this should be included in the present NEP or
> is best done separately after, with a few concrete examples of where it
> would be useful.
>
>
> There already are concrete examples from Dask and CuPy
Hi Marten,
Still, I'm not sure whether this should be included in the present NEP or
is best done separately after, with a few concrete examples of where it
would be useful.
There already are concrete examples from Dask and CuPy, and this is
currently a blocker for them, which is part of the rea
Hi Hameer,
I think the override on `dtype` would work - after all, the override is
checked before anything is done, so one can just pass in `self` if one
wishes (or some helper class that contains both `self` and any desired
further information).
But, as you note, it would not cover everything, an
Hi Marten,
Sorry, I had clearly misunderstood. It would indeed be nice for overrides
to work on functions like `zeros` or `arange` as well, but it seems strange
to change the signature just for that. As a possible alternative, should we
perhaps generally check for overrides on `dtype`?
While thi
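To make the "check for overrides on `dtype`" idea concrete, here is a minimal, purely hypothetical sketch. This is not NumPy's actual behavior, and the names (`zeros` wrapper, `LoggingDType`) are invented for illustration; only the `(func, types, args, kwargs)` dispatch signature is borrowed from NEP 18:

```python
# Hypothetical sketch: a creation function that checks its `dtype`
# argument for an __array_function__ override before falling back to
# the default implementation. All names here are invented.

def zeros(shape, dtype=float):
    override = getattr(dtype, "__array_function__", None)
    if override is not None:
        # Hand off to the dtype's implementation, mirroring the
        # (func, types, args, kwargs) signature from NEP 18.
        return override(zeros, (type(dtype),), (shape,), {"dtype": dtype})
    return [0.0] * shape  # stand-in for the default NumPy behavior

class LoggingDType:
    """Toy dtype-like object that intercepts creation functions."""
    def __array_function__(self, func, types, args, kwargs):
        return ("intercepted", func.__name__, args)

print(zeros(3))                  # [0.0, 0.0, 0.0]
print(zeros(3, LoggingDType()))  # ('intercepted', 'zeros', (3,))
```

The appeal of this variant is that no signatures change: `dtype` already exists on essentially every array-creation function.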
Hi Hameer,
> It is. The point of the proposed feature was to handle array generation
> mechanisms, that don't take an array as input in the standard NumPy API.
> Giving them a reference handles both the dispatch and the decision about
> which implementation to call.
>
Sorry, I had clearly misunderstood.
On Fri, Jun 29, 2018 at 9:54 PM, Eric Wieser
wrote:
> Good catch,
>
> I think the latter failing is because np.add.reduce ends up calling
> np.ufunc.reduce.__get__(np.add), and builtin_function.__get__ doesn’t
> appear to do any caching. I suppose caching bound methods would just be a
> waste of time.
Good catch,
I think the latter failing is because np.add.reduce ends up calling
np.ufunc.reduce.__get__(np.add), and builtin_function.__get__ doesn’t
appear to do any caching. I suppose caching bound methods would just be a
waste of time.
== would work just fine in my suggestion above, it seems -
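The behavior Eric describes is general CPython behavior for methods on C-implemented objects: every attribute access builds a fresh bound-method object, so `is` fails while `==` still succeeds. A small demonstration using a plain builtin (the same holds for `np.add.reduce`, per the discussion above):

```python
# Each attribute access on a C object creates a new bound-method
# object, so identity comparison fails but equality works.
x = []
a = x.append
b = x.append

print(a is b)  # False: two distinct bound-method objects
print(a == b)  # True: equality compares __self__ and the underlying function
```

This is why comparing with `==` (rather than `is`) is the reliable way to detect a known method like `np.add.reduce`.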
On Thu, Jun 28, 2018 at 5:36 PM Eric Wieser
wrote:
> Another option would be to directly compare the methods against known ones:
>
> obj = func.__self__
> if isinstance(obj, np.ufunc):
>     if func is obj.reduce:
>         got_reduction()
>
> I'm not quite sure why, but this doesn't seem to work
Hi Marten,
It is. The point of the proposed feature was to handle array generation
mechanisms, that don't take an array as input in the standard NumPy API.
Giving them a reference handles both the dispatch and the decision about
which implementation to call.
I'm confused: Isn't your reference array just `self`?
On 28/06/18 17:18, Stephan Hoyer wrote:
On Thu, Jun 28, 2018 at 1:12 PM Marten van Kerkwijk
<m.h.vankerkw...@gmail.com> wrote:
For C classes like the ufuncs, it seems `__self__` is defined for
methods as well (at least, `np.add.reduce.__self__` gives
`np.add`), but not a `
Another option would be to directly compare the methods against known ones:
obj = func.__self__
if isinstance(obj, np.ufunc):
    if func is obj.reduce:
        got_reduction()
Eric
On Thu, 28 Jun 2018 at 17:19 Stephan Hoyer wrote:
> On Thu, Jun 28, 2018 at 1:12 PM Marten van Kerkwijk <
> m.
On Thu, Jun 28, 2018 at 1:12 PM Marten van Kerkwijk <
m.h.vankerkw...@gmail.com> wrote:
> For C classes like the ufuncs, it seems `__self__` is defined for methods
> as well (at least, `np.add.reduce.__self__` gives `np.add`), but not a
> `__func__`. There is a `__name__` (="reduce"), though, whic
> I did a little more digging, and turned up the __self__ and __func__
> attributes of bound methods:
> https://stackoverflow.com/questions/4679592/how-to-find-
> instance-of-a-bound-method-in-python
>
> So we might need another decorator function, but it seems that the current
> interface would ac
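For reference, the `__self__`/`__func__` attributes discussed above look like this on a pure-Python bound method; C-level (builtin) methods expose `__self__` and `__name__` but not `__func__`, which matches what Marten observed for `np.add.reduce`. The `Adder` class here is a toy stand-in:

```python
class Adder:
    """Toy stand-in for a ufunc-like object with a reduce method."""
    def reduce(self, values):
        total = 0
        for v in values:
            total += v
        return total

a = Adder()
m = a.reduce

print(m.__self__ is a)             # True: the bound instance
print(m.__func__ is Adder.reduce)  # True: the underlying function
print(m.__name__)                  # 'reduce'

# C-level methods have __self__ and __name__ but no __func__:
print(hasattr([].append, "__func__"))  # False
print([].append.__name__)              # 'append'
```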
On Wed, Jun 27, 2018 at 12:50 PM Stephan Hoyer wrote:
> One concern this does raise is how to handle methods like those on
> RandomState, even though methods like random_like() don't currently exist.
> Distribution objects from scipy.stats could have similar use cases.
>
> So perhaps it's worth "
I think this feature is actually needed. Consider
`np.random.RandomState`. If we were to add what I proposed, the two could
work together very nicely to (for example) create Dask random
arrays from RandomState objects.
For reproducibility, Dask could generate multiple Ra
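The reproducibility idea can be illustrated with the standard library's `random` module as a stand-in for `np.random.RandomState`: derive one independent generator per chunk from a single base seed, so a chunked, Dask-style random array is fully deterministic. This is only an analogy, not Dask's actual scheme:

```python
import random

# Sketch: one deterministic generator per chunk, all derived from a
# single base seed. random.Random stands in for np.random.RandomState.
N_CHUNKS = 4
CHUNK_SIZE = 5

def make_chunks(seed):
    rngs = [random.Random(seed + i) for i in range(N_CHUNKS)]
    return [[r.random() for _ in range(CHUNK_SIZE)] for r in rngs]

first = make_chunks(42)
second = make_chunks(42)
print(first == second)  # True: the same seed reproduces every chunk
```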
On Wed, Jun 27, 2018 at 3:50 PM, Stephan Hoyer wrote:
> So perhaps it's worth "future proofing" the interface by passing `obj` and
> `method` to __array_function__ rather than only `func`. It is slower to
> call a func via func.__call__ than func, but only very marginally (~100 ns
> in my tes
On Tue, Jun 26, 2018 at 11:27 PM Hameer Abbasi
wrote:
> I would like to propose that we use `__array_function__` in the following
> manner for functions that create arrays:
>
>- `array_reference` for indicating the “reference array” whose
>`__array_function__` implementation will be called.
Hi Hameer,
I'm confused: Isn't your reference array just `self`?
All the best,
Marten
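As a purely illustrative sketch of the `array_reference` proposal quoted above (the `arange` wrapper and `DaskLikeArray` are invented names, not the NEP's actual API): a creation function that receives a reference array can dispatch to that array's `__array_function__` implementation.

```python
# Hypothetical sketch of an `array_reference` argument on a creation
# function. All names are invented for illustration; only the
# (func, types, args, kwargs) signature is borrowed from NEP 18.

def arange(stop, array_reference=None):
    if array_reference is not None:
        # Dispatch to the reference array's implementation.
        return array_reference.__array_function__(
            arange, (type(array_reference),), (stop,), {})
    return list(range(stop))  # stand-in for the default implementation

class DaskLikeArray:
    """Toy array type that intercepts creation functions."""
    def __array_function__(self, func, types, args, kwargs):
        return ("dask-version-of", func.__name__, args)

print(arange(4))                   # [0, 1, 2, 3]
print(arange(4, DaskLikeArray()))  # ('dask-version-of', 'arange', (4,))
```

The reference array thus answers both questions at once: whether to override, and which implementation to call.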
On Wed, Jun 27, 2018 at 2:27 AM, Hameer Abbasi
wrote:
>
>
> On 27. Jun 2018 at 07:48, Stephan Hoyer wrote:
>
>
> After much discussion (and the addition of three new co-authors!), I’m
> pleased to present a
On 27. Jun 2018 at 07:48, Stephan Hoyer wrote:
After much discussion (and the addition of three new co-authors!), I’m
pleased to present a significant revision of NumPy Enhancement Proposal
18: A dispatch mechanism for NumPy's high level array functions:
http://www.numpy.org/neps/nep-0018-arra