Oh, never mind, I see that this has been added as an experimental module in the
latest numpy version. It would be nice not to need yet another whole set of
APIs, but on the other hand the numpy API is so messy and inconsistent that
maybe it is a good thing :) But it does mean we now have at least 9 of these
APIs to keep track of.
Yes, if I were doing this more than once in some code I would write a helper.
But I think it's much better to have a common function that people can learn
and use consistently, instead of everyone having to roll their own helpers all
the time. Especially because numpy otherwise usually just works when you pass
in arrays with extra (stacked) dimensions.
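
For concreteness, the kind of helper people end up writing looks roughly like
this (the name and signature are purely illustrative, and it ignores complex
conjugation):

```python
import numpy as np

def vecdot(a, b, axis=-1):
    # Dot product over one axis of stacked vectors.
    # Illustrative sketch only: no conjugation for complex inputs.
    a, b = np.asarray(a), np.asarray(b)
    return np.sum(a * b, axis=axis)

a = np.random.rand(4, 3)   # 4 stacked 3-vectors
b = np.random.rand(4, 3)
print(vecdot(a, b).shape)  # (4,)
```
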
I'm unaware of the context here: is this a specification for functions that it
is hoped will eventually be made consistent across numpy/tensorflow/etc.? If
that's the idea then yeah, I'm all for it, but I would suggest also adding a
keepdims parameter (as I mentioned above, it helps with broadcasting).
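
Here is the kind of broadcasting benefit I mean, using norm as a stand-in since
it already accepts keepdims (shapes chosen just for illustration):

```python
import numpy as np

a = np.random.rand(5, 3)                       # 5 stacked 3-vectors
n = np.linalg.norm(a, axis=-1, keepdims=True)  # shape (5, 1) rather than (5,)
unit = a / n                                   # broadcasts cleanly against (5, 3)
print(unit.shape)                              # (5, 3)
```
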
Maybe I wasn't clear: I'm talking about the 1-dimensional vector dot product,
but applied to N-D arrays of vectors. Certainly dot products can be realized as
matrix products, and often are in mathematics for convenience, but matrices and
vectors are not the same thing, either theoretically or in code.
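
A rough sketch of the difference, with shapes chosen just for illustration:

```python
import numpy as np

a = np.random.rand(10, 3)   # 10 stacked 3-vectors
b = np.random.rand(10, 3)

# Realizing the dot product as a matrix product works, but each vector has to
# be promoted to a 1x3 / 3x1 matrix and the singleton axes squeezed back out:
via_matmul = (a[:, None, :] @ b[:, :, None])[:, 0, 0]   # shape (10,)

# The "arrays of vectors" view is more direct:
via_vectors = np.sum(a * b, axis=-1)                    # shape (10,)

assert np.allclose(via_matmul, via_vectors)
```
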
Currently there are lots of ways to compute dot products (dot, vdot, inner,
tensordot, einsum...), but none of them are really convenient for the case of
arrays of vectors, where one dimension (usually the last or the first) is the
vector dimension. The simplest way to do this currently is
`np.sum(a * b, axis=-1)`.
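
To spell out what I mean by "not really convenient" (shapes are just for
illustration; this is a sketch, not a proposal):

```python
import numpy as np

a = np.random.rand(8, 3)   # 8 stacked 3-vectors
b = np.random.rand(8, 3)

# Current routes to a batched dot product over the last axis:
d_sum    = np.sum(a * b, axis=-1)              # simple, but allocates the full product array
d_einsum = np.einsum('...i,...i->...', a, b)   # compact, but einsum notation is opaque to many users

# np.inner contracts every pair of vectors instead,
# giving an (8, 8) matrix rather than 8 scalars:
print(np.inner(a, b).shape)   # (8, 8)

assert np.allclose(d_sum, d_einsum)
```
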