[Numpy-discussion] next NumPy community meeting
The next NumPy community meeting will be held this Wednesday, July 6th, at 18:00 (6 pm) UTC. Join us via Zoom: https://berkeley.zoom.us/j/762261535

Everyone is welcome and encouraged to attend. To add the topics you'd like to discuss to the meeting agenda, follow this link: https://hackmd.io/76o-IxCjQX2mOXO_wwkcpg?both

Cheers,
Inessa

Inessa Pawson
Contributor Experience Lead | NumPy
https://numpy.org/
Twitter: @inessapawson
[Numpy-discussion] Feature request: dot product along arbitrary axes
Currently there are lots of ways to compute dot products (dot, vdot, inner, tensordot, einsum, ...), but none of them is really convenient for the case of arrays of vectors, where one dimension (usually the last or the first) is the vector dimension. The simplest way to do this currently is `np.sum(a * b, axis=axis)`, but this makes vector algebra less readable without a wrapper function, and it is probably not as well optimized as matrix products. Another way is to insert appropriate singleton dimensions and use matmul, but that is arguably less readable and not obvious to do generically for arbitrary axes.

I think either np.dot or np.vdot could easily be extended with an `axis` parameter that would turn it into a bulk vector operation, with the same semantics as `np.sum(a * b, axis=axis)`. It should perhaps also have a `keep_dims` parameter, which is useful for preserving broadcasting.

I submitted a corresponding issue at https://github.com/numpy/numpy/issues/21915
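[A minimal sketch of the workarounds described above, for the case where the last axis is the vector axis. The final commented-out call illustrates the proposed (hypothetical, not yet existing) `axis` parameter; everything else uses only existing NumPy APIs.]

```python
import numpy as np

# Two stacks of 3-vectors with shape (1000, 3); the last axis is the vector axis.
rng = np.random.default_rng(0)
a = rng.standard_normal((1000, 3))
b = rng.standard_normal((1000, 3))

# Workaround 1: elementwise product and sum (the semantics proposed above).
d1 = np.sum(a * b, axis=-1)                           # shape (1000,)

# Workaround 2: einsum, explicit but requires spelling out the subscripts.
d2 = np.einsum('...i,...i->...', a, b)                # shape (1000,)

# Workaround 3: matmul after inserting singleton dimensions, then dropping them.
d3 = (a[..., None, :] @ b[..., :, None])[..., 0, 0]   # shape (1000,)

assert np.allclose(d1, d2) and np.allclose(d1, d3)

# The proposal would allow something like the following (hypothetical API):
# d = np.vdot(a, b, axis=-1)
```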
[Numpy-discussion] Re: Feature request: dot product along arbitrary axes
I don't understand. Both theoretically and coding-wise, matmul is the most readable of those options; that is, in fact, what the definition is. Can you give an example?

On Mon, Jul 4, 2022, 04:49 wrote:
> Currently there are lots of ways to compute dot products (dot, vdot, inner,
> tensordot, einsum, ...), but none of them is really convenient for the case
> of arrays of vectors, where one dimension (usually the last or the first) is
> the vector dimension. [...]
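[A sketch contrasting the two readings, assuming the last axis (or an arbitrary axis) holds the vectors. `vec_dot` is a hypothetical helper written for illustration, not an existing NumPy function.]

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((4, 5, 3))
b = rng.standard_normal((4, 5, 3))

# When the vector axis is last, matmul reads fairly naturally:
d_last = (a[..., None, :] @ b[..., :, None])[..., 0, 0]   # shape (4, 5)

# For an arbitrary vector axis, the axis first has to be moved to the end,
# which is the extra step the original post considers less readable:
def vec_dot(x, y, axis=-1):
    x = np.moveaxis(x, axis, -1)
    y = np.moveaxis(y, axis, -1)
    return (x[..., None, :] @ y[..., :, None])[..., 0, 0]

d_mid = vec_dot(a, b, axis=1)   # dot over axis 1, shape (4, 3)
assert np.allclose(d_mid, np.sum(a * b, axis=1))
```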