Re: [Numpy-discussion] Optimized sum of squares

2009-10-22 Thread Gary Ruben
josef.p...@gmail.com wrote:
> Is it really possible to get the same as np.sum(a*a, axis) with
> tensordot if a.ndim=2 ?
> Any way I try the "something_else", I get extra terms as in np.dot(a.T, a)

Just to answer this question, np.dot(a,a) is equivalent to
np.tensordot(a,a, axis=(0,0)) but the l
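
The "extra terms" come from the fact that contracting the shared axis builds
the whole Gram matrix rather than only its diagonal. A minimal sketch of my
own illustrating this (note that the tensordot keyword is axes, not axis):

import numpy as np

a = np.arange(12.0).reshape(4, 3)

# Contracting axis 0 of both arguments builds the full Gram matrix a.T @ a;
# the per-column sums of squares sit on its diagonal, and the off-diagonal
# entries are the "extra terms" mentioned above.
gram = np.tensordot(a, a, axes=(0, 0))
print(np.allclose(gram, np.dot(a.T, a)))                   # True
print(np.allclose(np.diag(gram), np.sum(a * a, axis=0)))   # True

# For a 1-D vector the two coincide, which is the equivalence stated above.
v = np.arange(5.0)
print(np.allclose(np.dot(v, v), np.tensordot(v, v, axes=(0, 0))))  # True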

Re: [Numpy-discussion] Optimized sum of squares

2009-10-20 Thread Anne Archibald
2009/10/20 : > On Tue, Oct 20, 2009 at 3:09 PM, Anne Archibald > wrote: >> 2009/10/20 : >>> On Sun, Oct 18, 2009 at 6:06 AM, Gary Ruben wrote: Hi Gaël, If you've got a 1D array/vector called "a", I think the normal idiom is np.dot(a,a) For the more general cas

Re: [Numpy-discussion] Optimized sum of squares

2009-10-20 Thread josef . pktd
On Tue, Oct 20, 2009 at 3:09 PM, Anne Archibald wrote: > 2009/10/20  : >> On Sun, Oct 18, 2009 at 6:06 AM, Gary Ruben wrote: >>> Hi Gaël, >>> >>> If you've got a 1D array/vector called "a", I think the normal idiom is >>> >>> np.dot(a,a) >>> >>> For the more general case, I think >>> np.tensordot

Re: [Numpy-discussion] Optimized sum of squares

2009-10-20 Thread Anne Archibald
2009/10/20 : > On Sun, Oct 18, 2009 at 6:06 AM, Gary Ruben wrote: >> Hi Gaël, >> >> If you've got a 1D array/vector called "a", I think the normal idiom is >> >> np.dot(a,a) >> >> For the more general case, I think >> np.tensordot(a, a, axes=something_else) >> should do it, where you should be ab

Re: [Numpy-discussion] Optimized sum of squares

2009-10-20 Thread josef . pktd
On Sun, Oct 18, 2009 at 6:06 AM, Gary Ruben wrote: > Hi Gaël, > > If you've got a 1D array/vector called "a", I think the normal idiom is > > np.dot(a,a) > > For the more general case, I think > np.tensordot(a, a, axes=something_else) > should do it, where you should be able to figure out somethin

Re: [Numpy-discussion] Optimized sum of squares

2009-10-18 Thread Charles R Harris
On Sun, Oct 18, 2009 at 11:37 AM, wrote: > On Sun, Oct 18, 2009 at 12:06 PM, Skipper Seabold > wrote: > > On Sun, Oct 18, 2009 at 8:09 AM, Gael Varoquaux > > wrote: > >> On Sun, Oct 18, 2009 at 09:06:15PM +1100, Gary Ruben wrote: > >>> Hi Gaël, > >> > >>> If you've got a 1D array/vector called

Re: [Numpy-discussion] Optimized sum of squares

2009-10-18 Thread Sturla Molden
Skipper Seabold wrote:
> I'm curious about this as I use ss, which is just np.sum(a*a, axis),
> in statsmodels and didn't much think about it.
>
> Do the number of loops matter in the timings and is dot always faster
> even without the blas dot?

The thing is that a*a returns a temporary array
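
The point is that the elementwise product allocates and fills a full temporary
before the reduction even starts, whereas dot accumulates in a single pass. A
rough timing sketch of my own (figures vary with array size, cache, and
whether NumPy is linked against an optimized BLAS):

import numpy as np
from timeit import timeit

a = np.random.rand(1000000)

# np.sum(a*a) first materializes the temporary array a*a (an extra allocation
# and an extra pass over memory) and then reduces it; np.dot(a, a) accumulates
# the sum of squares in one pass and can call an optimized BLAS routine.
t_sum = timeit(lambda: np.sum(a * a), number=100)
t_dot = timeit(lambda: np.dot(a, a), number=100)
print("np.sum(a*a): %.3f s   np.dot(a,a): %.3f s" % (t_sum, t_dot))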

Re: [Numpy-discussion] Optimized sum of squares

2009-10-18 Thread josef . pktd
On Sun, Oct 18, 2009 at 12:06 PM, Skipper Seabold wrote: > On Sun, Oct 18, 2009 at 8:09 AM, Gael Varoquaux > wrote: >> On Sun, Oct 18, 2009 at 09:06:15PM +1100, Gary Ruben wrote: >>> Hi Gaël, >> >>> If you've got a 1D array/vector called "a", I think the normal idiom is >> >>> np.dot(a,a) >> >>>

Re: [Numpy-discussion] Optimized sum of squares

2009-10-18 Thread Skipper Seabold
On Sun, Oct 18, 2009 at 8:09 AM, Gael Varoquaux wrote: > On Sun, Oct 18, 2009 at 09:06:15PM +1100, Gary Ruben wrote: >> Hi Gaël, > >> If you've got a 1D array/vector called "a", I think the normal idiom is > >> np.dot(a,a) > >> For the more general case, I think >> np.tensordot(a, a, axes=somethin

Re: [Numpy-discussion] Optimized sum of squares

2009-10-18 Thread Gael Varoquaux
On Sun, Oct 18, 2009 at 09:06:15PM +1100, Gary Ruben wrote: > Hi Gaël, > If you've got a 1D array/vector called "a", I think the normal idiom is > np.dot(a,a) > For the more general case, I think > np.tensordot(a, a, axes=something_else) > should do it, where you should be able to figure out som

Re: [Numpy-discussion] Optimized sum of squares

2009-10-18 Thread Gary Ruben
Hi Gaël,

If you've got a 1D array/vector called "a", I think the normal idiom is

np.dot(a,a)

For the more general case, I think
np.tensordot(a, a, axes=something_else)
should do it, where you should be able to figure out something_else for
your particular case.

Gary R.

Gael Varoquaux wrote: >
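
Both suggestions are easy to check. A minimal sketch of my own (the
"something_else" shown here is only one possible choice, giving the total sum
of squares of an n-d array rather than the per-axis sums discussed later in
the thread):

import numpy as np

a = np.arange(6.0)

# 1-D case: the dot product of a vector with itself is the sum of squares.
print(np.dot(a, a), np.sum(a * a))                      # both 55.0

# One reading of "something_else": contracting over every axis collapses an
# n-d array to its total sum of squares in a single call.
b = np.arange(24.0).reshape(2, 3, 4)
print(np.tensordot(b, b, axes=b.ndim), np.sum(b * b))   # both 4324.0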

[Numpy-discussion] Optimized sum of squares (was: vectorized version of logsumexp? (from scipy.maxentropy))

2009-10-18 Thread Gael Varoquaux
On Sat, Oct 17, 2009 at 07:27:55PM -0400, josef.p...@gmail.com wrote:
> >> > Why aren't you using logaddexp ufunc from numpy?
> >> Maybe because it is difficult to find, it doesn't have its own docs entry.

Speaking of which... I thought that there was a readily-written,
optimized function (or uf
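
For context, the function being asked about computes log(sum(exp(a))) along an
axis without overflowing. A numpy-only sketch of my own (not code from the
thread) showing both the logaddexp ufunc reduction mentioned above and the
usual max-shift formulation:

import numpy as np

a = np.random.randn(5, 1000) * 500   # wide range: a naive np.exp(a) would overflow

# The logaddexp ufunc can be reduced along an axis directly:
via_ufunc = np.logaddexp.reduce(a, axis=-1)

# The usual hand-rolled, vectorized logsumexp (max-shift trick), shown for
# comparison; subtracting the maximum keeps every exponent non-positive.
a_max = a.max(axis=-1, keepdims=True)
via_shift = np.log(np.exp(a - a_max).sum(axis=-1)) + a_max[..., 0]

print(np.allclose(via_ufunc, via_shift))   # True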