On 29-Apr-09, at 5:49 PM, Dan Goodman wrote:

> Thanks David, that's nice but unfortunately that Python loop will kill
> me. I'm thinking about some simulation code I'm writing where this
> operation will be carried out many, many times, with large arrays I. I
> figure I need to keep the Python overheads to a fixed cost to get good
> performance.
I see. Well, keep in mind that the loop only scales with the number of unique elements in I, rather than the total number of elements. Depending on the typical distribution of elements in I, this might make it much less costly than you expect.

Have you considered coding up a looped version in Cython? If this is going to be a bottleneck, that would be very worthwhile.

Stéfan's code is clever, although as he points out, it will create an intermediate array of size (len(I))**2, which may end up being as much of a problem as a Python loop if you're allocating and garbage-collecting an N**2 array every time.

David

_______________________________________________
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion
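[Editor's note: the thread does not show the original code, but the operation under discussion (summing values into an output array at indices I, with repeats) can be sketched as below. The names `I`, `V`, and `n` are illustrative assumptions. The first version loops only over unique indices, as David describes; `np.bincount` with weights is a loop-free alternative in NumPy.]

```python
import numpy as np

def accumulate(I, V, n):
    """Sum V[k] into out[I[k]] for every k.

    The Python loop runs once per *unique* value in I, not once per
    element, so heavily repeated indices keep it cheap.
    """
    out = np.zeros(n)
    for idx in np.unique(I):
        # Sum all values whose target index is idx.
        out[idx] = V[I == idx].sum()
    return out

def accumulate_bincount(I, V, n):
    """Same result with no Python loop, via weighted bincount."""
    return np.bincount(I, weights=V, minlength=n)
```

Either version avoids the N**2 intermediate array of the broadcasting approach; `bincount` additionally avoids the per-unique-index Python overhead.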