On Mon, Nov 14, 2011 at 12:46 PM, Andreas Müller
<amuel...@ais.uni-bonn.de> wrote:
> Hi everybody.
> When I did some normalization using numpy, I noticed that numpy.std uses
> more RAM than I was expecting.
> A quick Google search turned up this:
> http://luispedro.org/software/ncreduce
> The site claims that std and other reduce operations are implemented
> naively with many temporaries.
> Is that true? And if so, is there a particular reason for that?
> This issue seems quite easy to fix.
> In particular, the link I gave above provides code.
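
For context, the "many temporaries" complaint is that each step of a
naive std allocates a full-size intermediate array. A rough sketch of
that pattern (not NumPy's actual code path, just an illustration):

import numpy as np

x = np.random.rand(10000, 1000)

# Each of the steps below allocates a temporary the size of x,
# which is where the extra memory goes.
mean = x.mean(axis=0)            # small: one value per column
diff = x - mean                  # full-size temporary
sq = diff * diff                 # another full-size temporary
std = np.sqrt(sq.mean(axis=0))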

The provided code only implements a few special cases; being more
efficient in just those cases is indeed easy.
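
For the common case of a float array reduced along one axis, for
example, a lower-memory version can be written with in-place
operations, something like this (a sketch, not a drop-in replacement
for numpy.std; std_lowmem is a hypothetical name):

import numpy as np

def std_lowmem(x, axis=0):
    # Keep a single full-size temporary and reuse it in place,
    # instead of allocating a new one at every step.
    mean = x.mean(axis=axis)
    tmp = x - np.expand_dims(mean, axis)  # the one temporary
    np.multiply(tmp, tmp, out=tmp)        # square it in place
    return np.sqrt(tmp.mean(axis=axis))

Like numpy.std's default, this computes the population standard
deviation (ddof=0), but it only covers this one case rather than the
general reduce machinery.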

cheers,

David
_______________________________________________
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion
