Hi everybody,

While doing some normalization with NumPy, I noticed that numpy.std uses more RAM than I was expecting. A quick Google search turned up this: http://luispedro.org/software/ncreduce The site claims that std and other reduce operations are implemented naively, with many temporaries. Is that true? And if so, is there a particular reason for it? The issue seems quite easy to fix; in particular, the link above provides code.

Cheers,
Andy
_______________________________________________
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion
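[For readers following along: the extra RAM comes from full-size temporaries such as `a - a.mean()` and its square. A sketch of the temporary-avoiding alternative the ncreduce page describes is a chunked two-pass std; this is an illustration only, not NumPy's or ncreduce's actual implementation, and the names `std_chunked` and `chunk` are made up here.]

```python
import numpy as np

def std_chunked(a, chunk=65536):
    """Two-pass standard deviation (ddof=0, like np.std's default),
    accumulated chunk by chunk so temporaries are at most `chunk` long."""
    a = np.asarray(a, dtype=np.float64).ravel()
    n = a.size
    # First pass: the mean, summed in chunks.
    total = 0.0
    for i in range(0, n, chunk):
        total += a[i:i + chunk].sum()
    mean = total / n
    # Second pass: sum of squared deviations; the only temporary
    # is the chunk-sized `d`, instead of two full-size arrays.
    ssd = 0.0
    for i in range(0, n, chunk):
        d = a[i:i + chunk] - mean
        ssd += np.dot(d, d)
    return np.sqrt(ssd / n)
```

A naive `np.sqrt(((a - a.mean()) ** 2).mean())` allocates two arrays the size of `a`; the chunked version's peak overhead is bounded by the chunk size.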
- [Numpy-discussion] Memory hungry reduce ops in Numpy Andreas Müller