Hi all,
In the context of memory profiling an application (with the memory_profiler
module), we came across some strange behaviour in numpy; see for yourselves:
Line #    Mem usage    Increment   Line Contents
================================================
    29                             @profile
On Thu, May 16, 2013 at 8:35 AM, Martin Raspaud martin.rasp...@smhi.se wrote:
On 16/05/13 10:26, Robert Kern wrote:
Can anyone give a reasonable explanation ?
memory_profiler only looks at the amount of memory that the OS has
allocated to the Python process. It cannot measure the amount of
memory actually given to living objects. Python does not always return
memory to the OS when objects are freed.
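This point can be checked directly. The sketch below (my own illustration, not from the thread; Linux-specific, since it reads the resident set size from /proc) shows that the OS-level figure memory_profiler reports need not track live Python objects:

```python
import numpy as np

def rss_kb():
    # Resident set size of this process in kB, read from /proc (Linux only).
    with open("/proc/self/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])

before = rss_kb()
a = np.ones((1000, 1000))   # ~8 MB of float64, touched so it is resident
after_alloc = rss_kb()
del a                        # numpy frees the buffer immediately...
after_free = rss_kb()
# ...but whether the process RSS shrinks is up to the allocator/OS,
# and RSS is all that memory_profiler can observe.
print(before, after_alloc, after_free)
```

On a typical Linux box the allocation shows up as roughly an 8000 kB jump in RSS, while the effect of `del` depends on whether the allocator returns the pages to the OS.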
On Thu, May 16, 2013 at 1:32 PM, Martin Raspaud martin.rasp...@smhi.se wrote:
Hi everyone,
(this was posted as part of another topic, but since it was unrelated,
I'm reposting as a separate thread)
I've also been having issues with __array_priority__ - the following
code behaves differently for __mul__ and __rmul__:
import numpy as np

class TestClass(object):
    __array_priority__ = 10.0
    def __mul__(self, other): return "mul"
    def __rmul__(self, other): return "rmul"
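For reference, here is a self-contained sketch of the kind of check being discussed (the `Wrapper` class and the printed strings are my own illustration; the exact behaviour depends on the numpy version):

```python
import numpy as np

class Wrapper(object):
    # An __array_priority__ higher than ndarray's (0.0) asks numpy's
    # binary ops to return NotImplemented, so Python falls back to this
    # class's reflected operator instead of broadcasting over it.
    __array_priority__ = 15.0
    def __mul__(self, other):
        return "Wrapper.__mul__"
    def __rmul__(self, other):
        return "Wrapper.__rmul__"

a = np.arange(3)
w = Wrapper()
print(w * a)  # Python dispatches to Wrapper.__mul__ directly
print(a * w)  # ndarray.__mul__ should defer, landing in Wrapper.__rmul__
```

The asymmetry under discussion is exactly whether the second case defers as reliably as the first.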
On Thu, May 16, 2013 at 6:09 PM, Phillip Feldman
phillip.m.feld...@gmail.com wrote:
It seems odd that `nanmin` and `nanmax` are in NumPy, while `nanmean` is
in SciPy.stats. I'd like to propose that a `nanmean` function be added to
NumPy.
Have no fear. There are already plans for its inclusion.
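(`np.nanmean` did land in numpy 1.8.) The idea is simple enough to sketch with masking, in the same spirit as `nanmin`/`nanmax` — the function name below is my own, not the proposed API:

```python
import numpy as np

def nanmean_sketch(a, axis=None):
    # Mean ignoring NaNs: sum the non-NaN entries, divide by their count.
    a = np.asarray(a, dtype=float)
    mask = np.isnan(a)
    total = np.where(mask, 0.0, a).sum(axis=axis)
    count = (~mask).sum(axis=axis)
    return total / count

print(nanmean_sketch([1.0, np.nan, 3.0]))  # 2.0
```

Note this sketch divides by zero (yielding NaN, with a runtime warning) on an all-NaN slice, which is also what `np.nanmean` ultimately does.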