(hello group)

On Nov 9, 8:38 pm, "Klaas" <[EMAIL PROTECTED]> wrote:

> I was referring specifically to abominations like range(1000000)

However, there are plenty of valid reasons to allocate huge lists of
integers.   This issue has been worked on:
http://evanjones.ca/python-memory.html
http://evanjones.ca/python-memory-part3.html

My understanding is that the patch allows most objects to be released
back to the OS, but can't help the problem for integers.  I could be
mistaken.  But on a clean Python 2.5:

x=range(10000000)
x=None

The problem exists for floats too, so for a less contrived example:

import random
x=[random.weibullvariate(7.0,2.0) for i in xrange(10000000)]
x=None

Both leave the Python process bloated in my environment.   Is this
problem a good candidate for the FAQ?
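
A rough way to observe this is to watch the process's peak resident set
size around the allocation. This is just a sketch using the Unix-only
`resource` module (so it won't run on Windows), and on current Pythons
the free-list behaviour differs from 2.5, so the exact numbers will vary:

```python
import resource

def peak_rss():
    # Peak resident set size of this process so far
    # (kilobytes on Linux, bytes on macOS)
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

before = peak_rss()
x = list(range(1000000))   # allocate ~1 million int objects
x = None                   # drop the only reference
after = peak_rss()

# Peak RSS stays elevated after the list is released; the complaint
# above is that on Python 2.5 the *current* RSS stayed high too,
# because the integer blocks were never returned to the OS.
print(after - before)
```

Comparing current (not peak) RSS before and after the `x=None` is what
shows whether memory actually goes back to the OS.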

 --Joseph
