On Wed, 21 Mar 2007 15:03:17 +0000, Tom Wright wrote: [snip]
> Ah, thanks for explaining that. I'm a little wiser about memory allocation
> now, but am still having problems reclaiming memory from unused objects
> within Python. If I do the following:
>
> (memory use: 15 MB)
> >>> a = range(int(4e7))
> (memory use: 1256 MB)
> >>> a = None
> (memory use: 953 MB)
>
> ...and then I allocate a lot of memory in another process (eg. open a load
> of files in the GIMP), then the computer swaps the Python process out to
> disk to free up the necessary space. Python's memory use is still reported
> as 953 MB, even though nothing like that amount of space is needed.

Who says it isn't needed? Just because *you* have only one object existing
doesn't mean the Python environment has only one object existing.

> From what you said above, the problem is in the underlying C libraries,

What problem? Nothing you've described seems like a problem to me. It
sounds like a modern, 21st-century operating system and programming
language working as they should. Why do you think this is a problem?

You've described an extremely artificial set of circumstances: you create
40,000,000 distinct integers, then immediately destroy them. The obvious
solution to that "problem" of Python caching millions of integers you
don't need is not to create them in the first place. In real code, the
chances are that if you created 4e7 distinct integers you'll probably need
them again -- hence the cache.

So what's your actual problem that you are trying to solve?

> but is there anything I can do to get that memory back without closing
> Python?

Why do you want to manage memory yourself anyway? It seems like a
horrible, horrible waste to use a language designed to manage memory for
you, then insist on overriding its memory management.

I'm not saying that there is never any good reason for fine control of the
Python environment, but this doesn't look like one to me.

-- 
Steven.

-- 
http://mail.python.org/mailman/listinfo/python-list
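[Editor's note: a minimal sketch of the "don't create them in the first place" suggestion above. It iterates lazily so only one integer exists at a time; in the Python 2 of this thread's era you would use `xrange` where `range` appears below, since Python 2's `range` builds the whole list in memory.]

```python
# Sketch: lazy iteration instead of materialising 40,000,000 ints.
# A generator expression produces values one at a time, so the process
# never needs ~1 GB to hold the whole sequence at once.
# (Python 2: substitute xrange for range to get the same lazy behaviour.)

def lazy_total(n):
    # Only one integer is alive at any moment during this sum.
    return sum(i for i in range(n))

print(lazy_total(100))  # 4950
```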