On Apr 25, 2011, at 5:41 PM, Adrian wrote:

> Sorry for the wait, this weekend got a little crazy. Here is a
> tar/zipped TurboGears project I created:
> http://a-p-w.com/tarfile-test.tar.gz
>
> I did the simplest possible thing ("paster quickstart") and followed
> the directions. Then I added a function to 'root.py' called
> 'returnImages' that just tars up 3 files (which I included in the
> tarball) and returns the bytestream for the tar file. I just tried
> this on my machine: I downloaded the file several times and watched
> the memory usage climb accordingly. It's set up to run on
> localhost:8080, so the page to go to is
> http://localhost:8080/returnImages
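Since root.py itself isn't quoted here, the tar-up-and-return step Adrian describes presumably looks something like this plain-Python sketch (the function name `build_tar_bytes` and the in-memory approach are my assumptions, minus the TurboGears controller plumbing):

```python
import io
import tarfile

def build_tar_bytes(paths):
    """Tar the given files entirely in memory and return the raw
    bytes -- roughly what a 'returnImages'-style controller would
    hand back as the response body (hypothetical sketch)."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tf:
        for p in paths:
            tf.add(p)
    # The whole archive lives in this one bytes object, so peak
    # memory grows with the total size of the files being served.
    return buf.getvalue()
```

Note that this buffers the complete archive before a single byte is sent, which is exactly the kind of transient peak that would show up as climbing memory per download.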
After fixing a missing import, I can run the example. But I don't see a leak. The memory usage increases and nears 1GB at times, but it always falls back to a baseline of around 600MB.

I re-did the experiment with TG20. Two observations:

 - the memory baseline was roughly half of what TG21 produced
 - the app seemed to run *much* faster

I don't have explanations for either phenomenon.

On a probably related, but more general note: even when Python collects garbage, and might even return memory to the OS, the memory consumption often doesn't visibly go down. The reason is that the OS may well choose to keep the peak memory assigned to the process, to avoid the cost of repeated memory-page assignment and some other funky stuff OSes do.

Another observation: the memory consumption was massively reduced when I just didn't return the actual data. So there might in fact be some lazy, eventually gc'ed data structure responsible for this high memory peak, and it might well be worth examining that. But it's not a leak.

Diez
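The "massively reduced when I didn't return the data" observation suggests the big transient is the fully-buffered archive itself. One way to avoid holding the whole payload at once is tarfile's sequential stream mode ("w|"), feeding chunks to a consumer as they are produced. This is an illustrative plain-Python sketch, not the TurboGears response API; `ChunkWriter` and `stream_tar` are made-up names:

```python
import tarfile

class ChunkWriter:
    """Minimal file-like object that forwards every write to a
    callback, so the tar stream can be consumed incrementally
    instead of being buffered whole in memory (hypothetical helper)."""

    def __init__(self, callback):
        self.callback = callback

    def write(self, data):
        self.callback(data)
        return len(data)

def stream_tar(paths, callback):
    # Mode "w|" produces a non-seekable sequential stream, so the
    # archive is emitted chunk by chunk rather than assembled first.
    writer = ChunkWriter(callback)
    with tarfile.open(fileobj=writer, mode="w|") as tf:
        for p in paths:
            tf.add(p)
```

In a WSGI-style app the callback would push chunks into the response iterable; peak memory then stays near one buffer's worth regardless of archive size, which would flatten exactly the peaks seen in the experiment.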

