Oh, I guessed your reasons. Languages like Python generate a lot of garbage, so a 16MB threshold will fill up pretty fast as long as the program does anything at all. What I mean to say is that there's gotta be a more clever way, where GC thresholds depend on e.g. the size of the working set, the rate of new allocations, or something smarter still.
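To make the idea concrete, here's a minimal sketch of such an adaptive policy: keep the fixed minimum as a floor, but after each major collection scale the next trigger to the size of the surviving heap. All names here are illustrative, not PyPy's actual GC interface, and the growth factor is just an assumed tuning knob.

```python
class AdaptiveThreshold:
    """Toy model of a GC trigger that adapts to the live heap size."""

    def __init__(self, min_threshold=16 * 1024 * 1024, growth=2.0):
        self.min_threshold = min_threshold  # floor, e.g. the current 16MB
        self.growth = growth                # how much garbage to tolerate
        self.threshold = min_threshold

    def should_collect(self, allocated_bytes):
        # Trigger a major collection once bytes allocated since the
        # last collection reach the current threshold.
        return allocated_bytes >= self.threshold

    def after_collection(self, live_bytes):
        # Scale the next threshold to the surviving working set: a
        # small script keeps a small heap, while a program with a
        # large live heap avoids collecting too often.
        self.threshold = max(self.min_threshold,
                             int(live_bytes * self.growth))
```

So a script whose live data stays tiny would keep collecting at the 16MB floor, while one holding 100MB of live objects would not collect again until another 200MB had been allocated.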
d.

On 5 April 2011 12:44, Armin Rigo <ar...@tunes.org> wrote:
> Hi Dima,
>
> On Tue, Apr 5, 2011 at 9:26 PM, Dima Tisnek <dim...@gmail.com> wrote:
>> What I'm trying to say, is gc should adapt to run-time behaviour of a
>> particular script, in some cases 16MB heap threshold would impact both
>> user expectation and performance significantly.
>
> It may impact user expectation and performance, but:
>
> User expectation: so far we've considered the "desktop" users, which
> do not care about 10MB versus 20MB but start to care when it's about
> 300MB versus 600MB.  It's true that other categories of users exist.
> Sorry for not being able to care for all possible use cases at once
> :-)
>
> Performance: actually this setting of ours -- not collecting before we
> have at least 16MB of data -- was done to avoid degradation of
> performance in cases where the Python script has really low memory
> usage, so the idea "it's consuming 30MB instead of 5MB so it must have
> a terrible performance" sounds pretty abstract.  But again this is
> assuming a system where 30MB is a small fraction of the total amount
> of RAM.
>
> If you want to care about the use case of, say, systems with 16MB of
> RAM in total, then feel free to tweak PyPy.  I suppose that you'd get
> the best results by tweaking one of our GCs, or writing a new one.
> (For example it probably doesn't make much sense to have a nursery at
> all.)
>
>
> I hope this helps to clarify the issue,
>
> Armin

_______________________________________________
pypy-dev@codespeak.net
http://codespeak.net/mailman/listinfo/pypy-dev