Antoine Pitrou <[EMAIL PROTECTED]> added the comment:

The problem with choosing a sensible freelist size is that we don't have
any reference workloads. However, I just tested with 10000 and it
doesn't seem to slow anything down anyway. It doesn't make our
microbenchmarks any faster either.

I thought the patch to compact freelists at each full gc collection had
been committed, but it doesn't seem to be there. Perhaps it will change
matters quite a bit. On the one hand, it will allow for bigger freelists
with fewer worries about degrading the memory footprint (but still,
potential cache pollution). On the other hand, the bigger the freelists,
the more expensive they are to deallocate.
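For illustration, here is a minimal sketch of the pattern under discussion:
a size-bounded freelist plus a compaction hook that a full gc collection
could call. This is not CPython's actual code; MAXFREE, the node type, and
the function names are all made up for the example.

```c
#include <stdlib.h>

/* Hypothetical cap on cached objects (the value being discussed). */
#define MAXFREE 10000

typedef struct node {
    struct node *next;
} node;

static node *freelist = NULL;
static int numfree = 0;

/* Allocate: reuse a cached node if one is available, else malloc. */
static node *
node_alloc(void)
{
    if (freelist != NULL) {
        node *n = freelist;
        freelist = n->next;
        numfree--;
        return n;
    }
    return malloc(sizeof(node));
}

/* Deallocate: cache the node unless the freelist is already full,
 * in which case it is returned to the allocator immediately. */
static void
node_dealloc(node *n)
{
    if (numfree < MAXFREE) {
        n->next = freelist;
        freelist = n;
        numfree++;
    }
    else {
        free(n);
    }
}

/* Compact: release every cached node.  This is the hook a full gc
 * collection could invoke; its cost grows with the freelist size,
 * which is the trade-off mentioned above. */
static void
node_compact_freelist(void)
{
    while (freelist != NULL) {
        node *n = freelist;
        freelist = n->next;
        free(n);
    }
    numfree = 0;
}
```

Note how compaction walks and frees the whole list, so the bigger the
freelist is allowed to grow, the more work each full collection does.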

__________________________________
Tracker <[EMAIL PROTECTED]>
<http://bugs.python.org/issue2013>
__________________________________
_______________________________________________
Python-bugs-list mailing list 