On 5/21/15 10:18 AM, "Márcio Martins" <marcio...@gmail.com> wrote:
On Thursday, 21 May 2015 at 15:17:03 UTC, Andrei Alexandrescu wrote:
[snip]

Could something like this work for the statistics part?

enum AdaptabilityWindow = 256; // rescale once this many samples accumulate
enum AdaptabilitySpeed  = 2;   // log2 of rescaling divisor

size_t hitCount, missCount;    // running frequencies over the recent window

     if (hit) ++hitCount;
     else ++missCount;

     // Periodically rescale so old history decays and the counters can't overflow.
     if ((hitCount + missCount) >= AdaptabilityWindow) {
         hitCount >>= AdaptabilitySpeed;
         missCount >>= AdaptabilitySpeed;
     }

     // hitCount / (hitCount + missCount) and missCount / (hitCount + missCount)
     // are a decent approximation of p(hit) and p(miss) given recent history.
     // If you can make decisions just by comparing them, you don't have to
     // perform expensive divisions.

This is based on re-scaling frequencies, similar to what you'd find in
model-based compression algorithms.

With it you could try to always keep a minimum reserve of blocks just in
case, and, when freeing a block of a fitting size, decide whether to keep it
in the free-list if the tendency is to allocate fitting sizes, or to return
it to the parent if the tendency is to allocate bad sizes.

To incrementally shrink a free-list that has grown too big, you could reduce
its size by a factor proportional to p(miss) - p(hit).
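A sketch of that incremental trimming, again with hypothetical names (length
being the current free-list length); (missCount - hitCount) / (hitCount +
missCount) stands in for p(miss) - p(hit) over the recent window:

     // Hypothetical sketch: number of blocks to hand back to the parent when
     // misses outweigh hits.
     size_t blocksToTrim(size_t length, size_t hitCount, size_t missCount)
     {
         immutable total = hitCount + missCount;
         if (total == 0 || missCount <= hitCount) return 0;
         // length * (p(miss) - p(hit)), computed with a single division.
         return length * (missCount - hitCount) / total;
     }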

Yah, I think that's a good idea. The size of the generated code was getting worrisome. Thanks! -- Andrei
