Stefano Mazzocchi <[EMAIL PROTECTED]> wrote:

> NOTE: this is a refactoring of an email I wrote 2 1/2 years 
> ago. 

<snip entire contents/>

Stefano,  

I had a little more time to read this last week:

Memphis got hit by some 85 mph winds and we've had no power at our house
for a week now, but I have a printed copy...

(See
http://slideshow.gomemphis.com/slideshow.cfm?type=GOMEMPHIS&ID=stormpics&NUM=8
for an idea of what we got hit with; as of this writing, slides 50, 51,
and 57 are from just down the block. We had one fence broken, some
torn-up screens, and one air conditioner damaged, but amazingly no other
damage to our house.)

The little reading I fit in between moving tree branches off our
property triggered more ancient memories of paging algorithms. Instead
of trying to dig up my old notes and books, I did a Google search on
"paging algorithm" and found lots of good references that I think may be
of some use. In particular, note:

        http://www.cs.duke.edu/~jsv/Papers/catalog/node93.html

Essentially, you can treat Cocoon as having multiple applications
(pipelines) producing working sets (SAX streams), and then directly
apply the tons of research done in this area. You do have to introduce
the concept of a "page size", but this can be arbitrarily small, or
perhaps using some (average?) OS page size would be reasonable. A rough
sketch of what I mean is below.
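
Just to make the analogy concrete, here's a quick sketch in Java (all
the names are made up; this is not the real Cocoon cache API) of a
cache that measures each pipeline's working set in fixed-size pages and
evicts whole SAX streams LRU-style once the resident page count goes
over a limit:

import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

public class PipelineCache {

    // Arbitrary "page size"; an (average) OS page of 4 KB is as good a
    // first guess as any.
    private static final int PAGE_SIZE = 4 * 1024;

    // Maximum number of pages kept resident across all pipelines.
    private final int maxPages;
    private int residentPages = 0;

    // Access-ordered LinkedHashMap iterates least-recently-used first,
    // which gives us the LRU replacement order for free.
    private final Map<String, byte[]> cache =
            new LinkedHashMap<String, byte[]>(16, 0.75f, true);

    public PipelineCache(int maxPages) {
        this.maxPages = maxPages;
    }

    // Number of pages a serialized SAX stream occupies.
    private static int pagesFor(byte[] stream) {
        return (stream.length + PAGE_SIZE - 1) / PAGE_SIZE;
    }

    // Store the serialized SAX events produced by one pipeline run and
    // evict least-recently-used streams until we are back under the limit.
    public void put(String pipelineKey, byte[] serializedSaxStream) {
        byte[] old = cache.put(pipelineKey, serializedSaxStream);
        if (old != null) {
            residentPages -= pagesFor(old);
        }
        residentPages += pagesFor(serializedSaxStream);
        Iterator<Map.Entry<String, byte[]>> it = cache.entrySet().iterator();
        while (residentPages > maxPages && it.hasNext()) {
            Map.Entry<String, byte[]> eldest = it.next();
            residentPages -= pagesFor(eldest.getValue());
            it.remove();
        }
    }

    // Returns the cached stream, or null on a "page fault".
    public byte[] get(String pipelineKey) {
        return cache.get(pipelineKey);
    }
}

The choice of PAGE_SIZE only changes the granularity of the accounting;
the replacement policy itself doesn't care whether a page is 512 bytes
or 4 KB.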

If you do the Google search you'll notice the references to randomized
paging algorithms. I didn't chase these very far, other than to
determine that at least one author shows they can perform as well as
the conventional algorithms, but not as well as the theoretical best. A
sketch of the basic idea is below.
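
For reference, the classic randomized "marking" strategy is simple
enough to sketch in a few lines of Java (again, made-up names, nothing
Cocoon-specific): a hit marks the page, a fault evicts a uniformly
random unmarked page, and once every resident page is marked the marks
are cleared and a new phase begins.

import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Random;
import java.util.Set;

public class RandomizedMarkingCache {

    private final int capacity;                        // "k" page frames
    private final Map<String, byte[]> pages = new HashMap<String, byte[]>();
    private final Set<String> marked = new HashSet<String>();
    private final Random random = new Random();

    public RandomizedMarkingCache(int capacity) {
        this.capacity = capacity;
    }

    // A hit marks the page; null means a fault, and the caller should
    // regenerate the stream and call put().
    public byte[] get(String key) {
        byte[] value = pages.get(key);
        if (value != null) {
            marked.add(key);
        }
        return value;
    }

    public void put(String key, byte[] value) {
        if (!pages.containsKey(key) && pages.size() >= capacity) {
            evictRandomUnmarkedPage();
        }
        pages.put(key, value);
        marked.add(key);                               // new pages start marked
    }

    private void evictRandomUnmarkedPage() {
        List<String> unmarked = new ArrayList<String>();
        for (String key : pages.keySet()) {
            if (!marked.contains(key)) {
                unmarked.add(key);
            }
        }
        if (unmarked.isEmpty()) {                      // everything marked: new phase
            marked.clear();
            unmarked.addAll(pages.keySet());
        }
        String victim = unmarked.get(random.nextInt(unmarked.size()));
        pages.remove(victim);
        marked.remove(victim);
    }
}

The randomness is the point: no fixed request sequence can reliably
force the worst case the way it can against a deterministic policy like
plain LRU.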

