The FLOG Index wrote:

> Something along the lines of taking 180 megabytes (or whatever the
> media size was) worth of keys from the datastore, at random, and
> putting them on the media. When you give the media to someone else,
> they import the keys (also possibly at random, if it does not make
> sense to import the entire cd). This would mix things up and give a
> person running a node even more plausible deniability that they did
> not request the content they are hosting. This might even work in
> the case of a transient node: if they participate in sneakernet,
> their node will be full of all kinds of stuff they never requested.
>
> This is where the garbage collection of fragile media comes into
> play - if the cds lasted forever, they would keep pushing new
> content out of the node. But if cds die off at some interval, that
> effect would be limited. People could also look at the dates of the
> files on the cd and skip anything that was older than what they
> wanted. The way I see it, this adds more randomness to the network,
> as it now has an aspect that depends more on human behavior.
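
In rough Python, the random pick described above might look something
like this.  This is only a sketch: it assumes a one-file-per-key
datastore directory, and pick_random_keys / MEDIA_BYTES are made-up
names for illustration, not anything in the real Freenet store.

    import os
    import random

    MEDIA_BYTES = 180 * 1024 * 1024  # roughly one cd worth of keys

    def pick_random_keys(store_dir, budget=MEDIA_BYTES):
        """Pick keys at random until the media budget is spent."""
        names = os.listdir(store_dir)
        random.shuffle(names)
        chosen, used = [], 0
        for name in names:
            size = os.path.getsize(os.path.join(store_dir, name))
            if used + size > budget:
                continue  # no room left for this key; try smaller ones
            chosen.append(name)
            used += size
        return chosen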


What about putting 180 meg of the most recent keys in the datastore
onto the cd?  These are likely to include the most popular content.
Then, when you import the data, only overwrite (bump out) older keys.
The import could report something like "X meg" or "Y% of the keys on
the cd were imported".  When this number approaches zero, it's time
to throw away the cd.
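
Sketched the same way (same illustrative one-file-per-key layout,
with file mtimes standing in for key age), the import and its report
could look like:

    import os
    import shutil

    def import_cd(cd_dir, store_dir):
        """Import keys from the cd, only overwriting (bumping out)
        older local copies, then report how much of the cd was used."""
        total = imported = 0
        for name in os.listdir(cd_dir):
            src = os.path.join(cd_dir, name)
            dst = os.path.join(store_dir, name)
            total += 1
            # Keep any local key that is newer than the copy on the cd.
            if (not os.path.exists(dst)
                    or os.path.getmtime(src) > os.path.getmtime(dst)):
                shutil.copy2(src, dst)
                imported += 1
        pct = 100.0 * imported / total if total else 0.0
        print("%d of %d keys (%.0f%%) imported" % (imported, total, pct))
        return pct  # near zero: time to throw away the cd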

                Jeff



_______________________________________________
devl mailing list
devl@freenetproject.org
http://hawk.freenetproject.org/cgi-bin/mailman/listinfo/devl
