[snip]
> 
> 
> What about putting 180 meg of the most recent keys
> in the datastore onto 
> the cd.  These are likely to include the most
> popular content.  Then, 
> when you import the data, only overwrite (bump out)
> older keys.  The 
> import could report something like "X meg" or "Y %
> of the keys on the cd 
> were imported".  When this number approaches zero,
> it's time to throw away 
> the cd.
> 
>               Jeff
> 

I really like this idea =) That seems like a sane way
of handling it. There might be concerns with always
burning brand-new data from the datastore, though -
is there anyone more familiar with how to avoid
giving away too much information who can comment on
this? And how about plausible deniability: if you
only import new content, does that mean that the
transient node operator can't effectively claim that
an older key was not requested by himself?
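
Just to make sure I understand the import step Jeff
is describing, here is roughly how I picture it. The
class and method names below (CdKeyEntry, Datastore,
oldestTimestamp(), ...) are made up for illustration
only, not fred's actual API:

import java.util.List;

/** Rough sketch of Jeff's import-and-report idea (hypothetical API). */
public class CdImport {

    /** One key as it appears on the CD: identifier, payload, store time. */
    public static class CdKeyEntry {
        final String key;
        final byte[] data;
        final long timestamp;          // when the key was stored, in millis

        CdKeyEntry(String key, byte[] data, long timestamp) {
            this.key = key;
            this.data = data;
            this.timestamp = timestamp;
        }
    }

    /** Minimal view of the local datastore (made-up interface). */
    public interface Datastore {
        boolean contains(String key);
        long oldestTimestamp();                // age of the entry that would be bumped out
        void insert(String key, byte[] data);  // bumps out the oldest entry if the store is full
    }

    /**
     * Import everything on the CD that is newer than what it would bump
     * out, then report how much of the CD was actually used.  When the
     * percentage approaches zero, the CD is stale and can be thrown away.
     */
    public static void importFromCd(List<CdKeyEntry> cdContents, Datastore store) {
        long totalBytes = 0;
        long importedBytes = 0;

        for (CdKeyEntry entry : cdContents) {
            totalBytes += entry.data.length;

            // Skip keys we already have, and keys older than what they would replace.
            if (store.contains(entry.key) || entry.timestamp <= store.oldestTimestamp()) {
                continue;
            }

            store.insert(entry.key, entry.data);
            importedBytes += entry.data.length;
        }

        double percent = totalBytes == 0 ? 0.0 : (100.0 * importedBytes) / totalBytes;
        System.out.printf("Imported %d of %d bytes (%.1f%%) from the CD%n",
                importedBytes, totalBytes, percent);
    }
}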

Also, how about protecting the sneakernet media
carrier? Is the previously mentioned method of just
copying the keys off the datastore sufficient? I'm
not familiar enough with what is actually stored in
the datastore to make a judgement one way or the
other.

Internally in fred, I would assume that importing a
key would be fairly trivial - though probably not as
trivial as just dropping the file into the datastore
directory from the OS. Does fred keep track of
everything in the datastore?
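
My guess at why just dropping files into the
directory would not be enough: if the node keeps some
kind of index (an LRU list, size accounting, and so
on) over the files on disk, then a raw copy bypasses
that bookkeeping and the node never notices the key.
Something along these lines - again, all names made
up, not how fred actually does it:

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

/**
 * Toy illustration of why a datastore import probably has to go through
 * the node rather than the filesystem: the node keeps its own index and
 * size accounting over the files, so a raw file copy is invisible to it.
 * All names here are made up, not fred's actual classes.
 */
public class ToyDatastore {

    private final File directory;
    private final Map<String, Long> index = new HashMap<>();  // key -> size on disk
    private long totalBytes = 0;

    public ToyDatastore(File directory) {
        this.directory = directory;
        directory.mkdirs();
    }

    /** The "right" way in: writes the file AND updates the index/accounting. */
    public void importKey(String key, byte[] data) throws IOException {
        try (FileOutputStream out = new FileOutputStream(new File(directory, key))) {
            out.write(data);
        }
        index.put(key, (long) data.length);
        totalBytes += data.length;
    }

    /** Lookups only consult the index - a file copied in behind the node's
        back would return false here even though it exists on disk. */
    public boolean contains(String key) {
        return index.containsKey(key);
    }

    public long size() {
        return totalBytes;
    }
}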

Tyler


