Joseph Solbrig wrote:
> PPS. And of course, if someone posted a complete daily index for Usenet as
> a key and others actually put posts on, you wouldn't get 105 GB of
> transmission, because no one would request that all the articles be sent to
> their machine (they probably couldn't, since one request has to end before
> another begins). So Freenet would be much more efficient at just the old
> stuff, unless folks want to force a highly inefficient protocol to ride on
> top of the better stuff, i.e. put Usenet in the Freenet protocol itself.

Yes, you are right. The 105 GB of Usenet would be the combined bandwidth.
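For perspective, a minimal back-of-envelope sketch of what that combined volume would mean per node. The node count and replication factor below are illustrative assumptions of mine, not figures from this thread, and I take the 105 GB as a daily total since the quoted index is daily:

# Back-of-envelope: per-node share of Usenet's combined volume on Freenet.
# All parameters below are illustrative assumptions, not measured figures.

USENET_GB_PER_DAY = 105        # combined daily volume quoted in the thread
NODE_COUNT        = 100_000    # assumed number of participating Freenet nodes
REPLICATION       = 5          # assumed average number of copies per insert

def per_node_load_gb(total_gb, nodes, copies):
    """Average daily storage/transfer per node if inserts spread evenly."""
    return total_gb * copies / nodes

if __name__ == "__main__":
    load = per_node_load_gb(USENET_GB_PER_DAY, NODE_COUNT, REPLICATION)
    print(f"~{load * 1024:.1f} MB/day per node "
          f"({NODE_COUNT} nodes, replication factor {REPLICATION})")

Under those assumptions each node carries only a few MB of Usenet traffic per day; the figure scales linearly with the replication factor and inversely with the node count.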
My fundamental concern is the scalability of Freenet. Does Freenet scale to the extent that it can handle the bandwidth of Usenet? Can Freenet contain the more than 240,000,000 HTML pages of the WWW (as estimated by S. Lawrence and C. L. Giles)? If Mozilla came with a standard Freenet:// implementation, would the more than 1,000,000 users (my personal estimate) be able to query Freenet with acceptable delays? When Freenet is applied on such a large scale, what would the bandwidth be for each node when relaying queries under the proposed methods? (A rough sketch of that last question follows below.)

Can anyone give an answer to these questions? Are people planning simulations, or do we want to use the trial-and-error method?

Just my 2 eurocents,

Johan.
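As a rough illustration of the per-node relay question above, here is a sketch under entirely assumed parameters; the queries per user, hop count, and message size are my own guesses, not numbers from this thread or from the Freenet design papers:

# Back-of-envelope: per-node bandwidth spent relaying Freenet queries.
# Every parameter is an assumption chosen only to show the arithmetic.

USERS             = 1_000_000   # user/node count taken from the post above
QUERIES_PER_DAY   = 50          # assumed queries issued per user per day
HOPS_PER_QUERY    = 20          # assumed average nodes a query passes through
BYTES_PER_MESSAGE = 500         # assumed size of one query/relay message

def relay_bytes_per_node_per_day(users, queries, hops, msg_bytes):
    """Total relay traffic spread evenly over all nodes (nodes == users here)."""
    total = users * queries * hops * msg_bytes
    return total / users          # simplifies to queries * hops * msg_bytes

if __name__ == "__main__":
    b = relay_bytes_per_node_per_day(USERS, QUERIES_PER_DAY,
                                     HOPS_PER_QUERY, BYTES_PER_MESSAGE)
    print(f"~{b / 1024:.0f} KB/day of query-relay traffic per node "
          f"(excluding the data transfers those queries trigger)")

With these guesses the pure query-relay load is small; the open question is how the hop count and the triggered data transfers behave as the network grows, which is exactly what a simulation would need to measure.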
