On Thursday 25 October 2001 13:49, you wrote:
> In local.freenet, you wrote:
> > It occurred to me that there might be some benefit to inserting freesites
> > as a single redundant splitfile containing an archive of the site. (Or
> > two archives - one for the static portion and one for today's insert).
>
> Can you think up a reason why this isn't done with current webservers?
> And do these reasons still hold for Freenet? (You want to retrieve them
> monolithically, too, even though you only mention inserting above, right?)

Yes, I anticipated downloading the whole archive - I don't see how it could 
work otherwise. 

The disadvantage is obviously that not all the data in a site archive 
might be needed to serve a given request, so some unnecessary data 
transfer would take place, and the latency of fetching the initial page 
of a site could increase. On the other hand, the added redundancy should 
improve the chances of receiving a complete site, and might improve 
latency for receiving *all* of the site.

The extra download work can be minimised by choosing the granularity of 
archives according to site structure - I wouldn't expect a site sharing a 
load of 600Mb ISO images to bundle them into a single archive with a couple 
of hundred kb of site HTML + graphics, for example. I'm not proposing that 
we tar and gzip the whole of freenet :-)
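To make the granularity idea concrete, here's a rough sketch of the kind of 
split I have in mind - one archive for the static portion, one for today's 
insert, with the big files left out to be inserted separately. The directory 
names and layout are just assumptions for illustration, not anything Freenet 
does today:

```shell
# Hypothetical freesite layout: small HTML/graphics plus one large ISO.
# Sketch only -- the two-archive split and all paths are assumptions.
set -e
site=$(mktemp -d)
mkdir -p "$site/static" "$site/2001-10-25"
echo '<html>index</html>' > "$site/static/index.html"
echo 'news of the day'    > "$site/2001-10-25/news.html"
# Stand-in for a 600Mb ISO (kept small here):
dd if=/dev/zero of="$site/big.iso" bs=1024 count=1 2>/dev/null

# Bundle the small site files into two archives, leaving the ISO out
# to be inserted as its own splitfile:
tar -czf "$site/site-static.tar.gz" -C "$site" static
tar -czf "$site/site-today.tar.gz"  -C "$site" 2001-10-25

# Each archive would then be inserted as a single redundant splitfile.
tar -tzf "$site/site-static.tar.gz"
```

The point is just that the bundling boundary follows the site's structure: 
the HTML + graphics travel together, while anything huge stays outside.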

I'm happy to have a go at this if people think it might be useful. I can try 
some tests with konqueror (which can open websites inside .tar.gz 
transparently anyway) to see whether sites inserted this way seem more 
reliable.

degs




_______________________________________________
Devl mailing list
Devl at freenetproject.org
http://lists.freenetproject.org/mailman/listinfo/devl