On Thursday 24 July 2003 12:02 pm, Gordan wrote:
> On Thursday 24 July 2003 17:06, Tom Kaitchuck wrote:
> > This is not true if all the Freesite insertion utilities do it properly.
> > Toad said that it only supports up to 1MB after compression. This means
> > that the entire container will ALWAYS be a single file. (Never broken up
> > into chunks.)
>
> I understand that, but even so, it means that to view even the front page,
> you have to download a 1 MB file.

I doubt that just the HTML for most sites would ever approach 1MB.
Especially if it was zipped.
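Just to put a number on it, here is a quick way to check for any given
site. This is only a rough sketch using plain java.util.zip, not code
from the node or from any existing insertion utility, and the "mysite"
directory name is just a placeholder:

    import java.io.ByteArrayOutputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipOutputStream;

    public class HtmlContainerCheck {
        // The 1MB-after-compression container limit quoted above.
        static final long CONTAINER_LIMIT = 1024 * 1024;

        public static void main(String[] args) throws IOException {
            // "mysite" stands in for the local copy of the freesite.
            File siteDir = new File(args.length > 0 ? args[0] : "mysite");

            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            ZipOutputStream zip = new ZipOutputStream(buf);
            addHtml(siteDir, "", zip);
            zip.close();

            System.out.println("Compressed HTML: " + buf.size() + " bytes");
            System.out.println(buf.size() <= CONTAINER_LIMIT
                    ? "Fits in a single container."
                    : "Too big; the utility would have to split it.");
        }

        // Recursively add every .html file under dir to the zip.
        static void addHtml(File dir, String prefix, ZipOutputStream zip)
                throws IOException {
            File[] children = dir.listFiles();
            if (children == null) return;
            for (File f : children) {
                if (f.isDirectory()) {
                    addHtml(f, prefix + f.getName() + "/", zip);
                } else if (f.getName().endsWith(".html")) {
                    zip.putNextEntry(new ZipEntry(prefix + f.getName()));
                    FileInputStream in = new FileInputStream(f);
                    byte[] chunk = new byte[8192];
                    int n;
                    while ((n = in.read(chunk)) != -1) zip.write(chunk, 0, n);
                    in.close();
                    zip.closeEntry();
                }
            }
        }
    }

If the number does come out over the limit, that is exactly the case
where the utility should split the container by depth, as described
further down.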
> 1) Not everybody is on broadband
> 2) I am not sure Freenet speed is good enough to deal with that
> 3) Even if Freenet can deliver on this kind of bandwidth, it will create
> huge amounts of totally unnecessary network traffic. I don't know about
> your node, but mine has always eaten all bandwidth allocated to it very,
> very quickly.

If it is done like I describe, how is there ANY more unnecessary traffic
than before? The only thing that is new is that when you request a site
you get all of its HTML, not just the first page. That isn't much data,
and given that on Freenet the transfer time for a key is small compared
to the delay before you get it, this is not a bad thing. It would greatly
improve the user's experience and let Freesites consist of many small
pages instead of one huge page full of #links.

> While it may improve reliability of some sites in some cases, where they
> may not work 10% of the time for 10% of the people, it seems that it will
> create more problems than it will solve. The concept of unique files by key
> (i.e. no duplication) was very good, and now we are effectively destroying
> it.
>
> > The proper way for utilities to handle this, IMHO, would be to put all
> > the HTML into one container. If it doesn't fit, or doesn't have enough
> > "headroom" for edition sites, split it based on depth, so that there are
> > two zips of about the same size. Then make a separate zip for images. The
> > only images that should be included are those that appear on multiple
> > pages, or on the main page, excluding active links. They should be sorted
> > based on size, and then only the smallest 1MB worth should be zipped.
> > There is never any real reason for a single edition site to have more
> > than one image zip.
>
> What if somebody has more than 1 MB worth of images on their front page?
> They are not going to compress, so that benefit goes out the window, and
> they will not fit in the archive, so that goes out too.

That is true, but it is intentional. If someone has many big images, or
one huge image, there is no point in including them in the archive: as
you said, they don't compress, and you would have to do a splitfile
anyway. Those get inserted on the network separately, just as they are
now. The point of a zip is to let you have many small files without
fetching each one individually. (There is a rough sketch of the selection
step at the bottom of this mail.)

> > Then for dbr and edition sites, the utility should save a list of the
> > files that were previously zipped together. This way it is sure to do it
> > the same way every time, and it can add another zip if there are enough
> > new images.
>
> Are you suggesting that a new version of the site requires the old
> version's archives to be loaded? Surely not...

I am suggesting that the image zips be inserted under a CHK that all
versions of the site can reference. Or, if the author decides to change
the layout, they simply upload a new zip. The point is that all versions
of a site, and even multiple different sites, can share a single zip
theme.
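To make the "smallest 1MB worth" rule concrete, here is a rough sketch of
the selection step. The class and method names are made up, the 1MB
budget is simply the figure quoted above, and the candidate list is
assumed to already exclude activelinks and images that only appear on one
deep page; this is not code from any existing insertion utility:

    import java.io.File;
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.Comparator;
    import java.util.List;

    public class ImageZipSelector {
        // Rough budget for the single shared image zip (about 1MB of data).
        static final long IMAGE_ZIP_BUDGET = 1024 * 1024;

        /**
         * Given the candidate images (those used on the main page or on
         * several pages, activelinks already excluded), return the ones
         * that should go into the shared image zip. Everything else is
         * inserted on its own, as a splitfile, exactly as it is today.
         */
        public static List<File> pickForZip(List<File> candidates) {
            List<File> sorted = new ArrayList<File>(candidates);
            // Smallest first: small files gain the most from bundling,
            // since the delay before a key arrives dwarfs its transfer time.
            Collections.sort(sorted, new Comparator<File>() {
                public int compare(File a, File b) {
                    return Long.compare(a.length(), b.length());
                }
            });

            List<File> zipped = new ArrayList<File>();
            long used = 0;
            for (File img : sorted) {
                if (used + img.length() > IMAGE_ZIP_BUDGET) {
                    break; // everything from here on is at least as big
                }
                zipped.add(img);
                used += img.length();
            }
            return zipped;
        }
    }

The list it returns is also what the utility would save for the next
edition, so every edition packs the same files and the zip only needs to
be inserted once, under the CHK that they all reference.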