-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

On Wednesday 05 February 2003 05:58 am, Gordan Bobic wrote:
> On Tue, 4 Feb 2003, bdonlan wrote:
> > > 2.1) We care about supporting browsers that don't support gzip
> > > compressed pages. Therefore, there is a requirement for a gzip
> > > decompressor in fproxy, so that it can uncompress the document for the
> > > browser that doesn't support the standard compression method. Fproxy
> > > decompresses the gzipped document, omits the Content-Encoding header,
> > > and passes back the plain text file.
> >
> > There's an easy-to-use Java class for gzip decoding/encoding built into
> > the Java Runtime classes, but this would break compatibility. Maybe
> > support for reading them could be added, then on the next forced upgrade
> > add default compression?
>
> It would break compatibility ONLY if it was used. It would be up to the
> site authors to decide for themselves if they were more interested in
> performance or compatibility. The difference is that the file would
> potentially come down as gzipped without the Content-Encoding header being
> set. Obviously, this would come up as garbage in the browser.

Once again, implement the decoder, and do the automatic encoder after the next 
forced upgrade.
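For reference, a minimal sketch of that decoder using the GZIPInputStream/
GZIPOutputStream classes from java.util.zip (the standard runtime classes
mentioned above). The class and method names here are illustrative, not
existing fproxy code:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipPassThrough {

    // Decompress a gzipped byte stream back to plain bytes, as fproxy
    // would before handing the page to a browser without gzip support.
    public static byte[] gunzip(byte[] compressed) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPInputStream in =
                 new GZIPInputStream(new ByteArrayInputStream(compressed))) {
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
        return out.toByteArray();
    }

    // Compress plain bytes, as a transparent insert-time encoder might.
    public static byte[] gzip(byte[] plain) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(out)) {
            gz.write(plain);
        }
        return out.toByteArray();
    }
}
```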

> > > 2.2) We DON'T care about browsers that don't support the gzip encoding.
> > > This is probably safe, because all commonly used browsers support this.
> > > In this case, the fproxy modifications would be much smaller. All it
> > > would have to do is look up the compression encoding on the file in the
> > > headers once it has downloaded it to the local node, and if set to
> > > "gzip", it would only have to pass back the standard headers, add the
> > > "Content-Encoding: gzip" header, and pass back the compressed file. It
> > > would only have to look up the file encoding, and set a header
> > > accordingly.
> >
> > Violates HTTP protocol by ignoring the Accept-Encoding header.
>
> No it doesn't. You don't have to gzip the response payload just because
> the browser supports it. The problem is the reverse: you could end up
> sending the data back gzipped when the browser hasn't declared its
> ability to understand gzipped content.
>
> Yes, it does break strict HTTP compatibility, but then again, fproxy
> doesn't support HEAD requests either, which I would consider to be a much
> more serious compatibility issue.
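To stay within HTTP here, fproxy would only need to check the browser's
Accept-Encoding header before deciding whether to pass the compressed file
through as-is. A rough sketch of that check (the helper name and the
simplified parsing, which ignores q-values, are my assumptions):

```java
public class EncodingCheck {

    // Returns true if the browser's Accept-Encoding header declares
    // support for gzip (or the wildcard "*"). A missing header means
    // the stored gzipped file must be decoded before being passed back.
    // Note: q-values (e.g. "gzip;q=0") are ignored for simplicity.
    public static boolean clientAcceptsGzip(String acceptEncoding) {
        if (acceptEncoding == null) {
            return false;
        }
        for (String token : acceptEncoding.split(",")) {
            String enc = token.split(";")[0].trim().toLowerCase();
            if (enc.equals("gzip") || enc.equals("x-gzip") || enc.equals("*")) {
                return true;
            }
        }
        return false;
    }
}
```

If the check fails, fproxy falls back to case 2.1: decompress locally and
omit the Content-Encoding header.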
>
> > > Obviously, changes to the insertion tools would be required, too...
> >
> > They *should* use freenet.client.*, but most don't. Those that do could
> > be easily updated by updating the insertion classes in freenet.jar.
>
> Hmm... In that case it shouldn't be as big a change as I initially
> thought. The question then is: do you compress each file by default,
> provided it is of a compressible MIME type (obviously, there is no point
> in compressing a zip file)? Or do you leave it to the user to compress
> each file separately before uploading it? The advantage of transparently
> compressing each file is that it would make a site easier to test locally
> before uploading it. But there is still the choice between doing it in
> the uploader program or in the node itself...
>
> Initially I thought about it as an optional, manual operation, but now I
> am leaning toward the benefits of the transparent operation.
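The compress-by-default decision could key off the file's MIME type; a
rough sketch, where the type list is purely illustrative and not any kind
of specification:

```java
public class CompressPolicy {

    // Decide whether a file is worth gzipping at insert time. Text-like
    // types compress well; formats that are already compressed (zip
    // archives, JPEG/PNG/GIF images, audio/video) would only waste CPU
    // for no size gain.
    public static boolean worthCompressing(String mimeType) {
        String t = mimeType.toLowerCase();
        if (t.equals("application/zip") || t.equals("application/gzip")) {
            return false;
        }
        if (t.startsWith("image/") || t.startsWith("audio/")
                || t.startsWith("video/")) {
            return false;
        }
        if (t.startsWith("text/") || t.equals("application/xml")) {
            return true;
        }
        // Unknown types: leave uncompressed rather than guess.
        return false;
    }
}
```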
>
> Regards.
>
> Gordan
>
>
> _______________________________________________
> Tech mailing list
> [EMAIL PROTECTED]
> http://hawk.freenetproject.org:8080/cgi-bin/mailman/listinfo/tech
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.2.1 (GNU/Linux)

iD8DBQE+QWcyx533NjVSos4RAn4uAJ48W15DoZil+EG9r0EF4umfyCvJIQCgnxgJ
8Py2xU8Qp2/NSu8u4PKuwTg=
=lSWY
-----END PGP SIGNATURE-----


_______________________________________________
Tech mailing list
[EMAIL PROTECTED]
http://hawk.freenetproject.org:8080/cgi-bin/mailman/listinfo/tech
