Is there any way that HTTP compression could be added to wget natively? http://en.wikipedia.org/wiki/HTTP_compression
I use wget to download lots of HTML pages, and downloading them uncompressed wastes a lot of bandwidth for both me and the server. For example, a 2 GB uncompressed HTML download would take only about 400 MB compressed with gzip. I know that setting --header="Accept-Encoding: gzip" gets the server to return gzipped HTML, but wget has no internal decompressor, so it cannot parse the downloaded content for further links. This greatly reduces wget's functionality when using that method.

HTTP compression would be a very useful feature, and it would make wget much friendlier to webmasters by saving a lot of bandwidth.

I understand that wget is made by people freely donating their own time; I am not demanding anything, merely putting forward a suggestion the developers might like. I hope I have come across as civil and polite. Thank you for your time.
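For reference, here is a rough sketch of the workaround I described and why it breaks recursion (the URL is just a placeholder). The file wget saves is raw gzip data rather than HTML, so wget's link parser cannot follow the <a href> links inside it; the links only become visible after a manual decompression step, by which point the crawl is over:

```shell
# Ask the server for gzip-encoded HTML; wget writes the raw gzip
# bytes to disk without decompressing them:
#   wget --header="Accept-Encoding: gzip" -O page.html.gz http://example.com/

# Simulate the saved result locally: the file on disk is gzip data,
# not HTML, so a recursive wget cannot parse it for more links.
printf '<a href="next.html">next</a>' | gzip > page.html.gz

# Manual decompression recovers the HTML, but only after the fact.
gunzip -c page.html.gz
# prints: <a href="next.html">next</a>
```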