Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-05 Thread Tim Sweetman
Ben was also seeing: ... some of the problems caused by not having a (strict | anal | strong | paranoid | batshit ) type system. Certain types of bugs persist for far longer than they should in > 10 line Perl applications whereas a less laissez-faire type system would flush them out basically t

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-05 Thread David Cantrell
On Fri, Sep 05, 2003 at 10:59:31AM +0100, Ben wrote: > Well, that is true, but I'm also seeing some of the problems caused by not > having a (strict | anal | strong | paranoid | batshit ) type system. Certain > types of bugs persist for far longer than they should in > 10 line > Perl applicati

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-05 Thread Ben
On Fri, Sep 05, 2003 at 09:46:47AM +0100, Simon Wistow wrote: > On Thu, Sep 04, 2003 at 06:44:25PM +0100, Phil Lanch said: > > On Thu, Sep 04, 2003 at 03:40:18PM +0100, David Cantrell wrote: > > > It's just this sort of thing that makes me lurve perl. > > > > you mistyped "C++". > > Without getti

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-05 Thread Dominic Mitchell
Simon Wistow <[EMAIL PROTECTED]> wrote: > On a tangentially related note, I'm very rapidly starting to come to the > opinion that there are far too many applications that are written in > C/C++ which don't need to. I heartily agree. I think that the combination of a scripting language plus som

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-05 Thread Simon Wistow
On Thu, Sep 04, 2003 at 06:44:25PM +0100, Phil Lanch said: > On Thu, Sep 04, 2003 at 03:40:18PM +0100, David Cantrell wrote: > > It's just this sort of thing that makes me lurve perl. > > you mistyped "C++". Without getting into a flamewar, and whilst appreciating the benefits of compile time ge

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-04 Thread Phil Lanch
On Thu, Sep 04, 2003 at 06:57:41PM +0100, Shevek wrote: > On Thu, 4 Sep 2003, Phil Lanch wrote: > > you mistyped "C++". > > I consider myself to be a programmer. Having read this code, my only > possible response is, "You what?" > > AICMFP. sorry, i forgot to say: #include > > HTH. HAND. th

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-04 Thread Shevek
On Thu, 4 Sep 2003, Phil Lanch wrote: > On Thu, Sep 04, 2003 at 03:40:18PM +0100, David Cantrell wrote: > > It's just this sort of thing that makes me lurve perl. > > you mistyped "C++". I consider myself to be a programmer. Having read this code, my only possible response is, "You what?" AICMF

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-04 Thread Phil Lanch
On Thu, Sep 04, 2003 at 03:40:18PM +0100, David Cantrell wrote: > It's just this sort of thing that makes me lurve perl. you mistyped "C++". class fleeg { }; class quirka { public: quirka () { f = auto_ptr<fleeg> (new fleeg); } private: auto_ptr<fleeg> f; }; class miner { public: miner () { q = a

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-04 Thread Ben
On Thu, Sep 04, 2003 at 03:07:08PM +0100, Lusercop wrote: > On Thu, Sep 04, 2003 at 01:40:08PM +0100, Ben wrote: > > return foo; > > > > FAIL3: > > free(foo->quirka->fleeg); > > return NULL; > > FAIL2: > > free(foo->quirka); > > return NULL; > > FAIL1: > > free(foo);
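A minimal sketch of the goto-cleanup idiom quoted above, since the original message is truncated. The struct layouts and the new_foo() name are assumptions; this is the common fall-through variant, whereas the quoted fragment returns NULL from each label separately.

    #include <stdlib.h>

    /* Hypothetical nested structures, inferred from the quoted fragment. */
    struct fleeg  { int data; };
    struct quirka { struct fleeg *fleeg; };
    struct foo    { struct quirka *quirka; };

    /* Build the whole nest; on any failure, jump to a cleanup label and
     * fall through the frees in reverse order of allocation. */
    struct foo *new_foo(void)
    {
        struct foo *foo = malloc(sizeof *foo);
        if (foo == NULL)
            return NULL;

        foo->quirka = malloc(sizeof *foo->quirka);
        if (foo->quirka == NULL)
            goto FAIL1;

        foo->quirka->fleeg = malloc(sizeof *foo->quirka->fleeg);
        if (foo->quirka->fleeg == NULL)
            goto FAIL2;

        return foo;

    FAIL2:
        free(foo->quirka);
    FAIL1:
        free(foo);
        return NULL;
    }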

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-04 Thread David Cantrell
On Thu, Sep 04, 2003 at 03:07:08PM +0100, Lusercop wrote: > what's wrong with: > > | if(foo) { > | if(foo->quirka) { > | free(foo->quirka->fleeg); > | } > | free(foo->quirka); > | } > | free(foo); > > In the error condition? Gets a bit unwieldy if you have foo->quirka->fleeg->miner->wi
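To illustrate the "gets a bit unwieldy" point, here is the nested-if teardown from the quote taken one level deeper. The struct layout and the destroy_foo() name are assumptions; miner's contents are invented for the example.

    #include <stdlib.h>

    /* Hypothetical layout following the fragment's foo->quirka->fleeg->miner
     * chain; the real definitions aren't shown in the thread. */
    struct miner  { int data; };
    struct fleeg  { struct miner *miner; };
    struct quirka { struct fleeg *fleeg; };
    struct foo    { struct quirka *quirka; };

    /* Each extra member adds another level of indentation and another
     * guarded free(); free(NULL) is harmless, which keeps this legal. */
    void destroy_foo(struct foo *foo)
    {
        if (foo) {
            if (foo->quirka) {
                if (foo->quirka->fleeg) {
                    free(foo->quirka->fleeg->miner);
                }
                free(foo->quirka->fleeg);
            }
            free(foo->quirka);
        }
        free(foo);
    }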

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-04 Thread Lusercop
On Thu, Sep 04, 2003 at 01:40:08PM +0100, Ben wrote: > return foo; > > FAIL3: > free(foo->quirka->fleeg); > return NULL; > FAIL2: > free(foo->quirka); > return NULL; > FAIL1: > free(foo); > return NULL; > } > > With nested structures like these, this structur

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-04 Thread Ben
On Thu, Sep 04, 2003 at 01:03:16PM +0100, Lusercop wrote: > On Wed, Sep 03, 2003 at 11:48:19AM +0100, Simon Wistow wrote: > > I have to admit, I like gotos in C. This is not a winning testimonial > > though. I've been told that my C is like Object Orientated assembler > > which is fair enough bec

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-04 Thread Lusercop
On Wed, Sep 03, 2003 at 11:48:19AM +0100, Simon Wistow wrote: > I have to admit, I like gotos in C. This is not a winning testimonial > though. I've been told that my C is like Object Orientated assembler > which is fair enough because I learnt C after I'd learnt 68k. Hmmm, I like gotos too, but

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-03 Thread Chris Devers
On Wed, 3 Sep 2003, muppet wrote: > stop the wrongful slander of goto! Man, what a muppet this guy is... Look, goto's are just bad, mmmkay? -- Chris Devers [EMAIL PROTECTED] channeling <http://www.askoxford.com/pressroom/archive/odelaunch/>

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-03 Thread muppet
Nicholas Clark said: > Parrot has much cleaner source than Perl 5. However, to maintain the > balance of good and evil^Wgoto, Perl 6 will compile down to parrot > bytecode, which quite definitely does have gotos. So even the nicest, > most clean award winning code from the purest best intentioned

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-03 Thread Andy Wardley
Paul Johnson wrote: > I think I wrote my first ever goto code in C yesterday. Way back when I was a teen-geek, I played around writing a few games, mostly in C, with the odd bit of assembler thrown in for bad taste. One of these was a rip-off of the classic Tron light-cycle game. I got myself i

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-03 Thread Dominic Mitchell
Rafael Garcia-Suarez <[EMAIL PROTECTED]> wrote: > In bleadperl : > $ perl -lne 'print if /\bgoto\b/' *.[ch] | wc -l > 605 > > This is a rough metric, there are probably less actual gotos than this > (because of comments and because "goto" is a perl keyword -- not > forgetting the yacc-generate

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-03 Thread Sam Vilain
On Wed, 03 Sep 2003 11:17, Rafael Garcia-Suarez wrote; > However most of gotos appear to be in the tokenizer and in the > regular expression engine. Those are based on state machines, and > IMHO gotos are legitimate in state machines. Right, and we all know that every program can be conside

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-03 Thread Simon Wistow
On Wed, Sep 03, 2003 at 11:13:35AM +0100, Nicholas Clark said: > At the risk of going off topic, the Perl 5 source isn't exactly pleasant. > And contains gotos. IIRC I added 2 between 5.6.0 and 5.8.0, but the > alternative was a big mess of if()s and braces. C doesn't have all the > nice loop label

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-03 Thread Paul Johnson
Nicholas Clark said: > At the risk of going off topic, the Perl 5 source isn't exactly pleasant. > And contains gotos. IIRC I added 2 between 5.6.0 and 5.8.0, but the > alternative was a big mess of if()s and braces. C doesn't have all the > nice loop labelling features of a certain other languag

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-03 Thread Rafael Garcia-Suarez
Nicholas Clark wrote: > > At the risk of going off topic, the Perl 5 source isn't exactly pleasant. > And contains gotos. IIRC I added 2 between 5.6.0 and 5.8.0, but the > alternative was a big mess of if()s and braces. C doesn't have all the > nice loop labelling features of a certain other langu

Re: Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-03 Thread Nicholas Clark
On Wed, Sep 03, 2003 at 09:53:21AM +, Dominic Mitchell wrote: > Yuck. I didn't actually look at it, just let the ports compile it for > me. I didn't inspect it too far, but it seems that the current source is safe to look at. It seems to have benefited from a complete re-write > When it come

Re: DOS/WIN archivers of the mid 1990s (was Re: gzipping your websites WINRAR 40 days trial)

2003-09-03 Thread Roger Burton West
On Wed, Sep 03, 2003 at 09:49:28AM +, Dominic Mitchell wrote: >You can get unrar as source code. I posted the link yesterday. Yes, but not the compressor. Ditto for ACE and ARJ. So there's no way to originate a RAR file under Linux without using binary-only software, and any other Unix will

Bad C Source (Re: gzipping your websites WINRAR 40 days trial)

2003-09-03 Thread Dominic Mitchell
Nicholas Clark <[EMAIL PROTECTED]> wrote: > On Tue, Sep 02, 2003 at 07:16:40AM +, Dominic Mitchell wrote: >> For the benefit of people likely to come up against Yet Another >> Compression Format, though: >> >> http://files10.rarlab.com/rar/unrarsrc-3.2.3.tar.gz > > The code in there is a

Re: DOS/WIN archivers of the mid 1990s (was Re: gzipping your websites WINRAR 40 days trial)

2003-09-03 Thread Dominic Mitchell
Roger Burton West <[EMAIL PROTECTED]> wrote: > On Unix, RAR and ACE are only available as binaries, which puts off a > lot of people; and neither those nor ZIP preserves file ownership or > permission information. So while I'm able to extract most files under > Unix, I wouldn't choose those formats

Re: DOS/WIN archivers of the mid 1990s (was Re: gzipping your websites WINRAR 40 days trial)

2003-09-03 Thread Roger Burton West
On Tue, Sep 02, 2003 at 10:05:02PM +0100, Barbie [home] wrote: >On 02 September 2003 09:43 Roger Burton West wrote: >> (All of this >> only applies to the Windows world, obviously; I think the parallels in >> Unix, or at least Linux, would be .tar.bz2, .tar.gz, and dodgy >> commercial software with

Re: gzipping your websites WINRAR 40 days trial

2003-09-02 Thread Nicholas Clark
On Tue, Sep 02, 2003 at 07:16:40AM +, Dominic Mitchell wrote: > For the benefit of people likely to come up against Yet Another > Compression Format, though: > > http://files10.rarlab.com/rar/unrarsrc-3.2.3.tar.gz The code in there is a lot cleaner than the last time I looked at it. (I pr

RE: DOS/WIN archivers of the mid 1990s (was Re: gzipping your websites WINRAR 40 days trial)

2003-09-02 Thread Barbie [home]
On 02 September 2003 09:43 Roger Burton West wrote: > On Tue, Sep 02, 2003 at 09:24:11AM +0200, Philip Newton wrote: > > [re RAR] > >> I'm told it's fairly popular in (some?) Usenet binary newsgroups as a >> standard way of distributing warez and moviez. > > ACE is another format that I understand

Re: gzipping your websites

2003-09-02 Thread Ben
On Mon, Sep 01, 2003 at 12:40:53PM +0100, Simon Wistow wrote: > > Now, gzipping your outgoing webpages is a good thing cos it cuts down on > transmission time and reduces your bandwidth costs. > > Unfortunately it's also fairly processor intensive. > > So I'm looking for an external solution and

Re: DOS/WIN archivers of the mid 1990s (was Re: gzipping your websites WINRAR 40 days trial)

2003-09-02 Thread Dominic Mitchell
Roger Burton West <[EMAIL PROTECTED]> wrote: > In my experience, people who really care about compressed file size and > are moderately technically savvy tend to use RAR or ACE; people who > want their files to be readable by everybody use ZIP; people who are > catering for virus-prone fools use

DOS/WIN archivers of the mid 1990s (was Re: gzipping your websites WINRAR 40 days trial)

2003-09-02 Thread Roger Burton West
On Tue, Sep 02, 2003 at 09:24:11AM +0200, Philip Newton wrote: [re RAR] >I'm told it's fairly popular in (some?) Usenet binary newsgroups as a >standard way of distributing warez and moviez. ACE is another format that I understand is used in that context. >From what I gather, it supports mult

Re: gzipping your websites WINRAR 40 days trial

2003-09-02 Thread Jason Clifford
On Tue, 2 Sep 2003, the hatter wrote: > > > It's certainly not what I'd call anywhere close to being "standard" or > > > "universal". > > > > I'm told it's fairly popular in (some?) Usenet binary newsgroups as a > > standard way of distributing warez and moviez. > > Certainly a majority of warez

Re: compression (was: gzipping your websites)

2003-09-02 Thread Tom Hukins
On Tue, Sep 02, 2003 at 01:13:56AM +0100, Sam Vilain wrote: > On Mon, 01 Sep 2003 21:11, Tom Hukins wrote; > > > Also, I suspect the case for bzip2 becomes stronger in the future. > > Assume Moore's Law continues to hold for CPU speed increase. Disk > > and > > This argument is irrelevant

Re: gzipping your websites WINRAR 40 days trial

2003-09-02 Thread the hatter
On Tue, 2 Sep 2003, Philip Newton wrote: > On 2 Sep 2003 at 7:16, Dominic Mitchell wrote: > > > It's certainly not what I'd call anywhere close to being "standard" or > > "universal". > > I'm told it's fairly popular in (some?) Usenet binary newsgroups as a > standard way of distributing warez and

Re: gzipping your websites WINRAR 40 days trial

2003-09-02 Thread Philip Newton
On 2 Sep 2003 at 7:16, Dominic Mitchell wrote: > It's certainly not what I'd call anywhere close to being "standard" or > "universal". I'm told it's fairly popular in (some?) Usenet binary newsgroups as a standard way of distributing warez and moviez. From what I gather, it supports multi-vol

Re: gzipping your websites

2003-09-02 Thread Dominic Mitchell
Philip Newton <[EMAIL PROTECTED]> wrote: > On 1 Sep 2003 at 21:58, Tim Sweetman wrote: > >> Does all this negotiation run into hot water with legacy p(r)oxy caches? > > I believe someone mentioned that they couldn't get their cache to cache > the contents if they sent the proper HTTP header ("Va

Re: gzipping your websites

2003-09-02 Thread Philip Newton
On 1 Sep 2003 at 21:58, Tim Sweetman wrote: > Does all this negotiation run into hot water with legacy p(r)oxy caches? I believe someone mentioned that they couldn't get their cache to cache the contents if they sent the proper HTTP header ("Vary: encoding", I believe, meaning "Hey, proxy, the

Re: gzipping your websites WINRAR 40 days trial

2003-09-02 Thread Dominic Mitchell
[EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote: > I used to be annoyed when someone zipped and then rared. Winzip cannot even > handle this yet. Nowadays I can just say that RAR is more universal than > Zip. That seems unlikely at best. I'd never even heard of winrar until somebody at work pointed it

Re: compression (was: gzipping your websites)

2003-09-02 Thread Sam Vilain
On Mon, 01 Sep 2003 21:11, Tom Hukins wrote; > Also, I suspect the case for bzip2 becomes stronger in the future. > Assume Moore's Law continues to hold for CPU speed increase. Disk > and This argument is irrelevant in general, because the size of files to be compressed in general also inc

Re: compression (was: gzipping your websites)

2003-09-02 Thread Philip Newton
On 1 Sep 2003 at 20:52, Paul Mison wrote: > However, here's some rampantly unfair tests on a text file: [snip] > [EMAIL PROTECTED]:~$ ls -l File-Type.html* > -rw-r--r--1 blech 3631 Sep 1 20:51 File-Type.html Yes, that sounds rather unfair to bzip2. From what (little) I know, it doesn't r

Re: compression (was: gzipping your websites)

2003-09-01 Thread muppet
On Monday, September 1, 2003, at 03:52 PM, Paul Mison wrote: [EMAIL PROTECTED]:~$ ls -l File-Type.html* -rw-r--r--1 blech 3631 Sep 1 20:51 File-Type.html -rw-r--r--1 blech 1370 Sep 1 20:49 File-Type.html.gz -rw-r--r--1 blech 1449 Sep 1 20:49 File-Type.html.bz2 I maintain that D

Re: gzipping your websites

2003-09-01 Thread Tim Sweetman
Philip Newton wrote: On 1 Sep 2003 at 12:40, Simon Wistow wrote: Now, gzipping your outgoing webpages is a good thing cos it cuts down on transmission time and reduces your bandwidth costs. Unfortunately it's also fairly processor intensive. Which is why I heard that it's better to comp

Re: compression (was: gzipping your websites)

2003-09-01 Thread Tom Hukins
On Mon, Sep 01, 2003 at 07:13:47PM +0100, Paul Mison wrote: > bzip2 is much, much slower than gzip, but doesn't provide > significantly smaller files I've heard this argument before. I agree with the point, but bzip2 works well for some current problems: we've already heard about compressing We

Re: compression (was: gzipping your websites)

2003-09-01 Thread Chris Devers
On Mon, 1 Sep 2003, Paul Mison wrote: > I maintain that David is being pointlessly discriminatory in his > approach, and that bzip2 has a bogus level of popularity that its > merits don't justify. And I maintain that you're ignoring the possibility that in certain contexts, bzip can be the approp

Re: compression (was: gzipping your websites)

2003-09-01 Thread Paul Mison
On 01/09/2003 at 20:26 +0100, Roger Burton West wrote: How often do you serve a MySQL file that's 2.5GB uncompressed? Is this typical of most web applications? If not, why bring in that particular benchmark in the first place? It was a page I'd seen recently and I thought that the information it

Re: compression (was: gzipping your websites)

2003-09-01 Thread Chris Devers
On Mon, 1 Sep 2003, Paul Mison wrote: > On 01/09/2003 at 14:42 -0400, Chris Devers wrote: > > >But at my last job, when compressing daily server logs, bzip was able to > >produce compressed files half to quarter the size of what gzip could do > >with the same log files. Consistently, over the cour

Re: compression (was: gzipping your websites)

2003-09-01 Thread Roger Burton West
On Mon, Sep 01, 2003 at 08:20:32PM +0100, Paul Mison wrote: >How often do you *serve* log files? Wasn't the discussion up until >now about gzip for web server downloads? How often do you serve a MySQL file that's 2.5GB uncompressed? Is this typical of most web applications? If not, why bring in

Re: compression (was: gzipping your websites)

2003-09-01 Thread Paul Mison
On 01/09/2003 at 14:42 -0400, Chris Devers wrote: But at my last job, when compressing daily server logs, bzip was able to produce compressed files half to quarter the size of what gzip could do with the same log files. Consistently, over the course of months. How often do you *serve* log files? W

Re: gzipping your websites

2003-09-01 Thread David Cantrell
Chris Devers wrote: [a lot, including ...] > So... no, bzip file support is unlikely unless they're savvy enough to have installed Cygwin, PowerArchiver, or similar tools. Which most Windows users probably haven't & probably never will. Their loss. Yes, yes it is. I'm not going to use worse compr

Re: compression (was: gzipping your websites)

2003-09-01 Thread Chris Devers
On Mon, 1 Sep 2003, Paul Mison wrote: > On 01/09/2003 at 12:35 -0400, Chris Devers wrote: > > Unfortunately WinZip does not unzip bzip2 [trolltech.com] > > bzip2 is much, much slower than gzip, but doesn't provide significantly > smaller files, at least according to some benchmarks on Jere

Re: compression (was: gzipping your websites)

2003-09-01 Thread Roger Burton West
On Mon, Sep 01, 2003 at 07:13:47PM +0100, Paul Mison wrote: >bzip2 is much, much slower than gzip, but doesn't provide >significantly smaller files, at least according to some benchmarks on >Jeremy Zawodny's site: > >http://jeremy.zawodny.com/blog/archives/000953.html Well, golly, on one partic

Re: compression (was: gzipping your websites)

2003-09-01 Thread Paul Mison
On 01/09/2003 at 12:35 -0400, Chris Devers wrote: Unfortunately WinZip does not unzip bzip2 [trolltech.com] bzip2 is much, much slower than gzip, but doesn't provide significantly smaller files, at least according to some benchmarks on Jeremy Zawodny's site: http://jeremy.zawodny.com/blo

Re: gzipping your websites WINRAR 40 days trial

2003-09-01 Thread [EMAIL PROTECTED]
] Subject: Re: gzipping your websites On Mon, 1 Sep 2003, David Cantrell wrote: > Oh I don't care if the browser doesn't support it, just whether there is > common Doze software to uncompress it later like there is for tarballs. Well, everyone using Windows is likely to be using

Re: gzipping your websites

2003-09-01 Thread Chris Devers
On Mon, 1 Sep 2003, David Cantrell wrote: > Oh I don't care if the browser doesn't support it, just whether there is > common Doze software to uncompress it later like there is for tarballs. Well, everyone using Windows is likely to be using either a copy of WinZip (unpaid for, of course) or if t

Re: gzipping your websites

2003-09-01 Thread Philip Newton
On 1 Sep 2003 at 16:33, David Cantrell wrote: [bzip2] > Oh I don't care if the browser doesn't support it, just whether there is > common Doze software to uncompress it later like there is for tarballs. There are bzip2 binaries available for Doze but I highly doubt they're commonly installed. .t

Re: gzipping your websites

2003-09-01 Thread Nigel Hamilton
> >Has anyone managed to get content negotiation to work for pre-compressed > >static content? ... this is the ideal scenario speed-wise. > > Er... I may be missing something, but isn't this as easy as checking > the "Accept-Encoding" input header, and if properly set, redirect > internally to t

Re: gzipping your websites

2003-09-01 Thread Nicholas Clark
On Mon, Sep 01, 2003 at 04:23:51PM +0200, Elizabeth Mattijsen wrote: > At 15:15 +0100 9/1/03, David Cantrell wrote: > >stuck in the Microsoftian Dark Ages can. In fact, can they easily deal > >with gzips? > > M$ browsers have been able to deal with gzipped content for a long > time already...

Re: gzipping your websites

2003-09-01 Thread Philip Newton
On 1 Sep 2003 at 10:31, Nigel Hamilton wrote: > I tried using Apache MultiViews to serve pre-compressed static content, > but to no avail. The idea was Apache would map a request for index.html to > index.html.gz and serve the precompressed file instead. In the same way > that language encoded

Re: gzipping your websites

2003-09-01 Thread Dominic Mitchell
Nigel Hamilton <[EMAIL PROTECTED]> wrote: > I tried using Apache MultiViews to serve pre-compressed static content, > but to no avail. The idea was Apache would map a request for index.html to > index.html.gz and serve the precompressed file instead. In the same way > that language encoded file

Re: gzipping your websites

2003-09-01 Thread Elizabeth Mattijsen
At 10:31 -0500 9/1/03, Nigel Hamilton wrote: > Philip Newton <[EMAIL PROTECTED]> wrote: > Static Content is a Good Thing[tm]. You should try to have as much of it as possible. Then you can let Apache do the hard work of implementing > HTTP correctly (caching, byte ranges, etc). Has anyone mana

Re: gzipping your websites

2003-09-01 Thread David Cantrell
On Mon, Sep 01, 2003 at 03:26:36PM +0100, Jason Clifford wrote: > On Mon, 1 Sep 2003, David Cantrell wrote: > > I don't do the uncompression on the fly, but do people think it's > > reasonable for me to have bzipped files on my site? Obviously, people > > using sane systems can deal with them, but

Re: gzipping your websites

2003-09-01 Thread Robin Berjon
Philip Newton wrote: I don't think that is the case with bzip though. Ditto. I haven't heard of a browser that can handle bzip2 (though I imagine Mozilla can do it if built with -DKITCHEN_SINK), and most windows users will not have bunzip2 on their systems. Konqueror has been able to handle bzip

Re: gzipping your websites

2003-09-01 Thread Nigel Hamilton
> Philip Newton <[EMAIL PROTECTED]> wrote: > > Of course, this only really works well for static content. > > Static Content is a Good Thing[tm]. You should try to have as much of > it as possible. Then you can let Apache do the hard work of implementing > HTTP correctly (caching, byte ranges, et

Re: gzipping your websites

2003-09-01 Thread Philip Newton
On 1 Sep 2003 at 15:26, Jason Clifford wrote: > On Mon, 1 Sep 2003, David Cantrell wrote: > > > I don't do the uncompression on the fly, but do people think it's > > reasonable for me to have bzipped files on my site? Obviously, people > > using sane systems can deal with them, but I have no ide

Re: gzipping your websites

2003-09-01 Thread Dominic Mitchell
Philip Newton <[EMAIL PROTECTED]> wrote: > Of course, this only really works well for static content. Static Content is a Good Thing[tm]. You should try to have as much of it as possible. Then you can let Apache do the hard work of implementing HTTP correctly (caching, byte ranges, etc). -Dom -

Re: gzipping your websites

2003-09-01 Thread Jason Clifford
On Mon, 1 Sep 2003, David Cantrell wrote: > I don't do the uncompression on the fly, but do people think it's > reasonable for me to have bzipped files on my site? Obviously, people > using sane systems can deal with them, but I have no idea whether those > stuck in the Microsoftian Dark Ages can

Re: gzipping your websites

2003-09-01 Thread Elizabeth Mattijsen
At 15:15 +0100 9/1/03, David Cantrell wrote: On Mon, Sep 01, 2003 at 03:56:12PM +0200, Philip Newton wrote: > Which is why I heard that it's better to compress your content ahead of time and then *uncompress* it on the fly if the client says it can't > read gzip content-transfer-encoding(?). I d
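The "uncompress it on the fly" half of the quoted suggestion comes down to streaming a pre-compressed file back out through zlib, the library behind gzip. A minimal sketch, sticking with the thread's C: the file name and buffer size are arbitrary, and a real server would do this per request rather than in main().

    #include <stdio.h>
    #include <stdlib.h>
    #include <zlib.h>

    /* Decompress a pre-gzipped file to stdout on the fly.
     * Compile with -lz. */
    int main(void)
    {
        gzFile in = gzopen("index.html.gz", "rb");
        if (in == NULL) {
            fprintf(stderr, "gzopen failed\n");
            return EXIT_FAILURE;
        }

        char buf[8192];
        int n;
        while ((n = gzread(in, buf, sizeof buf)) > 0)
            fwrite(buf, 1, (size_t)n, stdout);

        if (n < 0)
            fprintf(stderr, "gzread failed\n");

        gzclose(in);
        return n < 0 ? EXIT_FAILURE : EXIT_SUCCESS;
    }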

Re: gzipping your websites

2003-09-01 Thread David Cantrell
On Mon, Sep 01, 2003 at 03:56:12PM +0200, Philip Newton wrote: > Which is why I heard that it's better to compress your content ahead of > time and then *uncompress* it on the fly if the client says it can't > read gzip content-transfer-encoding(?). I don't do the uncompression on the fly, but

Re: gzipping your websites

2003-09-01 Thread Philip Newton
On 1 Sep 2003 at 12:40, Simon Wistow wrote: > Now, gzipping your outgoing webpages is a good thing cos it cuts down on > transmission time and reduces your bandwidth costs. > > Unfortunately it's also fairly processor intensive. Which is why I heard that it's better to compress your content ahe

Re: gzipping your websites

2003-09-01 Thread Dominic Mitchell
Robin Berjon <[EMAIL PROTECTED]> wrote: > Simon Wistow wrote: >> Apache or Squid as reverse proxy with mod_gzip >> -- >> http://www.apache.org >> http://www.squid-cache.org/ >> Pros: (F|f)ree! >> Cons: More developer time needed? > > In my experience th

Re: gzipping your websites

2003-09-01 Thread Robin Berjon
Simon Wistow wrote: Apache or Squid as reverse proxy with mod_gzip -- http://www.apache.org http://www.squid-cache.org/ Pros: (F|f)ree! Cons: More developer time needed? In my experience this works well, and I'd recommend it. To gain time I also recommen

Re: gzipping your websites

2003-09-01 Thread Roger Burton West
On Mon, Sep 01, 2003 at 12:40:53PM +0100, Simon Wistow wrote: >Now, gzipping your outgoing webpages is a good thing cos it cuts down on >transmission time and reduces your bandwidth costs. >Apache or Squid as reverse proxy with mod_gzip >-- >http://www.

Re: gzipping your websites

2003-09-01 Thread Simon Wistow
On Mon, Sep 01, 2003 at 12:40:53PM +0100, Simon Wistow said: > Has anybody else heard of any other solutions and or had any experience > with any of these? Particularly how well Apache/Squid performs. Hmm, > actually, looking at it Squid doesn't seem to be able to do transparent > gzipping. I

gzipping your websites

2003-09-01 Thread Simon Wistow
Now, gzipping your outgoing webpages is a good thing cos it cuts down on transmission time and reduces your bandwidth costs. Unfortunately it's also fairly processor intensive. So I'm looking for an external solution and wondered if people had had any experience. The solutions I've looked at s