Bruce Lo wrote:
> 
> I tried out Apache::GzipChain for dynamic mod_perl pages (using Apache::Registry),
> and it was great for reducing the download time (especially over modem).  I am
> seriously thinking about using it for our production environment.  However, some
> people are concerned about it using up too much resource.  Has anyone looked into
> scalability issues?  Would I see significant reduced throughput using GzipChain?

We've been gzipping for a while at eMerchandise.com (though not using
Apache::GzipChain). We addressed the resource concern by having the gzip
pass decide whether to compress or just pass the content through, based
on the server's current CPU load.  When you've got extra cycles, you
shrink the file to improve bandwidth utilization; when you're running
near peak processor utilization, you send the bytes raw.
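The decision logic above can be sketched roughly as follows. This is my
own illustration, not eMerchandise's actual code (which is mod_perl); the
threshold value and function names are made up, and I'm using Python here
just to show the idea of gating compression on the 1-minute load average:

```python
import gzip
import os

# Hypothetical threshold: skip compression when the 1-minute load
# average exceeds this value (tune per machine and core count).
LOAD_THRESHOLD = 4.0

def maybe_gzip(body: bytes, accepts_gzip: bool,
               load_threshold: float = LOAD_THRESHOLD):
    """Return (payload, extra_headers): gzip the body only when the
    client supports it and the server has spare CPU cycles."""
    load_1min = os.getloadavg()[0]  # 1-minute load average (Unix)
    if accepts_gzip and load_1min < load_threshold:
        return gzip.compress(body), {"Content-Encoding": "gzip"}
    return body, {}  # near peak load (or no support): send raw bytes

# Example: a repetitive page body compresses well when the server is idle.
page = b"<html>" + b"hello world " * 500 + b"</html>"
payload, headers = maybe_gzip(page, accepts_gzip=True)
```

The point is that compression cost is paid only when CPU is the cheaper
resource; under load the handler degrades gracefully to raw output.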

We've had no scaling problems. What kind of system load do your
production server(s) see now?  What is it during peak traffic periods?

> Also why don't most sites gzip their pages (do redirect based on browser support)?

Because they're lazy or stupid? :)
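More seriously: no redirect is needed at all, because the browser
advertises gzip support in the Accept-Encoding request header. A minimal
check (my sketch, not from any particular server's code) looks like this:

```python
def client_accepts_gzip(accept_encoding: str) -> bool:
    """True if the Accept-Encoding header lists gzip (or *) with a
    nonzero quality value, e.g. "gzip, deflate" or "gzip;q=0.8"."""
    for token in accept_encoding.lower().split(","):
        parts = token.strip().split(";")
        coding = parts[0].strip()
        if coding in ("gzip", "*"):
            q = 1.0  # default quality when no q= parameter is given
            for param in parts[1:]:
                param = param.strip()
                if param.startswith("q="):
                    try:
                        q = float(param[2:])
                    except ValueError:
                        q = 0.0
            return q > 0
    return False
```

So the server inspects the header and sets Content-Encoding: gzip on the
response when appropriate; the client never sees a different URL.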

-- 
Devin Ben-Hur     | President / CTO  | mailto:[EMAIL PROTECTED]
The eMarket Group | eMerchandise.com | http://www.eMerchandise.com
503/944-5044 x228 | 
"Forrester Research projects that by 2003, Internet start-ups will have
 focused so relentlessly on infrastructure that there will be no 
 remaining actual content on the Web. "  -- Salon.com 14-Apr-2000
