You make a good point that compressing on every request puts extra load on
the server (even if only a little, because gzip is quite fast). But this can
be solved by caching the resulting combined JS file, whether with a file
cache (my solution) or memcached (which should be even more efficient). I
have measured significant speed improvements from caching and gzipping, and
whenever I change even a single byte I just refresh the cache. Besides, when
you pack the code instead, you make the client waste a lot of CPU on every
page load. That wasted CPU becomes non-trivial on older computers and
browsers, and certainly once your JS grows to several hundred KiB (like
jQuery UI).
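To illustrate the file-cache idea, here is a minimal sketch (in Python, just
for illustration; the function name, paths, and mtime-based invalidation are
my assumptions, not anyone's actual code): rebuild the combined, gzipped
bundle only when a source file is newer than the cached copy, otherwise
serve the cached bytes untouched.

```python
import gzip
import os

def cached_gzip_bundle(sources, cache_path):
    """Return gzipped JS, rebuilding the cache only when a source changed."""
    if os.path.exists(cache_path):
        cache_mtime = os.path.getmtime(cache_path)
        # cache hit: every source is older than the cached bundle,
        # so we skip both concatenation and gzip entirely
        if all(os.path.getmtime(s) <= cache_mtime for s in sources):
            with open(cache_path, "rb") as f:
                return f.read()
    # cache miss: concatenate and compress once, then store the result
    combined = b"\n".join(open(s, "rb").read() for s in sources)
    data = gzip.compress(combined)
    with open(cache_path, "wb") as f:
        f.write(data)
    return data
```

The same pattern works with memcached instead of a file: store the gzipped
bytes under a key derived from the source mtimes and serve from memory.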

When you are done developing and prepare a release, you can also minify the
JS first, then let the script concatenate the files and gzip them as before.
This improves the compression without any performance hit on the client side
(minification plus gzip compresses better than any packer).
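A rough sketch of that release step (again in Python, and hedged: the
trivial comment/blank-line strip below is only a stand-in for a real
minifier such as YUI Compressor, which does far more):

```python
import gzip

def build_release(sources, out_path):
    """Concatenate, crudely minify, and gzip once at release time."""
    combined = "\n".join(open(s, encoding="utf-8").read() for s in sources)
    # stand-in "minify": drop full-line // comments and blank lines;
    # a real minifier also shortens identifiers, strips whitespace, etc.
    minified = "\n".join(
        line for line in combined.splitlines()
        if line.strip() and not line.strip().startswith("//")
    )
    with open(out_path, "wb") as f:
        f.write(gzip.compress(minified.encode("utf-8"), compresslevel=9))
```

Because this runs once per release rather than per request, you can afford
the slowest, highest-ratio settings (compresslevel=9) with no runtime cost.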

And one last note: minifying or packing the JS is not an option when you
need to debug that JS code. :-)

Regards,
THD



On Fri, Jul 3, 2009 at 4:22 PM, Rebecca Murphey <rmurp...@gmail.com> wrote:

> Having PHP do the gzipping is inefficient, IMO. The PHP has to run
> every time someone requests the file anew; if you compress the files
> as part of the release process, then they are static files that the
> server simply has to serve.
>

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"jQuery Development" group.
-~----------~----~----~----~------~----~------~--~---