+1 to François's comments.

> You're not saying that gzipping and wise pre-fetching and parallel download
> of scripts don't improve page load times. Or are you?
>

- We already have transfer-encoding in HTTP, and yes, you should definitely
use it! (quick sketch below)
- Prefetching is also an important optimization, but in the context of this
discussion (bundling), it's an orthogonal concern.
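
To make the first point concrete, here's a minimal Node.js sketch. In
practice this is usually a one-line server config rather than hand-rolled
code, the file name and port are made up, and on the wire gzip is most
commonly negotiated as Content-Encoding rather than Transfer-Encoding:

    var http = require('http');
    var fs   = require('fs');
    var zlib = require('zlib');

    http.createServer(function (req, res) {
      // a real server would check req.headers['accept-encoding'] first
      res.writeHead(200, {
        'Content-Type': 'application/javascript',
        'Content-Encoding': 'gzip'
      });
      // stream the (hypothetical) file through gzip; the browser
      // decompresses transparently on the other end
      fs.createReadStream('app.js').pipe(zlib.createGzip()).pipe(res);
    }).listen(8080);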


> In the equation you paint above something important is missing: the fact
> that there's a round-trip delay per request (even with http2.0), and that
> the only way to avoid it is to bundle things, as in .zip bundling, to
> minimize the (number of requests and thus the) impact of latencies.
>

With HTTP 1.x (and without sharding) you can fetch up to six resources in
parallel per origin. With HTTP 2.0, you can fetch as many resources as you
wish in parallel. The only reason bundling exists as an "optimization" is to
work around the limit of six parallel requests. The moment you remove that
limitation, bundling is unnecessary and only hurts performance.
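
As a sketch of what that buys you, assuming a promise-returning
fetch(url) helper is available (the module file names are hypothetical):

    var urls = ['a.js', 'b.js', 'c.js', 'd.js'];  // hypothetical modules

    Promise.all(urls.map(function (url) {
      return fetch(url).then(function (res) { return res.text(); });
    })).then(function (sources) {
      // over HTTP 2.0 all four responses arrive in parallel on a single
      // connection, and each one streams, caches, and is prioritized
      // independently -- no bundle required
      console.log(sources.length + ' modules fetched');
    });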


> And there's something else I think .zip bundling can provide that http2.0
> can't: the guarantee that a set of files are cached by the time your script
> runs: with such a guarantee you could do synchronous module require()s, à
> la node.js.
>

This is completely orthogonal: if you need to express dependencies between
multiple resources, use a loader script or, better, look into the upcoming
Promise APIs (sketch below). As I mentioned previously, bundling breaks
streaming / incremental execution / prioritization.
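
A minimal sketch of that loader approach with promises -- the helper name
and file names are made up, but the ordering guarantee it provides is the
same one you'd get from a bundle, without giving up independent caching:

    // hypothetical helper: resolves once the script has executed
    function loadScript(src) {
      return new Promise(function (resolve, reject) {
        var s = document.createElement('script');
        s.src = src;
        s.onload = function () { resolve(src); };
        s.onerror = function () { reject(new Error('failed: ' + src)); };
        document.head.appendChild(s);
      });
    }

    // "utils.js" must run before "app.js": the dependency lives in the
    // loader, not in a bundle, so each file still streams and caches
    // on its own
    loadScript('utils.js').then(function () {
      return loadScript('app.js');
    });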

ig