This is also true in the consumer space, and is the reason why ISPs can
save money by taking advantage of statistical multiplexing.  On average, I
personally could be satisfied with a megabit, but it's a real pain to
download gigabyte-class software updates at that speed.

If it takes me literally days to download a game's beta versions, any
comments I might have will be stale by the time they can be heard, and in
the meantime my other uses of the internet have been seriously impaired.
It's much more useful if I can download the same data in an hour, and spend
the remaining time evaluating.  So throughput is indeed a factor in
response time, once the size of the response is sufficiently large.
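A back-of-envelope sketch of that point (the 50 GB figure is my illustrative assumption, not from the post): ideal transfer time is just size over rate, so a gigabyte-class download turns access throughput directly into response time.

```python
def transfer_time_seconds(size_bytes: float, rate_bits_per_s: float) -> float:
    """Ideal transfer time, ignoring protocol overhead and congestion."""
    return size_bytes * 8 / rate_bits_per_s

# A 50 GB game download (assumed size) at two access rates:
size = 50e9
for rate_mbps in (1, 100):
    t = transfer_time_seconds(size, rate_mbps * 1e6)
    print(f"{rate_mbps:>4} Mbit/s: {t / 3600:.1f} hours")
```

At 1 Mbit/s that works out to roughly 111 hours, i.e. several days; at 100 Mbit/s it's about an hour, matching the scenario above.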

Occasionally, of course, practically everyone in the country wants to tune
into coverage of some event at the same time.  More commonly, they simply
get home from work and school at the same time every day.  That breaks the
assumptions behind pure statistical multiplexing, and requires a greater
provisioning factor.
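A minimal Monte Carlo sketch of why correlated demand needs a bigger provisioning factor (all numbers are my assumptions for illustration): with independent users, the capacity covering the 99th percentile of demand is only a little above the mean, but when everyone becomes active together the same percentile balloons.

```python
import random

def peak_demand(n_users: int, p_active: float, rate: float,
                trials: int = 10000, quantile: float = 0.99) -> float:
    """Estimate capacity covering `quantile` of instants, assuming each
    user is independently active with probability p_active at `rate`."""
    samples = sorted(
        sum(rate for _ in range(n_users) if random.random() < p_active)
        for _ in range(trials)
    )
    return samples[int(quantile * trials)]

random.seed(1)
# Off-peak: 1000 users, each busy 5% of the time at 100 Mbit/s.
print(peak_demand(1000, 0.05, 100))   # well under the 100000 aggregate peak
# Evening rush: activity correlates, effectively p = 0.5 for everyone.
print(peak_demand(1000, 0.5, 100))    # roughly tenfold more capacity needed
```

The independent case needs only ~6-7% of the aggregate peak rate; the correlated case needs ~50%, which is the extra provisioning factor in a nutshell.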

- Jonathan Morton
_______________________________________________
Bloat mailing list
Bloat@lists.bufferbloat.net
https://lists.bufferbloat.net/listinfo/bloat