On Mon, Apr 02, 2007 at 12:47:58PM +0200, Francois Petillon wrote:
> some clients are using 16 to 32 KB blocks while downloading ISOs.
> Even without the REST command, some clients keep on trying to do
> segmented downloads.
> I am wondering who designed an FTP client that retries thousands of
> times to (resume a) download.
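
For reference, here is roughly what such a client seems to be doing for
every chunk. This is only a sketch using Python's ftplib; the host, path
and chunk size are made up, and only the REST/RETR usage is meant
literally:

  import ftplib

  HOST, PATH = "ftp.example.org", "/pub/distro.iso"    # hypothetical
  CHUNK = 32 * 1024                                     # 16-32 KB per connection

  class _Done(Exception):
      pass

  def fetch_chunk(offset):
      # Fresh control connection, fresh login, fresh data connection per chunk.
      ftp = ftplib.FTP(HOST)
      ftp.login()                                       # anonymous
      buf = bytearray()

      def grab(block):
          buf.extend(block)
          if len(buf) >= CHUNK:
              raise _Done                               # hang up mid-transfer

      try:
          # REST <offset>, then RETR: resume the file at the chunk's offset.
          ftp.retrbinary("RETR " + PATH, grab, rest=offset)
      except _Done:
          pass                                          # got the chunk, bail out
      ftp.close()
      return bytes(buf[:CHUNK])

  # A 700 MB ISO fetched this way means over 20000 logins and data connections.
  first = fetch_chunk(0)
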
Repeating connections all the time induces extra overhead, since more
bytes are transferred per chunk because of the extra headers and request
data. It also keeps TCP's window growth (slow start) from ever kicking
in: there's no time to grow the window from its initial size towards the
maximum, because each new connection starts over from scratch. To top it
all off, this also makes server admins more likely to want to kill you :)
So I really can't see how this is supposed to accelerate a download.
It's more likely to decelerate it! It's just crazy.
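
To put rough numbers on that (back-of-envelope only; the 500 bytes of
per-chunk protocol chatter, the 1460-byte MSS and the initial window of
4 segments below are assumptions, not measurements):

  # Fetching a 700 MB ISO in 32 KB chunks, one connection per chunk.
  iso = 700 * 1024 * 1024
  chunk = 32 * 1024
  connections = iso // chunk            # ~22400 connections

  # Extra traffic per chunk: TCP handshakes for control + data connections,
  # USER/PASS/TYPE/PASV/REST/RETR and the replies. Call it ~500 bytes.
  overhead = connections * 500
  print(overhead / iso)                 # ~0.015, i.e. ~1.5% wasted bytes

  # The bigger cost: every connection restarts TCP slow start. With a
  # 1460-byte MSS and an initial window of 4 segments, the window doubles
  # each round trip, so a 32 KB chunk is finished after only ~3 RTTs and
  # the connection never gets anywhere near a large window.
  mss, cwnd, sent, rtts = 1460, 4, 0, 0
  while sent < chunk:
      sent += cwnd * mss
      cwnd *= 2
      rtts += 1
  print(rtts)                           # 3
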
Maybe it's a known bug in some popular downloading software. I'm not
logging User-Agents; maybe I should start...
Or maybe they're trying to emulate P2P file sharing methods with
real servers? Maybe they extended existing p2p programs and couldn't
be bothered to simplify their download functions to work normally? :)
--
2. That which causes joy or happiness.