The problem with downloading from multiple servers is that they may not all be synchronized at the same time. Data synchronization is a serious problem. But I guess this approach should work well if we can make sure that the multiple servers carry the same package files.
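As a minimal sketch of that consistency check (the function name and the mirror layout are my own assumptions, not anything apt or synaptic actually does): compare a cryptographic digest of the same package file as fetched from each mirror, and only trust the download if every copy matches.

```python
import hashlib

def mirrors_in_sync(blobs):
    """Return True if every downloaded copy of a package file has the
    same SHA256 digest, i.e. the mirrors agree on that file.
    `blobs` is a list of bytes objects, one per mirror."""
    digests = {hashlib.sha256(b).hexdigest() for b in blobs}
    return len(digests) == 1

# In practice each blob would come from a different mirror, e.g.
# urllib.request.urlopen(mirror + "/pool/main/f/foo/foo_1.0.deb").read()
# (hypothetical path, for illustration only).
```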
On 10/22/07, Nicolas Alvarez <[EMAIL PROTECTED]> wrote:
>
> On 10/21/07, yueyu lin <[EMAIL PROTECTED]> wrote:
> > As you noticed, synaptic sometimes downloads packages slowly. I noticed
> > that apt-get can in fact use multiple threads to download sometimes, but
> > synaptic seldom seems to do this. I want to know why. In fact, it would
> > not be difficult to modify the code to support multiple active download
> > threads, yet it isn't there. I guess the developers think it's impolite
> > to fork a lot of threads to download things from a server, since the
> > server might end up serving only a few *rude* people. If that's true, I
> > will stop any attempt to do so. If it's not, I can have a try at this.
>
> Using multiple connections to a single server is a sure way to "make
> things worse", that is, overloading the server more. What does help is
> making multiple connections to *different* servers. That actually
> lowers load, since your download bandwidth is now dispersed between
> servers, so you're using up less upstream bandwidth from each server.
> I have been doing so to download the .iso's via HTTP (torrents go
> slowly for me).
>
> --
> Nicolas

--
Yueyu Lin
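The multi-mirror idea Nicolas describes could be sketched roughly like this (the mirror URLs, the round-robin scheme, and the injected `fetch` callable are all my own assumptions for illustration, not synaptic's actual design): spread the package list across different servers so each one handles only a fraction of the connections.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_all(packages, mirrors, fetch):
    """Download each package from a different mirror, round-robin, so
    no single server sees more than its share of the connections.
    `fetch(url)` is injected so the sketch can be exercised offline;
    a real caller might pass
    lambda url: urllib.request.urlopen(url).read()."""
    urls = [mirrors[i % len(mirrors)] + "/" + pkg
            for i, pkg in enumerate(packages)]
    # One worker per mirror: parallel across servers, but never more
    # than one connection to any single server at a time.
    with ThreadPoolExecutor(max_workers=len(mirrors)) as pool:
        return list(pool.map(fetch, urls))
```

The key design point is the worker cap: parallelism comes from using *different* servers, which is exactly the "polite" version of multi-threaded downloading discussed above.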
--
Ubuntu-devel-discuss mailing list
Ubuntu-devel-discuss@lists.ubuntu.com
Modify settings or unsubscribe at: https://lists.ubuntu.com/mailman/listinfo/ubuntu-devel-discuss