Your message dated Fri, 14 Aug 2015 22:01:09 +0200
with message-id <20150814200109.GA21965@crossbow>
and subject line Re: apt: parallel downloads from the same host
has caused the Debian Bug report #158486,
regarding apt-get must use faster download algorithms or aria2 as a backend downloader
to be marked as done.

This means that you claim that the problem has been dealt with.
If this is not the case, it is now your responsibility to reopen the
bug report if necessary, and/or fix the problem forthwith.

(NB: If you are a system administrator and have no idea what this
message is talking about, this may indicate a serious mail system
misconfiguration somewhere. Please contact ow...@bugs.debian.org
immediately.)


-- 
158486: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=158486
Debian Bug Tracking System
Contact ow...@bugs.debian.org with problems
--- Begin Message ---
Package: apt
Version: 0.7.14

apt-get currently appears to use a simple wget-style sequential
download algorithm for all of its functions that download anything
from the net, such as update, install, dist-upgrade, upgrade,
build-dep, source, etc. This makes poor use of the broadband
bandwidth available to most users. As a result, I am forced to use
home-cooked scripts to extract the URLs to download and then pass
them to aria2 (GPL) so that the job is done faster. If this were
built in, everyone could benefit.
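
For illustration, a minimal sketch of such a home-cooked script,
assuming apt-get's --print-uris output format (one quoted URI per
line) and aria2's -i/-d options; the temporary file and cache path
are just examples:

  # Ask apt-get which URIs it would fetch, without downloading anything.
  # Each URI line looks like: 'http://...' filename size MD5Sum:...
  apt-get -y --print-uris upgrade \
    | grep -o "^'[^']*'" | tr -d "'" > /tmp/apt-uris.txt

  # Hand the list to aria2c, downloading straight into apt's cache
  # (-i reads URIs from a file, -d sets the download directory):
  aria2c -i /tmp/apt-uris.txt -d /var/cache/apt/archives

A subsequent apt-get upgrade then finds the packages already in the
cache and skips the download step.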

Granted, the user may not want apt-get to hog their entire
bandwidth, but at least provide this as an option: use a faster
split-download algorithm, or at least use a good backend downloader
like aria2, which provides high-speed downloading.
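
For reference, aria2 already exposes such split downloading directly
from the command line; a small example (the URL is illustrative, and
the flag values are arbitrary):

  # Fetch a single file over 4 parallel connections to one server:
  # -x caps connections per server, -s sets the number of splits.
  aria2c -x 4 -s 4 http://ftp.debian.org/debian/pool/main/a/apt/apt_0.7.14.tar.gz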

I am a downstream (Ubuntu) user of Debian apt; the downstream
people redirected me here from
http://bugs.launchpad.net/bugs/275994

Shriramana Sharma.



--- End Message ---
--- Begin Message ---
Hi

On Tue, Aug 27, 2002 at 03:57:46PM +0200, Marek Michalkiewicz wrote:
> For some situations, it would be nice if apt-get had support for
> downloading more than one file in parallel from the same server.
> As it is now, it can only download files from different servers
> at the same time.  Doing many parallel downloads is not nice to the
> network and so should not be the default, but may be useful in some
> situations, like load balancing between 2 links with different IPs
> ("ip route add default equalize nexthop dev ppp0 nexthop dev ppp1")
> where each single TCP connection can only use half of the total bandwidth.

This was never implemented, as it is indeed not nice to mirrors. The
advent of various scripts doing it anyway just shows how this would
have been, and still would be, misused to put pressure on the mirrors
for no gain, with even worse results in the long run.

What can serve as a good quasi-replacement instead are redirector
services like httpredir.debian.org, which will use up all your
bandwidth while being nice to mirrors (and in fact distribute the
load better, as even new and unknown mirrors are used automatically).
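
For the curious, switching to the redirector is a one-line change in
sources.list; the suite and components below are only an example:

  # /etc/apt/sources.list -- let the redirector pick a nearby mirror:
  deb http://httpredir.debian.org/debian stable main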

Hence closing as "superseded by a better idea with a working
implementation".


Best regards

David Kalnischkies

--- End Message ---
