Bug#502685: wget: Option to retry if download rate drops below a given limit for a given time

2009-07-22 Thread Josh Triplett
On Wed, Jul 22, 2009 at 07:57:08PM +0200, Noèl Köthe wrote:
> On Wednesday, 2009-07-22 at 15:58 +0200, Noèl Köthe wrote:
> > > Some sites or networks fail in ways where a connection drops to a
> > > trickle (a few hundred or thousand bytes per second) but does not
> > > actually die; this can happen, for instance, if few or no network
> > > packets get through but no TCP disconnect occurs.  Killing wget and
> > > restarting it (always using -c) fixes the problem, but requires
> > > manually babysitting the download or writing a hackish script to do
> > > so.  It would help to have a wget option which monitors the download
> > > rate and treats the connection as failed if the rate drops below a
> > > given threshold for a given time (for instance, under 10Kbps for more
> > > than 5 seconds).
> > 
> > I forwarded this feature request, too.
> 
> Upstream closed the request with "wontfix".
> See https://savannah.gnu.org/bugs/index.php?27077

I've replied upstream with further discussion of the feature.

- Josh Triplett






Bug#502685: wget: Option to retry if download rate drops below a given limit for a given time

2009-07-22 Thread Noèl Köthe
tags 502685 + wontfix
thanks

Hello Josh,

On Wednesday, 2009-07-22 at 15:58 +0200, Noèl Köthe wrote:

> > Some sites or networks fail in ways where a connection drops to a
> > trickle (a few hundred or thousand bytes per second) but does not
> > actually die; this can happen, for instance, if few or no network
> > packets get through but no TCP disconnect occurs.  Killing wget and
> > restarting it (always using -c) fixes the problem, but requires
> > manually babysitting the download or writing a hackish script to do
> > so.  It would help to have a wget option which monitors the download
> > rate and treats the connection as failed if the rate drops below a
> > given threshold for a given time (for instance, under 10Kbps for more
> > than 5 seconds).
> 
> I forwarded this feature request, too.

Upstream closed the request with "wontfix".
See https://savannah.gnu.org/bugs/index.php?27077

-- 
Noèl Köthe 
Debian GNU/Linux, www.debian.org


signature.asc
Description: This is a digitally signed message part


Bug#502685: wget: Option to retry if download rate drops below a given limit for a given time

2009-07-22 Thread Noèl Köthe
forwarded 502685 https://savannah.gnu.org/bugs/index.php?27077
tags 502685 + upstream
thanks

Hello Josh,

On Saturday, 2008-10-18 at 22:52 -0700, Josh Triplett wrote:

> Some sites or networks fail in ways where a connection drops to a
> trickle (a few hundred or thousand bytes per second) but does not
> actually die; this can happen, for instance, if few or no network
> packets get through but no TCP disconnect occurs.  Killing wget and
> restarting it (always using -c) fixes the problem, but requires
> manually babysitting the download or writing a hackish script to do
> so.  It would help to have a wget option which monitors the download
> rate and treats the connection as failed if the rate drops below a
> given threshold for a given time (for instance, under 10Kbps for more
> than 5 seconds).

I forwarded this feature request, too.

-- 
Noèl Köthe 
Debian GNU/Linux, www.debian.org


signature.asc
Description: This is a digitally signed message part


Bug#502685: wget: Option to retry if download rate drops below a given limit for a given time

2008-10-18 Thread Josh Triplett
Package: wget
Version: 1.11.4-2
Severity: wishlist

Some sites or networks fail in ways where a connection drops to a
trickle (a few hundred or thousand bytes per second) but does not
actually die; this can happen, for instance, if few or no network
packets get through but no TCP disconnect occurs.  Killing wget and
restarting it (always using -c) fixes the problem, but requires
manually babysitting the download or writing a hackish script to do
so.  It would help to have a wget option which monitors the download
rate and treats the connection as failed if the rate drops below a
given threshold for a given time (for instance, under 10Kbps for more
than 5 seconds).
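
For illustration only (not part of the original report or of wget itself),
the "hackish script" workaround mentioned above might look roughly like the
Python sketch below: it runs "wget -c" in a loop, samples how fast the output
file grows, and kills and restarts wget once the rate stays under an assumed
threshold for an assumed stall window. The threshold, the stall window, and
the requirement to pass the local output filename are all assumptions of the
sketch.

#!/usr/bin/env python3
# Hypothetical workaround sketch (not part of wget): restart "wget -c"
# whenever the transfer rate stays below a threshold for too long.
# THRESHOLD_BPS, STALL_SECONDS and POLL_INTERVAL are assumed values.
import os
import subprocess
import sys
import time

THRESHOLD_BPS = 10 * 1024   # assumed: treat under ~10 KB/s as a stall
STALL_SECONDS = 5           # assumed: ...sustained for 5 seconds
POLL_INTERVAL = 1.0         # how often to sample the output file size

def file_size(path):
    # Size of the partially downloaded file, 0 if it does not exist yet.
    return os.path.getsize(path) if os.path.exists(path) else 0

def download(url, outfile):
    # outfile must match the name wget saves to (normally the last
    # component of the URL path).
    while True:
        proc = subprocess.Popen(["wget", "-c", url])
        last_size = file_size(outfile)
        stalled_since = None
        while proc.poll() is None:
            time.sleep(POLL_INTERVAL)
            size = file_size(outfile)
            rate = (size - last_size) / POLL_INTERVAL
            last_size = size
            if rate < THRESHOLD_BPS:
                if stalled_since is None:
                    stalled_since = time.monotonic()
                if time.monotonic() - stalled_since >= STALL_SECONDS:
                    # Rate has been below the threshold long enough:
                    # treat the connection as failed and restart.
                    proc.kill()
                    proc.wait()
                    break
            else:
                stalled_since = None
        else:
            # wget exited on its own; exit status 0 means it finished.
            if proc.returncode == 0:
                return
        time.sleep(1)  # brief pause before resuming with -c

if __name__ == "__main__":
    download(sys.argv[1], sys.argv[2])

Invocation would look something like
"python3 restart-wget.py http://example.org/file.iso file.iso"
(the script name and URL are of course placeholders). Note that in this
sketch the connection-setup time also counts toward the stall window.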

- Josh Triplett

-- System Information:
Debian Release: lenny/sid
  APT prefers unstable
  APT policy: (500, 'unstable'), (1, 'experimental')
Architecture: amd64 (x86_64)

Kernel: Linux 2.6.27-1-amd64 (SMP w/2 CPU cores)
Locale: LANG=en_US.UTF-8, LC_CTYPE=en_US.UTF-8 (charmap=UTF-8)
Shell: /bin/sh linked to /bin/bash

Versions of packages wget depends on:
ii  libc6 2.7-15 GNU C Library: Shared libraries
ii  libssl0.9.8   0.9.8g-13  SSL shared libraries

wget recommends no packages.

wget suggests no packages.

-- no debconf information


