Re: wget bug?!

2002-02-19 Thread TD - Sales International Holland B.V.

On Monday 18 February 2002 17:52, you wrote:

That would be great. The problem is that I'm mostly using it to retrieve
files from servers that have too many users. No, I don't want to hammer
the server, but I do want to keep trying at reasonable intervals until I
get the file.

I think the feature would be usable in other scenarios as well. There are
currently --waitretry and --wait; in my opinion the best approach would
perhaps be to add --waitint(er)(val), or perhaps just --int(er)(val); see
the example below.
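
For example (purely illustrative, since no such option exists yet), an
interval option might be invoked as:

  wget --waitinterval=30 -t 0 ftp://128.121.112.104/pub/1200UBXP/Web.EXE

to make wget wait a fixed 30 seconds before every retry attempt.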

Anyway, thanks for the reply.

Kind regards,

Ferry van Steen

> [The message I'm replying to was sent to <[EMAIL PROTECTED]>. I'm
> continuing the thread on <[EMAIL PROTECTED]> as there is no bug and
> I'm turning it into a discussion about features.]
>
> On 18 Feb 2002 at 15:14, TD - Sales International Holland B.V. wrote:
> > I've tried -w 30
> > --waitretry=30
> > --wait=30 (I think this one sets the delay between multiple file
> > downloads, though)
> >
> > None of these seem to make wget wait 30 seconds before trying again.
> > Like this, I'm hammering the server.
>
> The --waitretry option waits 1 second before the first retry,
> then 2 seconds, 3 seconds, etc., up to the value specified. So the
> first few retry attempts may still feel like hammering the server,
> but it gradually backs off.
>
> It sounds like you want an option to specify the initial retry
> interval (currently fixed at 1 second), but Wget currently has no
> such option, nor an option to change the amount it increments by
> for each retry attempt (also currently fixed at 1 second).
>
> If such features were to be added, perhaps it could work something
> like this:
>
> --waitretry=n     - same as --waitretry=n,1,1
> --waitretry=n,m   - same as --waitretry=n,m,1
> --waitretry=n,m,i - wait m seconds for the first retry,
>                     incrementing by i seconds for subsequent
>                     retries, up to a maximum of n seconds
>
> The disadvantage of doing it that way is that no one will remember
> which order the numbers should appear in, so an alternative is to
> leave --waitretry alone and supplement it with --waitretryfirst
> and --waitretryincr options.
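
If I understand the proposal correctly, --waitretry=30,5,10 would wait
5, 15, 25, 30, 30, ... seconds across successive retries, and
--waitretry=n,1,1 would reproduce the current behaviour. A quick shell
sketch of that schedule (the three-argument form is only a proposal;
nothing here exists in wget yet):

  n=30; m=5; i=10                 # proposed --waitretry=n,m,i
  wait=$m
  for try in 1 2 3 4 5; do
      echo "retry $try: waiting $wait seconds"
      wait=$(( wait + i ))        # increment by i for each retry...
      [ $wait -gt $n ] && wait=$n # ...capped at the maximum n
  done

This prints 5, 15, 25, 30, 30; with m=1 and i=1 it is exactly the
1, 2, 3, ... second ramp wget 1.7 uses today.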



wget bug?!

2002-02-18 Thread TD - Sales International Holland B.V.

Hey there,

I want to download a file from Mustek's FTP site in America. This site
has a 20-user limit. Have a look at this:

bash-2.05# wget --wait=30 --waitretry=30 -t 0 
ftp://128.121.112.104/pub/1200UBXP/Web.EXE
--15:10:37--  ftp://128.121.112.104/pub/1200UBXP/Web.EXE
   => `Web.EXE'
Connecting to 128.121.112.104:21... connected!
Logging in as anonymous ...
The server refuses login.
Retrying.

--15:10:39--  ftp://128.121.112.104/pub/1200UBXP/Web.EXE
  (try: 2) => `Web.EXE'
Connecting to 128.121.112.104:21... connected!
Logging in as anonymous ...
The server refuses login.
Retrying.

I've tried -w 30
--waitretry=30
--wait=30 (I think this one sets the delay between multiple file
downloads, though)

None of these seem to make wget wait 30 seconds before trying again.
Like this, I'm hammering the server.

Please feel free to smack me if I overlooked anything; I used wget --help
to find these options. The wget version:

bash-2.05# wget --version
GNU Wget 1.7

Copyright (C) 1995, 1996, 1997, 1998, 2000, 2001 Free Software Foundation, 
Inc.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
GNU General Public License for more details.

Originally written by Hrvoje Niksic <[EMAIL PROTECTED]>.

Distributed with Slackware 8.0 by Patrick Volkerding.

Regards,

Ferry van Steen