> Wolfgang Pfeiffer stated the following:

> > But let me point to this: 'wget' does ftp *and* http -- I find it very
> > useful for downloading html-ized manuals: wget is able to follow links on
> > html pages and then download them ... perhaps a bit complicated for
> > beginners like me, but once understood it seems to be very fast and
> > extremely useful for people who want to let the machine do jobs that
> > would otherwise have to be done by humans ... :)

It got snipped earlier, and you must have missed it, but I said I do use
wget. All the time, in fact. However, there are two reasons I don't use it
for some applications (like downloading rpm updates from Red Hat):

1. I can't figure out which option makes wget keep retrying when the server
says it's too busy to accept any more connections. -t doesn't fix this
(the invocation I've been trying is sketched after this list).

2. If you want to kick off multiple downloads (and still get feedback on
each of them), you need to open multiple terminals.
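
For the record, the closest I've gotten is something like this (the flags
are from the wget man page; the host and filename are made up, and it still
bails out when the server rejects the login as "too many users"):

    # resume partial files, retry forever, back off up to 30s between retries
    wget -c -t 0 --waitretry=30 ftp://ftp.example.com/pub/updates/some-update.rpm

-c resumes a partial download, -t 0 means unlimited retries, and --waitretry
sets the maximum wait between retries of a failed transfer. None of that
helps when the server answers the login with a busy message, which is
exactly the case I can't work around.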

Try lftp. You will find it solves both of these most admirably. I can
start several simultaneous downloads and see their status at any time. If
the server is busy, it keeps trying to reconnect.
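
To make that concrete, here's roughly the lftp session I mean; the host
name and file names are made up, and the settings come from the lftp docs
I have, so check them against your version:

    lftp ftp.example.com
    lftp> set net:max-retries 0                # retry a busy server forever
    lftp> set net:reconnect-interval-base 30   # wait ~30s between reconnects
    lftp> get first-update.rpm &               # trailing & backgrounds the job
    lftp> get second-update.rpm &
    lftp> jobs                                 # progress of every running transfer
    lftp> wait all                             # block until they all finish

One lftp session, several transfers, one place to watch them -- and it
keeps reconnecting on its own when the server is busy.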

On Thu, 14 Dec 2000, SoloCDM wrote:
> 
> Ultimately, it boils down to one thing -- whatever gets the job done.

Amen, brother.
