Does WGet have a "pause" functionality where you can
pause a download that is currently in progress?
And will the file being downloaded be corrupted if I
kill WGet and spawn another one a few minutes later,
asking it to resume from where it was forced to stop
last time?

An accompanying (shell) script would be nice for
illustrative purposes, though it need not be
syntactically correct ;)
It should just highlight the basics - and note that it
should be as automated as possible.
(For that matter, you can submit pseudocode, or just
the details in plain ol' English!)

If you feel like piping to 'sed' to check the server
headers for 'Range' support - by all means use it in
the script, but please show it.
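Something along these lines is what I have in mind (a rough,
untested sketch; the URL is the one from my command further
down, and Accept-Ranges is only advisory, so this is a hint
rather than a guarantee):

URL="http://www.mosfet.com/downloads/something"

# --spider makes wget probe the URL without saving the body;
# -S/--server-response prints the HTTP reply headers (to stderr,
# hence the 2>&1). sed then pulls out the Accept-Ranges header,
# which is the usual hint that the server can honour byte-range
# (i.e. resume) requests.
wget --spider --server-response "$URL" 2>&1 \
  | sed -n 's/^[ 0-9]*[Aa]ccept-[Rr]anges:[ ]*//p'
# Prints "bytes" if the server advertises range support, and
# nothing (or "none") if it does not.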

Basically, I need to do automated downloads of HUGE
(>=100MB) files using WGet with minimal human
interaction - at most once every 48 hours or so! So
WGet prompting the user in case of problems etc. is
not a very good idea! It should be as automated as
possible.
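
For the scheduling part, I was thinking of driving the whole
thing from cron, something like this (the path and the time of
day are hypothetical):

# Hypothetical crontab entry: run the download script at 02:00 on
# every second day of the month and append all output to a log,
# so nobody has to babysit it. (The */2 step resets at month
# boundaries, so it is only roughly every 48 hours, which is good
# enough here.)
0 2 */2 * * /home/susi/bin/fetch_big_file.sh >> /home/susi/fetch.log 2>&1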

Currently, I am using the following:

./wget --output-file=test_txt --append-output=test_txt \
  --tries=20 --continue \
  --header="From: SuSI<[EMAIL PROTECTED]>" \
  http://www.mosfet.com/downloads/something
But as you know, WGet will just REFUSE to download
files in 'continue' mode if the server does not
support resuming.
(For example, http://www.mosfet.com may be load
balanced across 3-4 servers, and any one of them may
or may not support resuming. So every time I
stop/start/resume a download, I have to check afresh
whether the server supports resuming, since I may now
be connected to a different server - say one which
does NOT support resuming.)
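
Here is roughly what I imagine the wrapper doing (a sketch only,
not a tested script; the file names are placeholders, and I know
that an Accept-Ranges header on a HEAD probe does not guarantee
the next GET will hit the same backend):

#!/bin/sh
# Sketch of the intended logic, not a tested script.

URL="http://www.mosfet.com/downloads/something"
OUT="something"        # the name wget saves under (basename of the URL)
LOG="test_txt"         # same log file as in my current command

# Probe whichever server we reach *right now* for byte-range support.
if wget --spider --server-response "$URL" 2>&1 \
     | grep -qi 'Accept-Ranges:.*bytes'
then
    RESUME="--continue"
else
    # This backend does not advertise ranges, so a half-finished
    # file is useless to --continue: throw it away and start over.
    RESUME=""
    rm -f "$OUT"
fi

# No prompts, retry on its own, log everything.
wget $RESUME --tries=20 \
     --append-output="$LOG" \
     --header="From: SuSI<[EMAIL PROTECTED]>" \
     "$URL"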

I am allowed to use WGet 1.9.1 on a 512 KBps DSL line
with a 128 KBps satellite link fallback in case of
problems - so you may take these parameters into
account if you can optimize I/O.
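
If it helps, this is the sort of knob-twiddling I had in mind for
the satellite fallback (again just a sketch; the rate cap and the
timeout values are placeholders to be tuned to whatever the link
actually delivers, not measured figures):

# Extra knobs when falling back to the 128 KBps satellite link:
# cap the transfer rate so the link is not saturated, and stretch
# the timeouts because satellite round-trips are long.
wget --continue --tries=20 \
     --limit-rate=100k \
     --timeout=120 --waitretry=60 \
     --append-output=test_txt \
     http://www.mosfet.com/downloads/something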

Regards

Subhobroto Sinha
http://www.geocities.com/subhobrotosinha
