> On 20/03/12 20:11, Stephan Schulz wrote:
>> Hi all!
>>
>> When mirroring sites I limit the download rate, which in turn results in
>> durations longer than 24 hours. I'm on a DSL line that is disconnected
>> every 24 hours, so there is a network downtime during which wget gets
>> unresolved hostnames. The problem is that files tried during this downtime
>> are not tried again, regardless of --tries. How can I force wget into
>> retrying them?
>>
>> Best regards,
>> Stephan
>
> Is that 24h period a fixed time? Can you know in advance the point at which
> the next downtime will happen (because you note the time of the last
> downtime, it is always at midnight, etc.)?
>
> In that case, it would be quite easy to prepare a script that stops your
> wget a few minutes before the downtime and continues it after the network
> has resumed. The downtime would then be instantaneous for wget, avoiding
> your issue.
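The suggestion above could be sketched roughly as follows. This is only an illustration, not a tested solution: it assumes a GNU `date` (for the `-d` option), that the reconnect happens at a fixed local time (23:55 and the 10-minute window here are placeholders), and the mirror URL is hypothetical. It pauses the wget process with SIGSTOP before the expected downtime and resumes it with SIGCONT afterwards, so wget never sees the dead network.

```shell
#!/bin/sh
# Hypothetical wrapper: pause wget around a known daily reconnect window.

# Seconds from now until the next occurrence of a given clock time (HH:MM).
# GNU date assumed for the -d option.
seconds_until() {
    target=$(date -d "$1" +%s)
    now=$(date +%s)
    if [ "$target" -le "$now" ]; then
        target=$(date -d "tomorrow $1" +%s)
    fi
    echo $((target - now))
}

# Run the mirror, suspending it across the reconnect window each day.
run_mirror() {
    wget --limit-rate=100k --mirror "$1" &
    pid=$!
    while kill -0 "$pid" 2>/dev/null; do
        sleep "$(seconds_until 23:55)"   # pause a few minutes before reconnect
        kill -STOP "$pid"                # wget sleeps through the downtime
        sleep 600                        # wait out the reconnect window
        kill -CONT "$pid"                # resume once the line is back
    done
}

# Example (hypothetical URL):
# run_mirror http://example.com/
```

As you say, this is fragile against downtime that does not happen on schedule; it only covers the predictable daily reconnect.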
The 24h period is fixed, but unfortunately such a script would be very fragile. For example, the connection could also be lost because of a failure on the provider's side. That is why I hoped wget could be forced into retrying hostname resolution more than just once. If that's not possible, I'll try to hack something together in the wget source, as that looks more reliable to me than a script that tries to control wget externally.

Best regards,
Stephan
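A middle ground between patching wget and timing the downtime might be an external retry loop: simply re-run the whole mirror until wget exits successfully. Since `--mirror` implies timestamping (`-N`), files already fetched are skipped on the next pass, and files that failed during a downtime get tried again. A minimal sketch, assuming the same hypothetical URL and rate limit as before; the retry delay is made overridable only so the logic can be exercised quickly:

```shell
#!/bin/sh
# Hypothetical workaround: restart the mirror until it completes cleanly.
# Files that failed during a downtime are retried on the next pass, because
# --mirror's timestamping skips files that are already up to date.

retry_mirror() {
    until "$@"; do
        echo "command failed (exit $?); retrying" >&2
        sleep "${RETRY_DELAY:-300}"   # back off 5 minutes by default
    done
}

# Example (hypothetical URL):
# retry_mirror wget --limit-rate=100k --mirror http://example.com/
```

This is cruder than retrying DNS resolution inside wget, since each pass re-walks the site to find the missing files, but it survives unscheduled drops such as a provider-side failure.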
