[email protected] (User Goblin) writes:

> The situation: I'm trying to resume a large recursive download of a site
> with many files (-r -l 10 -c)
>
> The problem: When resuming, wget issues a large number of HEAD requests
> for each file that it already downloaded. This triggers the upstream firewall,
> making the download impossible.

Have you had a look at --wait, --waitretry, and --random-wait?

Maybe spacing out the requests is enough to avoid tripping your
firewall's rate limiting, even though it will slow down the download.
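A minimal sketch of what that could look like (example.com stands in
for your actual site; the wait values are just a guess and may need
tuning against your firewall's thresholds):

```shell
# --wait=2        pause 2 seconds between successive retrievals
# --random-wait   vary the pause between 0.5x and 1.5x of --wait,
#                 so the request pattern looks less mechanical
# --waitretry=10  back off (linearly, up to 10 seconds) between retries
wget -r -l 10 -c --wait=2 --random-wait --waitretry=10 https://example.com/
```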

Regards,
Giuseppe
