On Sat, Jun 08, 2002 at 10:30:02PM -0500, Amy Rupp wrote:
> On some sites I cannot download files.
> On one such site I found a file named robots.txt.
> Is this file the cause of wget not downloading the files?
Why not just put robots=off in your .wgetrc?
Read the docs, people! That's why many hours
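For reference, the setting suggested above goes into your `~/.wgetrc` (it can also be passed on the command line as `-e robots=off`):

```
# ~/.wgetrc -- tell wget to ignore robots.txt (use considerately;
# the site owner put it there for a reason)
robots = off
```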
On Sat, Jun 08, 2002 at 10:58:16PM -0500, Amy Rupp wrote:
> Why not just put robots=off in your .wgetrc?
With all due respect, I *DID* read the documentation, which I quoted,
and I attempted to find the latest version available. I didn't post
the original question, and *MY* question still
On Mon, Apr 15, 2002 at 04:20:58AM +0200, Hrvoje Niksic wrote:
> As suggested by Alan E, this patch extends the meaning of timeout to
> include DNS lookups. After this patch, I can't think of any network
> operation still allowed to take more than the specified timeout
> period.
Cool. Thanks. That's
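For anyone following along: the timeout in question is the one set via `--timeout` or in `~/.wgetrc`, which after this patch also caps DNS lookups. For example:

```
# ~/.wgetrc -- give up on any single network operation
# (now including DNS resolution) after 15 seconds
timeout = 15
```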
It would be a lot easier to report this shit to SpamCop if the mailing
list software didn't strip the incoming headers.
--
AlanE
On Sunday 14 April 2002 01:23, you wrote:
> Alan E [EMAIL PROTECTED] writes:
> Does it allow custom rules, such as bonus points for mails that
> mention wget or debug log in the body?
AFAIK, yes. You can put your rules in the local configuration file.
--
AlanE
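If the filter being discussed is SpamAssassin (an assumption; the thread doesn't name it), such a rule in the local configuration file could look like the following, where `MENTIONS_WGET` is a made-up rule name and a negative score marks the mail as less spammy:

```
# ~/.spamassassin/user_prefs -- hypothetical custom rule:
# give a bonus (negative score) to mails whose body mentions wget
body   MENTIONS_WGET   /\bwget\b/i
score  MENTIONS_WGET   -2.0
```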
On Thursday 21 February 2002 05:44, Ian Abbott wrote:
> On 21 Feb 2002 at 1:31, Alan Eldridge wrote:
> You can't get it to work for timing out a socket connection, because
> that is a bit of code that hasn't been implemented yet.
If no one else wants to, I can work up a patch for this next