We've had some instances where a crawler has stumbled onto a CGI script
that refers to itself and started pounding the server with requests to
that CGI.
There are so many CGI scripts on this server that I don't want to
maintain a huge robots.txt file. Any suggestions on other techniques to
keep crawlers away from them?
independent of the OS and file
extensions and associations? If that's true, perhaps that might lead to some
solution to your problem.
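One approach along those lines (a sketch, assuming all the scripts are
served under a single ScriptAlias'd path such as /cgi-bin/ — adjust the
path to your layout) is a single robots.txt rule covering the whole
directory, so no per-script entries are needed:

```
# robots.txt at the document root: one rule covers every script
# under the aliased path, regardless of file extension.
User-agent: *
Disallow: /cgi-bin/
```

Well-behaved crawlers will honor this. For crawlers that ignore
robots.txt, Apache can refuse them directly — e.g. mod_setenvif's
BrowserMatchNoCase to flag the offending User-Agent and a
"Deny from env=" rule inside the <Directory> for the CGI path.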
Mark
-----Original Message-----
Subject: [EMAIL PROTECTED] Blocking crawling of CGIs
From: Tony Rice (trice) [EMAIL PROTECTED]
To: users@httpd.apache.org
Date