On Fri, Oct 24, 2008 at 08:29:43PM +0100, Mário Gamito wrote:
> Hi,
> 
> I have this site that has a directory with some files.
> A few weeks ago, two web bots started sucking those files at an impressive 
> rate.


Use a robots.txt file in the site's document root.  Crawlers only
look for it at the top level of the site, e.g. /robots.txt.

http://www.robotstxt.org/
http://en.wikipedia.org/wiki/Robots.txt
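As a minimal sketch -- the directory name /files/ below is just a
placeholder for whatever directory the bots are hammering:

    # Ask all crawlers to stay out of the heavily-fetched directory.
    # robots.txt is advisory; only well-behaved bots honor it.
    User-agent: *
    Disallow: /files/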

If they ignore it (robots.txt is purely advisory, so badly behaved
bots will not honor it), then use iptables to block them at the
firewall.  That takes the strain off httpd entirely, since the
requests never reach it.
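Something like the following, assuming the bots come from a fixed
address -- 203.0.113.5 here is only a documentation placeholder,
substitute the address you see in your access logs:

    # Drop all packets from the offending host before httpd sees them.
    iptables -A INPUT -s 203.0.113.5 -j DROP

Dropping at the firewall is cheaper than a Deny directive in httpd,
which still has to accept the connection before refusing it.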


-- 
/*********************************************************************\
**
** Joe Yao                              [EMAIL PROTECTED] - Joseph S. D. Yao
**
\*********************************************************************/
