Charles Michener wrote:
I have a couple of spider bots hitting my server that I do not want
to have access to my pages. They ignore robots.txt, so I finally
put them on my 'deny from xxxxx' list. This does deny them access,
but they persist, retrying each page address at least 30 times at
several hits per second. Is there a standard method to forward them
to some black hole or the FBI or ...?
---------------- End original message. ---------------------
This is the kind of thing a router/firewall will handle for you.
Stopping these requests before they ever reach your machine is the
best way to deal with them. That said, sending a forbidden response
back to the offenders doesn't really have much impact on server
performance. It takes a little processing, but the cost per request
is insignificant.
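For reference, the 'deny from' approach described above looks something
like this in the Apache configuration. This is a minimal sketch assuming
Apache 2.2-era mod_access syntax; the 192.0.2.0/24 range is a made-up
placeholder for the offending crawler's addresses:

    <Directory "/var/www/html">
        Order Allow,Deny
        Allow from all
        # Requests from this range get a 403 Forbidden response
        Deny from 192.0.2.0/24
    </Directory>

Apache still has to accept the connection and parse each request in
order to send the 403, which is the small per-request cost mentioned
above.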
Hopefully they will eventually give up, but if they don't, look into
using a firewall to deny them at the edge of your network.
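If you go that route, the rule is typically a one-liner. Here is a
minimal sketch assuming a Linux firewall running iptables; 192.0.2.15
is a hypothetical address for the offending bot:

    # Silently drop all TCP traffic from the bot to port 80
    iptables -A INPUT -s 192.0.2.15 -p tcp --dport 80 -j DROP

Dropping (rather than rejecting) means the bot's connection attempts
simply time out, and the packets never reach Apache at all.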
Dragon
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
We came, we danced, we drank (and were caught by the noses of the dogs)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~