I have a couple of spider bots hitting my server that I do not wish to have 
access to my pages. They ignore robots.txt, so I finally put them on my 'deny 
from xxxxx' list. This does deny them access, but they keep trying anyway - 
retrying each page address at least 30 times, at several hits per second. Is 
there a standard method to forward them to some black hole, or the FBI, or ...?
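For context, the 'deny from' approach I am using looks roughly like the
following .htaccess sketch (Apache 2.2 mod_authz_host syntax; the address
shown is a documentation placeholder, not the actual bot's IP):

```apacheconf
# Sketch of the current setup: refuse the offending crawler by source address.
# 192.0.2.0/24 is a placeholder range - substitute the real bot addresses.
Order allow,deny
Allow from all
Deny from 192.0.2.0/24
```

This returns 403 Forbidden for every request, which is exactly why the bot
still generates log traffic on each attempt.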

Charles
