Charles Michener wrote:
I have a couple of spider bots hitting my server that I do not wish
to have access to my pages - they ignore robots.txt, so I finally
put them on my 'deny from x' list. This does deny them access,
but they persist in trying - trying each page address at least
You are stopping them inside apache now. The next obvious step is a
firewall, either on the server itself or on a dedicated box in front
of it.
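For the on-server case, something like the following would drop the bots'
packets before they ever reach apache. A minimal sketch using iptables;
the addresses are placeholders - substitute the IPs from your access log
(needs root):

```shell
#!/bin/sh
# Hypothetical bot addresses -- replace with the ones in your logs.
BAD_BOTS="192.0.2.10 192.0.2.11"

for ip in $BAD_BOTS; do
    # DROP (rather than REJECT) discards the packets silently, so the
    # bot wastes time waiting for a response and apache never sees it.
    iptables -A INPUT -s "$ip" -p tcp --dport 80 -j DROP
done
```

With the deny handled in the kernel, apache stops logging the retries as well.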
regs,
Christian
On Sat, Dec 15, 2007 at 12:57:17PM -0800, Charles Michener wrote:
I have a couple of spider bots hitting my server that I do not wish to have
access to my pages - they ignore robots.txt, so I finally put them on my 'deny
from x' list. This does deny them access, but they persist in trying -
trying each page address at least 30 times - several hits per
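For reference, the 'deny from x' approach described above looks roughly like
this in Apache 2.2 configuration (the directory path and addresses here are
placeholders, not taken from the original message):

```apache
# In httpd.conf or an .htaccess file (Apache 2.2 syntax):
<Directory "/var/www/html">
    Order allow,deny
    Allow from all
    # One Deny line per misbehaving bot; whole networks also work,
    # e.g. "Deny from 192.0.2.0/24".
    Deny from 192.0.2.10
    Deny from 192.0.2.11
</Directory>
```

Note that apache still has to accept each connection and answer 403, which is
why the retries keep showing up in the logs.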