I've got a robots.txt and a script that loops to infinity.
Actually, it's a useful page on the server: it shows a list that can be
ordered two ways, and switching from one ordering to the other
increments a parameter at the end of the URL.
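For concreteness, the robots.txt entry might look like this (the path
/list.html is invented for the example):

    User-agent: *
    Disallow: /list.html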

A robot has no business reading that specific page in the first place
(in fact, robots are explicitly disallowed from it), and after a small
number of loops (10 or 15) the webserver starts responding very slowly,
ensuring that the robot writer loses a lot of time on that page.
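A minimal sketch of such a page as a Mason component; the parameter
name "order", the thresholds, and get_items() are all invented for
illustration:

    <%args>
    $order => 0
    </%args>
    <%init>
    # A human flips the ordering once or twice; a robot that follows
    # every link keeps incrementing $order.  Past the threshold,
    # stall each response longer and longer.
    sleep 5 * ($order - 10) if $order > 10;
    my @items = get_items();                # hypothetical data source
    @items = reverse @items if $order % 2;  # the two orderings
    </%init>
    <a href="list.html?order=<% $order + 1 %>">reverse the ordering</a>
    <ul>
    % foreach my $item (@items) {
      <li><% $item %></li>
    % }
    </ul>

The delay grows with each loop, so legitimate visitors never trigger
it, while the deeper a robot goes, the longer it waits per page.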

Assuming a reasonable technology (e.g., Mason), the URL does not even
have to look like a script...
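For instance, with the classic mod_perl setup from the Mason
documentation (a sketch; details vary by installation), every .html
request is handed to Mason, so the component above can live at
/list.html and be indistinguishable from a static page:

    <FilesMatch "\.html$">
        SetHandler  perl-script
        PerlHandler HTML::Mason::ApacheHandler
    </FilesMatch>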
