>>>>> "Gerald" == Gerald Richter <[EMAIL PROTECTED]> writes:
Gerald> Maybe Apache::SpeedLimit is helpful. It limits the number of pages one
Gerald> client can fetch per unit of time. There are other Apache modules to block
Gerald> robots; look at the Apache module list.
My CPU-based limiter is working quite nicely. It lets oodles of
static pages be served, but if someone starts doing CPU-intensive
stuff, they get booted for hogging my server machine. The nice thing
is that I return a standard "503" error with a "Retry-After" header, so
if it is a legitimate mirroring program, it'll know how to deal with
the error.
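
(For the curious, the general shape in mod_perl 1.x terms is roughly the
sketch below. It's not my actual handler: the package name, the %cpu_used
tally, and the numbers are invented for illustration, and the tally itself
would have to be updated late in each request, say from a log or cleanup
phase.)

    # Sketch only: an access-phase handler that returns a 503 plus
    # Retry-After once a client has burned too much CPU.  Package name,
    # %cpu_used tally, and the thresholds are all invented.
    package Apache::CPULimitSketch;
    use strict;
    use Apache::Constants qw(OK);

    my %cpu_used;        # per-host CPU seconds, updated late in the request
    my $MAX_CPU  = 10;   # seconds allowed before we push back
    my $RETRY_IN = 300;  # what to put in Retry-After

    sub handler {
        my $r = shift;
        my $host = $r->get_remote_host;
        if (($cpu_used{$host} || 0) > $MAX_CPU) {
            # err_header_out so the header survives onto the error response
            $r->err_header_out('Retry-After' => $RETRY_IN);
            return 503;  # HTTP_SERVICE_UNAVAILABLE
        }
        return OK;
    }
    1;
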
Doug - one thing I noticed is that mod_cgi isn't charging the
child-process time to the server anywhere between post-read-request
and log phases. Does that mean there's no "wait" or "waitpid" until
cleanup?
Also, Doug, can there be only one $r->cleanup_handler? I was getting
intermittent results until I changed my ->cleanup_handler into a
push'ed log handler. I also use ->cleanup_handler in other modules, so
I'm wondering if there's a conflict.
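
(Concretely, the change was something along the lines of the single line
below, called from an earlier phase; the handler name is made up for
illustration.)

    # push an extra PerlLogHandler instead of relying on one cleanup handler
    $r->push_handlers(PerlLogHandler => \&My::Module::log_cpu);
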
I also added a DBILogger that logs CPU times, so I can see which pages
on my system are burning the most CPU, and even tell which hosts suck
down the most CPU in a day. mod_perl rules!
--
Randal L. Schwartz - Stonehenge Consulting Services, Inc. - +1 503 777 0095
<[EMAIL PROTECTED]> <URL:http://www.stonehenge.com/merlyn/>
Perl/Unix/security consulting, Technical writing, Comedy, etc. etc.
See PerlTraining.Stonehenge.com for onsite and open-enrollment Perl training!