Another idea might be to sleep for a time proportional to the length of the URL: a 1k request gets 1 second, a 2k request gets 2 seconds. At least that would slow them down.
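For what it's worth, a minimal sketch of that idea in Perl, assuming a CGI-ish setup where Apache has put the request line in REQUEST_URI (the handler itself is hypothetical, not anything I actually run):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Assumption: running under a web server that sets REQUEST_URI.
    my $url = $ENV{REQUEST_URI} || '';

    # One second per 1k of URL, so oversized probe requests
    # pay for their own length before getting a response.
    my $delay = int(length($url) / 1024);
    sleep($delay) if $delay > 0;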

Fortunately, these attacks are not DoS, just scans looking around for boxes to 0wn. So it's just a matter of rotating the logs properly, and compressing them if I want to keep the old ones around. (At this rate, a year's worth of uncompressed logs would consume my hard drive.)
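A rough sketch of the rotate-and-compress step, again in Perl, assuming the log lives at /var/log/myserver/access.log (a made-up path) and that gzip is on the PATH:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use POSIX qw(strftime);

    my $log   = '/var/log/myserver/access.log';   # hypothetical path
    my $stamp = strftime('%Y%m%d', localtime);

    # Move the live log aside, then compress the copy.
    rename($log, "$log.$stamp") or die "rename: $!";
    system('gzip', "$log.$stamp") == 0 or die "gzip failed";

One catch: after the rename, the server keeps writing to the old inode, so it still needs a graceful restart or a SIGHUP to reopen the log file.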


And, of course, there's the problem of viewing the logs, which is where this thread comes closest to being on topic here. Writing a viewer will help me understand the Perl approach to parsing characters. Not that I don't understand it, but it's hard to get used to for a guy who essentially cut his teeth porting Forth.
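As a starting point for the viewer, a sketch of pulling apart one Common Log Format line with a Perl regex (the fields are the standard CLF ones, not anything specific to my logs):

    #!/usr/bin/perl
    use strict;
    use warnings;

    while (my $line = <>) {
        # CLF: host ident user [date] "request" status bytes
        if ($line =~ m/^(\S+) (\S+) (\S+) \[([^\]]+)\] "([^"]*)" (\d{3}) (\S+)/) {
            my ($host, $date, $req, $status) = ($1, $4, $5, $6);
            printf "%-15s %s  %s  %s\n", $host, $status, $date, $req;
        }
    }

Run it as "perl viewer.pl access.log" and it prints one cleaned-up line per hit, which already makes the probe requests easy to spot.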

--
Joel Rees


