Hi,

On Wed, 07.02.2007 at 19:08:46 +0100, Marian Hettwer <[EMAIL PROTECTED]> wrote:
> I had the same problem with botnets attacking a specific URL. Even 
> sending out 404 errors didn't help at all.
> I wouldn't recommend the pf overload feature, as this depends on the 
> number of TCP connections to your webserver.

> [ mod_security ...]

> Anytime someone accesses /phpbb2/posting.php, the script 
> fill-blacklist.sh is run:
> 
> ([EMAIL PROTECTED] <~> $ cat /root/bin/fill-blacklist.sh

And doesn't this DoS the server itself? I guess in the case you
mentioned, this script must be run _very_ often.
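
(My guess at what such a script does -- a minimal sketch only, since
the real script isn't quoted here; the table name and the $REMOTE_ADDR
handover are assumptions on my part about how mod_security passes the
client address:

  #!/bin/sh
  # hypothetical fill-blacklist.sh sketch; assumes the offending
  # client address arrives in $REMOTE_ADDR and that pf.conf declares
  # "table <blacklist> persist"
  [ -n "$REMOTE_ADDR" ] || exit 1
  /sbin/pfctl -t blacklist -T add "$REMOTE_ADDR"

If it really is just one pfctl call per hit, the per-run cost is
small; my question is more about how often Apache has to fork it.)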

> Pro: Every bot can access the URL exactly one time; afterwards it's 
> blacklisted.
> Use expiretable to free the pf table occasionally, and of course make 
> sure that you don't block yourself - whitelist IP addresses like your 
> standard gateway, otherwise you may DoS yourself ;)
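
(For the expiry part, the expiretable port does exactly that from
cron; a sketch, with the table name and intervals made up by me:

  # crontab entry: every 30 minutes, remove <blacklist> entries
  # older than 24 hours
  */30 * * * * /usr/local/sbin/expiretable -t 24h blacklist

)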

I'm researching the same problem and so far have arrived at the
following conclusions (feedback & improvement desired!):

 * Blacklisting individual IPs is a sharp-edged knife: effective, but
   easy to cut yourself with, and cumbersome to handle.
 * Some request storms appear to be triggered by an unlucky interaction
   between the server sending PDF files and the client using Internet
   Exploder, which often breaks on range-requests (see the discussions
   around that).
 * Use a non-forking server.
 * Rate limiting, or at least rate limiting per network (e.g. per /16),
   would "solve" the problem for me, and would be maintenance-free.
 * Use it together with connection rate limiting in pf (sketch below).
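
By "connection rate limiting in pf" I mean something along these
lines ($ext_if, the table name and all numbers are placeholders, not
recommendations):

  # pf.conf sketch: trap sources that open connections too fast
  table <flooders> persist
  block in quick from <flooders>
  pass in on $ext_if proto tcp to ($ext_if) port 80 \
      flags S/SA keep state \
      (max-src-conn 50, max-src-conn-rate 20/5, \
       overload <flooders> flush global)

One caveat: pf's max-src-* options track individual source addresses,
not networks, so the per-/16 idea above would still need something
extra on top of this.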

Any comments on this are welcome!

One obvious downside is that one apparently cannot make this work
(e.g. specifically denying range-requests from IE users) with stock
Apache.
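
(The rule one would want looks roughly like this untested mod_rewrite
sketch -- mod_rewrite does ship with stock Apache, but whether its
header tests actually catch the broken IE behaviour is exactly what I
could not confirm:

  RewriteEngine on
  # deny PDF range-requests from MSIE user agents
  RewriteCond %{HTTP_USER_AGENT} MSIE
  RewriteCond %{HTTP:Range} !^$
  RewriteRule \.pdf$ - [F]

)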



Best,
--Toni++
