-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

Pierre,

> I am trying to implement a flood control mechanism to prevent robots
> requesting pages after pages at an "inhuman" rate.

I know you've gotten lots of feedback already, but there's a
super-simple way to do this: put a marker in the request attributes the
first time your filter "sees" it. Check for it each time. When you place
the marker in the request, perform all your magic: check the queue, add
the current request + timestamp, etc. If the marker is already there,
skip everything.

For server-side forwards and includes, the same request object is re-used,
so the marker will remain in place until your final response. (A client-side
redirect starts a brand-new request, so the marker won't carry over there.)
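A minimal, self-contained sketch of the marker idea (class, attribute, and
counter names here are all illustrative; RequestStub just stands in for the
servlet request's attribute map, and in a real Filter you would call
request.getAttribute()/setAttribute() the same way inside doFilter()):

```java
import java.util.HashMap;
import java.util.Map;

public class FloodControlSketch {

    // Use a name unlikely to collide with anyone else's attributes.
    static final String SEEN_MARKER = "com.example.floodcontrol.SEEN";

    // Stand-in for the servlet request's attribute store.
    static class RequestStub {
        private final Map<String, Object> attributes = new HashMap<String, Object>();
        Object getAttribute(String name) { return attributes.get(name); }
        void setAttribute(String name, Object value) { attributes.put(name, value); }
    }

    // Counts how often the bookkeeping actually ran.
    static int checksPerformed = 0;

    // Mirrors the body of Filter.doFilter(): do the bookkeeping only
    // the first time this particular request is seen.
    static void doFilter(RequestStub request) {
        if (request.getAttribute(SEEN_MARKER) == null) {
            request.setAttribute(SEEN_MARKER, Boolean.TRUE);
            // ... the "magic" goes here: check the queue, record the
            // current request plus a timestamp, etc. ...
            checksPerformed++;
        }
        // Marker already present (e.g. the filter ran again after a
        // forward): skip everything and just continue the chain.
    }

    public static void main(String[] args) {
        RequestStub request = new RequestStub();
        doFilter(request); // first pass: bookkeeping runs
        doFilter(request); // second pass, same request: skipped
        System.out.println(checksPerformed);
    }
}
```

Because the marker lives in the request rather than the session, it costs
nothing per-client and disappears automatically when the request completes.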

- -chris

-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.7 (MingW32)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org

iD8DBQFF9shh9CaO5/Lv0PARAu44AJ4hIVOFv/mtsYZeJBD4lVf28hpYJgCfVVzx
XmwRPjAbuG9qfUgvIO4hkTs=
=KOGU
-----END PGP SIGNATURE-----

---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
