Request Limiter

2002-01-14 Thread Ken Miller



There was a module floating around a while back that did 
request limiting (a DOS-prevention tool). I've searched the 
archives (unsuccessfully), and I was wondering if anyone knows what the heck I'm 
talking about.

I thought it was on Matt Sergeant's web site, but for the life 
of me I can't remember what the URL is.

Can someone help? 

My next question is: if I can't find the module, in what 
phase would I place a request limiter? Should it just go at the head of 
the PerlHandler chain, or earlier in the request cycle?

(I do have 'the book', but unfortunately, it's elsewhere right 
now).

Thanks!

 -klm.




Re: Request Limiter

2002-01-14 Thread Geoffrey Young

 Ken Miller wrote:
 
 There was a module floating around a while back that did request
 limiting (a DOS-prevention tool).  I've searched the archives
 (unsuccessfully), and I was wondering if anyone knows what the heck
 I'm talking about.

maybe you had Stonehenge::Throttle in mind?

http://www.stonehenge.com/merlyn/LinuxMag/col17.html

 
 I thought it was on Matt Sergeant's web site, but for the life of me
 I can't remember what the url is.
 
 Can someone help?
 
 My next question would be, if I can't find the module, is what phase
 would I place a request limiter?  Should it just go at the head of
 the PerlHandler chain, or earlier in the request phase?

PerlHandlers are for delivering content.  PerlAccessHandlers are for
restricting access.  If you're really feeling the load, you can use a
PerlPostReadRequestHandler, which serves as a kind of general-purpose
stage that runs early in the request cycle, so you can nab the bad
requests as early as possible.
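For the archives, a minimal PerlAccessHandler throttle might look something like the sketch below. The module name, limits, and per-child %hits hash are all illustrative (this is not Stonehenge::Throttle's code), and note the counters live in each Apache child, not shared memory:

```perl
# Hypothetical My::Throttle -- a bare-bones access-phase limiter sketch.
# httpd.conf:  PerlAccessHandler My::Throttle
package My::Throttle;
use strict;
use Apache::Constants qw(OK FORBIDDEN);

my %hits;           # per-child state: ip => [count, window_start]
my $LIMIT  = 30;    # max requests allowed...
my $WINDOW = 60;    # ...per 60-second window

sub handler {
    my $r   = shift;
    my $ip  = $r->connection->remote_ip;
    my $now = time;
    my $rec = $hits{$ip} ||= [0, $now];
    @$rec = (0, $now) if $now - $rec->[1] > $WINDOW;  # stale window: reset
    return FORBIDDEN if ++$rec->[0] > $LIMIT;         # over the limit
    return OK;
}
1;
```

Running it in the access phase means the content handler never fires for refused requests; for shared counts across children you'd want IPC::Shareable or a DBM file instead of a plain hash.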

HTH

--Geoff



Re: Request Limiter

2002-01-14 Thread Mark Maunder

Geoffrey Young wrote:

  Ken Miller wrote:
 
  There was a module floating around a while back that did request
  limiting (a DOS-prevention tool).  I've searched the archives
  (unsuccessfully), and I was wondering if anyone knows what the heck
  I'm talking about.

 maybe you had Stonehenge::Throttle in mind?


I wrote something a while back in response to users holding down the F5
key in IE and DOS'ing our website. It's called Apache::GateKeeper and is
more polite than Throttle in that it serves cached content to the client
instead of sending a 'come back later' message. It's configurable: after
exceeding a first threshold the client gets content from the shared-memory
cache, and if a second threshold is exceeded (OK, this guy is getting
REALLY irritating) they get the 'come back later' message. They will
only get cached content if they exceed x number of requests within y
number of seconds.
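The two-threshold idea can be sketched in a few lines of plain Perl. The constants and the verdict() name below are illustrative, not GateKeeper's actual API:

```perl
# Hypothetical two-threshold decision, tracking request times per client.
use strict;
use constant WINDOW    => 10;   # seconds per window
use constant CACHE_AT  => 5;    # over this many hits/window: serve from cache
use constant REFUSE_AT => 20;   # over this many: 'come back later'

my %seen;   # client_id => arrayref of recent request timestamps

sub verdict {
    my ($id) = @_;
    my $now  = time;
    my $t    = $seen{$id} ||= [];
    push @$t, $now;
    @$t = grep { $now - $_ <= WINDOW } @$t;   # drop hits outside the window
    return 'refuse' if @$t > REFUSE_AT;       # second threshold exceeded
    return 'cache'  if @$t > CACHE_AT;        # first threshold exceeded
    return 'serve';                           # normal request
}
```

The sliding window (rather than fixed buckets) means a client can't burst right at a bucket boundary.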

It works with Apache::Filter and there are two components -
Apache::GateKeeper which is the first handler in the line of filters, and
Apache::GateKeeper::Gate, which is the last in the line of filters and
does the caching of content which will be served to the client if they are
naughty.

I would have liked to write this so that it just drops into an existing
mod_perl app, but I couldn't find a way to grab an application's output
before it got sent to the client for storage in the cache, so I set it up
with Apache::Filter. Any suggestions on how to solve this?

I've put the source on http://www.swiftcamel.com/gatekeeper.tgz

It isn't packaged at all, and only includes the two modules I've grabbed
straight out of our app - Apache::GateKeeper and Apache::GateKeeper::Gate.
Currently this uses pnotes to pass POST data and messages between modules
that are in the Apache::Filter chain, so it's really not the kind of thing
you can drop into an app.

Any ideas on how to write a version of this that one CAN simply drop into
an existing application would be most welcome.

~mark.




Re: Request Limiter

2002-01-14 Thread Perrin Harkins

 It's configurable so after
 exceeding a threshold the client gets content from the shared memory
 cache, and if a second threshold is exceeded (ok this guy is getting
 REALLY irritating) then they get the 'come back later' message. They will
 only get cached content if they exceed x number of requests within y
 number of seconds.

Nice idea.  I usually prefer to just send an ACCESS DENIED if someone is
behaving badly, but a cached page might be better for some situations.

How do you determine individual users?  IP can be a problem with large
proxies.  At eToys we used the session cookie if available (we could verify
that it was not faked by using a message digest) and would fall back to the
IP if there was no cookie.
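The digest-verified cookie with IP fallback might be sketched like this; the cookie name, sid:sig format, and server-side secret are all assumptions for illustration, not the eToys code:

```perl
# Hypothetical client_id() -- trust the session cookie only if its
# digest checks out, otherwise fall back to the remote IP.
use strict;
use Digest::MD5 qw(md5_hex);

my $SECRET = 'server-side secret';   # never sent to the client

sub client_id {
    my ($r) = @_;    # mod_perl 1 request object
    my ($cookie) = ($r->header_in('Cookie') || '') =~ /session=([^;]+)/;
    if ($cookie) {
        my ($sid, $sig) = split /:/, $cookie;
        # a forged cookie won't have a matching digest
        return $sid if $sig && $sig eq md5_hex($sid . $SECRET);
    }
    return $r->connection->remote_ip;
}
```

Keying throttle state on this ID rather than the raw IP keeps one big proxy from tripping the limiter for everyone behind it.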

 Any ideas on how to write a version of this that one CAN simply drop into
 an existing application would be most welcome.

It's hard to do that without making assumptions about the way the content
is cached.  Personally, I prefer to make this kind of thing an AccessHandler
rather than using Apache::Filter, but your approach makes sense for your
method of caching.

- Perrin




RE: Request Limiter

2002-01-14 Thread Christian Gilmore

If you're looking for limiting simultaneous requests to a URI resource
(and not the entire server, which can be handled by MaxClients), you may
be looking for mod_throttle_access. It can be found at
http://modules.apache.org/search?id=232.

Regards,
Christian

-
Christian Gilmore
Team Lead
Web Infrastructure & Tools
IBM Software Group


-Original Message-
From: Ken Miller [mailto:[EMAIL PROTECTED]]
Sent: Monday, January 14, 2002 12:14 PM
To: [EMAIL PROTECTED]
Subject: Request Limiter


There was a module floating around a while back that did request limiting
(a DOS-prevention tool).  I've searched the archives (unsuccessfully),
and I was wondering if anyone knows what the heck I'm talking about.

I thought it was on Matt Sergeant's web site, but for the life of me I
can't remember what the URL is.

Can someone help?

My next question is: if I can't find the module, in what phase would
I place a request limiter?  Should it just go at the head of the
PerlHandler chain, or earlier in the request cycle?

(I do have 'the book', but unfortunately, it's elsewhere right now).

Thanks!

-klm.




Re: Request Limiter

2002-01-14 Thread Mark Maunder

Perrin Harkins wrote:

  It's configurable so after
  exceeding a threshold the client gets content from the shared memory
  cache, and if a second threshold is exceeded (ok this guy is getting
  REALLY irritating) then they get the 'come back later' message. They will
  only get cached content if they exceed x number of requests within y
  number of seconds.

 Nice idea.  I usually prefer to just send an ACCESS DENIED if someone is
 behaving badly, but a cached page might be better for some situations.

 How do you determine individual users?  IP can be a problem with large
 proxies.  At eToys we used the session cookie if available (we could verify
 that it was not faked by using a message digest) and would fall back to the
 IP if there was no cookie.


I'm also using cookies with a digest. There's also the option of using the IP
instead, which I added as an afterthought since my site requires cookie
support.  But I have nightmares of large corporate proxies seeing the same page
over and over.

I wonder if this would be easier to implement as a drop-in with mod_perl 2,
since filters are supposed to replace handlers? And while I'm at it, is there
a mod_perl 2 users (or testers) mailing list yet?