Re: Throttling, once again

2002-04-25 Thread Randal L. Schwartz
Christian == Christian Gilmore [EMAIL PROTECTED] writes: Christian> Hi, Drew. I came across the very problem you're having. I use mod_bandwidth; it's actively maintained and allows you to monitor bandwidth usage by IP, by directory, or in any number of other ways: http://www.cohprog.com/mod_bandwidth.html

RE: Throttling, once again

2002-04-22 Thread Christian Gilmore
Hi, Jeremy. I looked at the page you mentioned below. It wasn't really clear on the page, but what happens when the requests get above the max allowed? Are the remaining requests queued or are they simply given some kind of error message? The service will respond with an HTTP 503 message
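(For reference, this is roughly what "above the max" means in mod_perl terms: cap how many requests inside a <Location> run at the same time and answer 503 to the rest instead of queueing them. The sketch below is not mod_throttle_access itself; the limit of 5 and the counter-file path are assumptions for illustration.)

  package My::ConcurrencyCap;
  # Minimal PerlAccessHandler sketch: keep a shared counter of in-flight
  # requests and refuse with 503 once it passes MAX_CONCURRENT.
  use strict;
  use Apache::Constants qw(OK);
  use Fcntl qw(:DEFAULT :flock);

  use constant MAX_CONCURRENT => 5;                       # assumed limit
  use constant COUNTER_FILE   => '/tmp/concurrency.cnt';  # assumed path

  sub _adjust {                      # add $delta to the shared counter
      my $delta = shift;
      sysopen(my $fh, COUNTER_FILE, O_RDWR|O_CREAT) or return 0;
      flock($fh, LOCK_EX);
      my $count = <$fh> || 0;
      $count += $delta;
      $count = 0 if $count < 0;
      seek($fh, 0, 0);
      truncate($fh, 0);
      print $fh $count;
      close($fh);                    # releases the lock
      return $count;
  }

  sub handler {
      my $r = shift;
      my $in_flight = _adjust(+1);
      $r->register_cleanup(sub { _adjust(-1) });   # free the slot afterwards
      if ($in_flight > MAX_CONCURRENT) {
          $r->log_reason('too many concurrent requests', $r->uri);
          return 503;                # not queued, just refused
      }
      return OK;
  }
  1;

It would be wired up with a PerlAccessHandler My::ConcurrencyCap line inside the <Location> being protected.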

RE: Throttling, once again

2002-04-22 Thread Christian Gilmore
Hi, Drew. I came across the very problem you're having. I use mod_bandwidth; it's actively maintained and allows you to monitor bandwidth usage by IP, by directory, or in any number of other ways: http://www.cohprog.com/mod_bandwidth.html The size of the data sent through the pipe doesn't reflect the CPU spent

RE: Throttling, once again

2002-04-19 Thread Jeremy Rusnak
Hi, I *HIGHLY* recommend mod_throttle for Apache. It is very configurable. You can get the software at http://www.snert.com/Software/mod_throttle/index.shtml . The best thing about it is the ability to throttle based on bandwidth and client IP. We had problems with robots as well as
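(mod_throttle's own directives are documented at the URL above. For just the client-IP part in plain mod_perl, something along these lines could work; the per-IP counters live in Cache::FileCache rather than shared memory, and the 60-requests-per-60-seconds limit is an assumption.)

  package My::IPThrottle;
  # Sketch of per-client-IP throttling: count requests per IP in a file
  # cache and answer 503 once an IP goes over the limit.
  use strict;
  use Apache::Constants qw(OK);
  use Cache::FileCache;

  use constant MAX_REQUESTS => 60;   # assumed per-IP limit
  use constant PERIOD       => 60;   # seconds

  my $cache = Cache::FileCache->new({
      namespace          => 'ip-throttle',
      default_expires_in => PERIOD,
  });

  sub handler {
      my $r  = shift;
      my $ip = $r->connection->remote_ip;

      my $count = ($cache->get($ip) || 0) + 1;
      $cache->set($ip, $count, PERIOD);   # counter expires PERIOD secs after the last hit

      if ($count > MAX_REQUESTS) {
          $r->log_reason("over " . MAX_REQUESTS . " requests in " . PERIOD . "s", $ip);
          return 503;
      }
      return OK;
  }
  1;

The window here is approximate (it slides with each request rather than resetting on a fixed boundary), which is usually enough to slow a misbehaving client down.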

Re: Throttling, once again

2002-04-19 Thread Marc Slagle
When this happened to our clients' servers we ended up trying some of the mod_perl based solutions. We tried some of the modules that used shared memory, but the traffic on our site quickly filled our shared memory and made the module unusable. After that we tried blocking the agents
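(Blocking by agent needs no shared memory at all; a minimal PerlAccessHandler sketch, with placeholder patterns rather than a recommended blacklist:)

  package My::BlockAgents;
  # Refuse requests whose User-Agent matches a blacklist of patterns.
  use strict;
  use Apache::Constants qw(OK FORBIDDEN);

  my @BAD_AGENTS = (
      qr/EmailSiphon/i,   # placeholder pattern
      qr/WebReaper/i,     # placeholder pattern
  );

  sub handler {
      my $r  = shift;
      my $ua = $r->header_in('User-Agent') || '';
      for my $pat (@BAD_AGENTS) {
          if ($ua =~ $pat) {
              $r->log_reason("blocked agent: $ua", $r->uri);
              return FORBIDDEN;
          }
      }
      return OK;
  }
  1;

Plain Apache can do much the same with SetEnvIf User-Agent plus Deny from env=..., but anything keyed on User-Agent only stops clients that don't bother to fake the header.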

Re: Throttling, once again

2002-04-19 Thread kyle dawkins
Guys, we also have a problem with evil clients. It's not always spiders... in fact more often than not it's some smart-ass with a customised perl script designed to screen-scrape all our data (usually to get email addresses for spam purposes). Our solution, which works pretty well, is to have

Re: Throttling, once again

2002-04-19 Thread Peter Bi
or bandwidth throttles. In the latter cases, one has to call DB/file/memory for history. Peter Bi - Original Message - From: kyle dawkins [EMAIL PROTECTED] To: [EMAIL PROTECTED] Sent: Friday, April 19, 2002 8:02 AM Subject: Re: Throttling, once again Guys, we also have a problem with evil clients

RE: Throttling, once again

2002-04-19 Thread Christian Gilmore
Bill, If you're looking to throttle access to a particular URI (or set of URIs), give mod_throttle_access a look. It is available via the Apache Module Registry and at http://www.fremen.org/apache/mod_throttle_access.html . Regards, Christian - Christian Gilmore Technology

RE: Throttling, once again

2002-04-19 Thread Jeremy Rusnak
Message- From: Christian Gilmore [mailto:[EMAIL PROTECTED]] Sent: Friday, April 19, 2002 8:31 AM To: 'Bill Moseley'; [EMAIL PROTECTED] Subject: RE: Throttling, once again Bill, If you're looking to throttle access to a particular URI (or set of URIs), give mod_throttle_access a look

RE: Throttling, once again

2002-04-19 Thread Drew Wymore
- From: Jeremy Rusnak [mailto:[EMAIL PROTECTED]] Sent: Friday, April 19, 2002 12:06 PM To: Christian Gilmore; [EMAIL PROTECTED] Subject: RE: Throttling, once again Hi, I looked at the page you mentioned below. It wasn't really clear on the page, but what happens when the requests get above

Re: Throttling, once again

2002-04-19 Thread Peter Bi
How about adding an MD5 watermark for the cookie? Well, it is becoming complicated... Peter Bi - Original Message - From: kyle dawkins [EMAIL PROTECTED] To: Peter Bi [EMAIL PROTECTED]; [EMAIL PROTECTED] Sent: Friday, April 19, 2002 8:29 AM Subject: Re: Throttling, once again Peter
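(One way to read the "MD5 watermark" idea: have the cookie value carry a digest of its payload plus a server-side secret, so a scraper can't mint or alter cookies without the server noticing. A small sketch with Digest::MD5; the payload format and the secret are placeholders:)

  package My::CookieWatermark;
  # Cookie value is "payload:digest" where digest = md5_hex(payload . secret).
  use strict;
  use Digest::MD5 qw(md5_hex);

  my $SECRET = 'change-me';          # assumed server-side secret

  sub make_cookie_value {
      my $payload = shift;           # e.g. client id plus issue time
      return join ':', $payload, md5_hex($payload . $SECRET);
  }

  sub check_cookie_value {
      my $value = shift or return;
      my ($payload, $digest) = $value =~ /^(.*):([0-9a-f]{32})$/ or return;
      return unless md5_hex($payload . $SECRET) eq $digest;
      return $payload;               # valid: hand the payload back
  }
  1;

This only proves the cookie wasn't tampered with; it doesn't stop a client from replaying a valid one, which is where the DB/file/memory history mentioned earlier comes back in.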

Re: Throttling, once again

2002-04-18 Thread Matt Sergeant
On Friday 19 April 2002 6:55 am, Bill Moseley wrote: Hi, Wasn't there just a thread on throttling a few weeks ago? I had a machine hit hard yesterday with a spider that ignored robots.txt. I thought the standard practice these days was to put some URL in a place no human would reach,
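(The "URL no human would reach" trick: link it invisibly and disallow it in robots.txt, so only a robot that ignores robots.txt ever requests it, then refuse further requests from any IP that does. A sketch; the trap path and blacklist file are assumptions, and a real version would cache the blacklist rather than re-read it on every request:)

  package My::SpiderTrap;
  # PerlAccessHandler sketch: record the IP of anything that fetches the
  # trap URL, and refuse all later requests from recorded IPs.
  use strict;
  use Apache::Constants qw(OK FORBIDDEN);
  use Fcntl qw(:flock);

  my $TRAP_URI  = '/trap';                    # assumed trap location
  my $BLACKLIST = '/tmp/spider.blacklist';    # assumed path

  sub handler {
      my $r  = shift;
      my $ip = $r->connection->remote_ip;

      # Already caught?  Refuse everything.
      if (-e $BLACKLIST) {
          open(my $in, '<', $BLACKLIST) or return OK;   # fail open
          while (my $line = <$in>) {
              chomp $line;
              if ($line eq $ip) {
                  close $in;
                  $r->log_reason('blacklisted by spider trap', $r->uri);
                  return FORBIDDEN;
              }
          }
          close $in;
      }

      # First visit to the trap: remember the IP and refuse it.
      if ($r->uri eq $TRAP_URI) {
          open(my $out, '>>', $BLACKLIST) or return OK;
          flock($out, LOCK_EX);
          print $out "$ip\n";
          close $out;
          return FORBIDDEN;
      }

      return OK;
  }
  1;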