Christian Gilmore [EMAIL PROTECTED] writes:
Hi, Drew.
I came across the very problem you're having. I use mod_bandwidth; it's
actively maintained and lets you monitor bandwidth usage by IP, by
directory, or any number of other ways: http://www.cohprog.com/mod_bandwidth.html
Hi, Jeremy.
I looked at the page you mentioned below. It wasn't really
clear from the page, but what happens when the requests get above
the maximum allowed? Are the remaining requests queued, or are they
simply given some kind of error message?
The service will respond with an HTTP 503 (Service Unavailable) message.
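A minimal sketch of that behavior, in Python for illustration (this is not the module's actual code; the cap and Retry-After value are hypothetical): requests over a fixed concurrency limit are rejected immediately with a 503 rather than queued.

```python
# Illustrative sketch: reject requests over a concurrency cap with 503.
import threading

MAX_CONCURRENT = 10  # hypothetical limit
_active = 0
_lock = threading.Lock()

def handle(request_fn):
    """Run request_fn if under the cap, else return a 503 response tuple."""
    global _active
    with _lock:
        if _active >= MAX_CONCURRENT:
            # Over the limit: fail fast instead of queueing.
            return (503, {"Retry-After": "30"}, "Service Unavailable")
        _active += 1
    try:
        return (200, {}, request_fn())
    finally:
        with _lock:
            _active -= 1
```

Failing fast keeps a misbehaving client from tying up server slots, at the cost of occasionally bouncing legitimate bursts.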
The size of the data sent through the pipe doesn't reflect the CPU time spent generating it.
Hi,
I *HIGHLY* recommend mod_throttle for Apache. It is very
configurable. You can get the software at
http://www.snert.com/Software/mod_throttle/index.shtml.
The best thing about it is the ability to throttle based
on bandwidth and client IP. We had problems with robots
as well as ...
When this happened to our clients' servers, we ended up trying some of the
mod_perl-based solutions. We tried some of the modules that used shared
memory, but the traffic on our site quickly filled our shared memory and
made the module unusable. After that we tried blocking the agents ...
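The shared-memory problem above comes from tracking every client ever seen, so the table grows without bound as unique IPs accumulate. One way out, sketched here in Python for illustration (the cap is hypothetical; none of this is any module's actual code), is a bounded table that evicts the least recently seen client:

```python
# Sketch of a bounded per-client hit counter: capping the table size
# keeps a shared-memory segment from filling as unique IPs accumulate.
from collections import OrderedDict

class BoundedCounter:
    def __init__(self, max_entries=1024):
        self.max_entries = max_entries  # hypothetical cap
        self.hits = OrderedDict()       # ip -> hit count, oldest first

    def record(self, ip):
        """Increment the hit count for ip, evicting the oldest entry when full."""
        if ip in self.hits:
            self.hits[ip] += 1
            self.hits.move_to_end(ip)
        else:
            if len(self.hits) >= self.max_entries:
                self.hits.popitem(last=False)  # drop least recently seen
            self.hits[ip] = 1
        return self.hits[ip]
```

The trade-off is that a quiet client's history can be evicted, but heavy abusers stay hot in the table, which is exactly who a throttle cares about.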
Guys
We also have a problem with evil clients. It's not always spiders... in fact,
more often than not it's some smart-ass with a customised Perl script
designed to screen-scrape all our data (usually to get email addresses for
spam purposes).
Our solution, which works pretty well, is to have ...
... or bandwidth throttles. In the latter case, one has to consult a
DB/file/memory store for the request history.
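The "history" mentioned above is typically a sliding window of bytes served per client. A minimal sketch in Python for illustration (window, limit, and the in-memory dict are all hypothetical; a DB or file would play the same role across Apache processes):

```python
# Sketch of a sliding-window bandwidth history: allow a send only if the
# client's bytes over the last WINDOW seconds stay under LIMIT.
import time

WINDOW = 60          # seconds of history to keep (hypothetical)
LIMIT = 1_000_000    # bytes allowed per window (hypothetical)
_history = {}        # ip -> list of (timestamp, bytes_sent)

def allow(ip, nbytes, now=None):
    """Return True if sending nbytes to ip stays within the window limit."""
    now = time.time() if now is None else now
    # Drop events that have aged out of the window.
    events = [(t, b) for t, b in _history.get(ip, []) if now - t < WINDOW]
    used = sum(b for _, b in events)
    if used + nbytes > LIMIT:
        _history[ip] = events
        return False
    events.append((now, nbytes))
    _history[ip] = events
    return True
```

Pruning on every call keeps the per-client list short, but a real deployment would also need locking (or a shared store) since Apache children don't share Python-style process memory.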
Peter Bi
- Original Message -
From: kyle dawkins [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Friday, April 19, 2002 8:02 AM
Subject: Re: Throttling, once again
Guys
We also have a problem with evil clients
Bill,
If you're looking to throttle access to a particular URI (or set of URIs),
give mod_throttle_access a look. It is available via the Apache Module
Registry and at http://www.fremen.org/apache/mod_throttle_access.html.
Regards,
Christian
-
Christian Gilmore
Technology
-----Original Message-----
From: Christian Gilmore [mailto:[EMAIL PROTECTED]]
Sent: Friday, April 19, 2002 8:31 AM
To: 'Bill Moseley'; [EMAIL PROTECTED]
Subject: RE: Throttling, once again
Bill,
If you're looking to throttle access to a particular URI (or set of URIs),
give mod_throttle_access a look
-
From: Jeremy Rusnak [mailto:[EMAIL PROTECTED]]
Sent: Friday, April 19, 2002 12:06 PM
To: Christian Gilmore; [EMAIL PROTECTED]
Subject: RE: Throttling, once again
Hi,
I looked at the page you mentioned below. It wasn't really
clear on the page, but what happens when the requests get above ...
How about adding an MD5 watermark to the cookie? Well, it is becoming
complicated ...
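The watermark idea can be sketched briefly, in Python for illustration (SECRET and the cookie layout are hypothetical, not anything Peter specified): the cookie carries an MD5 digest of its payload plus a server-side secret, so a scraper can't forge or tamper with it without knowing the secret.

```python
# Sketch of an MD5 "watermark" cookie: payload + digest(payload + secret).
import hashlib

SECRET = b"server-side-secret"  # hypothetical key, kept on the server

def make_cookie(client_id: str) -> str:
    mark = hashlib.md5(client_id.encode() + SECRET).hexdigest()
    return f"{client_id}:{mark}"

def check_cookie(cookie: str) -> bool:
    """True only if the watermark matches the payload."""
    client_id, _, mark = cookie.rpartition(":")
    expected = hashlib.md5(client_id.encode() + SECRET).hexdigest()
    return bool(client_id) and mark == expected
```

A modern take would use `hmac` with SHA-256 rather than a bare MD5 concatenation, but the structure, and the "it is becoming complicated" objection, are the same.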
Peter Bi
- Original Message -
From: kyle dawkins [EMAIL PROTECTED]
To: Peter Bi [EMAIL PROTECTED]; [EMAIL PROTECTED]
Sent: Friday, April 19, 2002 8:29 AM
Subject: Re: Throttling, once again
Peter
On Friday 19 April 2002 6:55 am, Bill Moseley wrote:
Hi,
Wasn't there just a thread on throttling a few weeks ago?
I had a machine hit hard yesterday by a spider that ignored robots.txt.
I thought the standard practice these days was to put some URL at an
unreachable (by a human) place,
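The spider-trap practice Bill alludes to can be sketched as follows, in Python for illustration (the path and return codes are hypothetical): a URL that no human would follow, and that robots.txt disallows, is linked invisibly; any client that fetches it has ignored robots.txt and gets its IP blocked from then on.

```python
# Sketch of a spider trap: fetching the hidden, robots.txt-disallowed
# URL marks the client as a misbehaving robot and blocks its IP.
TRAP_PATH = "/no-humans-here/"  # hypothetical path, listed in robots.txt
_blocked = set()

def handle_request(ip, path):
    """Block the IP on a trap hit; otherwise serve unless already blocked."""
    if path == TRAP_PATH:
        _blocked.add(ip)
        return 403
    if ip in _blocked:
        return 403
    return 200
```

In practice the block list would live in a shared store (DB, DBM file) so all Apache children see it, which circles back to the history problem discussed earlier in the thread.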