> "Christian" == Christian Gilmore <[EMAIL PROTECTED]> writes:
Christian> Hi, Drew.
>> I came across the very problem you're having. I use mod_bandwidth, its
>> actively maintained, allows via IP, directory or any number of ways to
>> monitor bandwidth usage http://www.cohprog.com/mod_bandwid
Hi, Drew.
> I came across the very problem you're having. I use mod_bandwidth, its
> actively maintained, allows via IP, directory or any number of ways to
> monitor bandwidth usage http://www.cohprog.com/mod_bandwidth.html
The size of the data sent through the pipe doesn't reflect the CPU time spent.
Hi, Jeremy.
> I looked at the page you mentioned below. It wasn't really
> clear on the page, but what happens when the requests get above
> the max allowed? Are the remaining requests queued or are they
> simply given some kind of error message?
The service will respond with an HTTP 503 (Service Unavailable) message.
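Christian's reject-over-the-limit behavior can be sketched as a small decision function. This is an illustrative Python sketch, not his actual module; the limit value and the Retry-After hint are assumptions for the example:

```python
MAX_CONCURRENT = 10  # assumed limit; tune per service

def admit(active_count, limit=MAX_CONCURRENT):
    """Decide how to answer a new request given the current in-flight count."""
    if active_count >= limit:
        # Over the limit: reject immediately rather than queueing the request,
        # and tell well-behaved clients when it is worth retrying.
        return 503, {"Retry-After": "30"}
    return 200, {}
```

The key design point is that excess requests are refused up front with 503 instead of being queued, so a flood cannot tie up server processes.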
ot;kyle dawkins" <[EMAIL PROTECTED]>
> To: "Peter Bi" <[EMAIL PROTECTED]>; <[EMAIL PROTECTED]>
> Sent: Friday, April 19, 2002 8:29 AM
> Subject: Re: Throttling, once again
>
> > Peter
> >
> > Storing the last access time, etc in a cookie
How about adding an MD5 watermark to the cookie? Well, it is becoming
complicated.
Peter Bi
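Peter's MD5-watermark idea amounts to signing the cookie so a client cannot forge its own last-access time. A minimal sketch in Python (illustrative only; the secret, field name, and HMAC-MD5 construction are assumptions, not anything from the thread):

```python
import hashlib
import hmac

SECRET = b"server-side secret"  # assumed: known only to the server

def sign_cookie(value: str) -> str:
    # Append an HMAC-MD5 "watermark" so tampering with the value is detectable.
    mac = hmac.new(SECRET, value.encode(), hashlib.md5).hexdigest()
    return f"{value}.{mac}"

def verify_cookie(cookie: str):
    # Recompute the watermark and compare in constant time; reject on mismatch.
    value, _, mac = cookie.rpartition(".")
    expected = hmac.new(SECRET, value.encode(), hashlib.md5).hexdigest()
    return value if hmac.compare_digest(mac, expected) else None
```

Using HMAC rather than a bare MD5 of value+secret avoids length-extension issues, at the cost of nothing; the cookie itself stays readable, only forgery is prevented.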
-Original Message-
From: Jeremy Rusnak [mailto:[EMAIL PROTECTED]]
Sent: Friday, April 19, 2002 12:06 PM
To: Christian Gilmore; [EMAIL PROTECTED]
Subject: RE: Throttling, once again
Hi,
I looked at the page you mentioned below. It wasn't really
clear on the page, but what happens when the requests get above
the max allowed? Are the remaining requests queued or are they
simply given some kind of error message?
Jeremy
-Original Message-
From: Christian Gilmore [mailto:[EMAIL PROTECTED]]
Sent: Friday, April 19, 2002 8:31 AM
To: 'Bill Moseley'; [EMAIL PROTECTED]
Subject: RE: Throttling, once again
Bill,
If you're looking to throttle access to a particular URI (or set of URIs),
giv
On 19-Apr-2002 Bill Moseley wrote:
> Also, does anyone have suggestions for testing once throttling is in place?
> I don't want to start cutting off the good customers, but I do want to get
> an idea how it acts under load. ab to the rescue, I suppose.
wget supports recursive spidering. Or try
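In the spirit of "ab to the rescue": a throwaway load generator only needs to fire concurrent requests and count the status codes that come back, so you can see 503s appear as the throttle kicks in. A hedged Python sketch (URL and counts are placeholders, and this is not a substitute for ab's timing stats):

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.error import HTTPError
from urllib.request import urlopen

def hit(url):
    """Fetch one URL and return its HTTP status code."""
    try:
        with urlopen(url) as r:
            return r.status
    except HTTPError as e:
        return e.code  # 503 etc. arrives as an HTTPError

def tally(codes):
    """Count how many times each status code occurred."""
    return {c: codes.count(c) for c in set(codes)}

def hammer(url, total=100, concurrency=20):
    # Fire `total` requests, at most `concurrency` in flight at once.
    with ThreadPoolExecutor(max_workers=concurrency) as ex:
        return tally(list(ex.map(hit, [url] * total)))
```

Run it against a staging copy first; the whole point of the exercise is that real customers should never see the 503s you are provoking here.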
Hi Bill,
> Wasn't there just a thread on throttling a few weeks ago?
There have been many. Here's my answer to one of them:
http://mathforum.org/epigone/modperl/blexblolgang/004101c0f2cc$9d14a540$[EMAIL PROTECTED]
> Anyway, I remember Randal's Stonehen
ology Leader
GeT WW Global Applications Development
IBM Software Group
-Original Message-
From: Bill Moseley [mailto:[EMAIL PROTECTED]]
Sent: Friday, April 19, 2002 12:56 AM
To: [EMAIL PROTECTED]
Subject: Throttling, once again
Hi,
Wasn't there just a thread on throttling a few weeks ago?
CPU or bandwidth throttles. In the latter cases, one has to call DB/file/memory for
history.
Peter Bi
- Original Message -
From: "kyle dawkins" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Friday, April 19, 2002 8:02 AM
Subject: Re: Throttling, once again
> Guys
Guys
We also have a problem with evil clients. It's not always spiders... in fact
more often than not it's some smart-ass with a customised perl script
designed to screen-scrape all our data (usually to get email addresses for
spam purposes).
Our solution, which works pretty well, is to have
When this happened to our clients' servers we ended up trying some of the
mod_perl based solutions. We tried some of the modules that used shared
memory, but the traffic on our site quickly filled our shared memory and
made the module unusable. After that we tried blocking the agents
altogether,
kind of thing,
but the Apache module makes it so much nicer.
Jeremy
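The "shared memory fills up" failure Jeremy describes is avoidable if the per-client history is capped: evict the least-recently-seen clients instead of tracking everyone forever. A hedged sketch of that idea in Python (the limits, the IP key, and the class itself are assumptions for illustration, not any of the modules discussed):

```python
import time
from collections import OrderedDict

class BoundedThrottle:
    """Sliding-window hit counter with a hard cap on tracked clients."""

    def __init__(self, max_clients=1000, window=60.0, max_hits=30):
        self.max_clients = max_clients  # cap on remembered IPs
        self.window = window            # seconds of history per IP
        self.max_hits = max_hits        # allowed hits per window
        self.seen = OrderedDict()       # ip -> timestamps of recent hits

    def allow(self, ip, now=None):
        now = time.time() if now is None else now
        # Drop expired hits, record this one, and mark ip most recently used.
        hits = [t for t in self.seen.pop(ip, []) if now - t < self.window]
        hits.append(now)
        self.seen[ip] = hits
        # Evict the least recently seen IPs so memory stays bounded.
        while len(self.seen) > self.max_clients:
            self.seen.popitem(last=False)
        return len(hits) <= self.max_hits
```

Eviction trades accuracy for safety: a client that goes quiet long enough is forgotten, but the table can never grow past `max_clients`, which is exactly the property the shared-memory modules lacked.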
-Original Message-
From: Bill Moseley [mailto:[EMAIL PROTECTED]]
Sent: Thursday, April 18, 2002 10:56 PM
To: [EMAIL PROTECTED]
Subject: Throttling, once again
Hi,
Wasn't there just a thread on throttling a few weeks ago?
On Friday 19 April 2002 6:55 am, Bill Moseley wrote:
> Hi,
>
> Wasn't there just a thread on throttling a few weeks ago?
>
> I had a machine hit hard yesterday with a spider that ignored robots.txt.
I thought the standard practice these days was to put some URL at an
unreachable place (by a human)
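The trap-URL trick being alluded to: publish a link no human would follow (hidden in the page, disallowed in robots.txt) and ban any IP that fetches it, since only a robot ignoring robots.txt will get there. A minimal Python sketch; the path, the in-memory ban set, and the handler shape are all assumptions for illustration:

```python
TRAP_PATH = "/no-humans-here/"  # assumed trap URL, also Disallow'ed in robots.txt
banned = set()                  # in production this would persist across processes

def handle(ip, path):
    """Return the HTTP status to serve for this request."""
    if ip in banned:
        return 403              # previously trapped client: refuse outright
    if path == TRAP_PATH:
        banned.add(ip)          # a robot ignoring robots.txt just outed itself
        return 403
    return 200
```

The trap only catches robots that crawl links while disregarding robots.txt, which is precisely the population this thread is trying to throttle.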
Hi,
Wasn't there just a thread on throttling a few weeks ago?
I had a machine hit hard yesterday with a spider that ignored robots.txt.
Load average was over 90 on a dual CPU Enterprise 3500 running Solaris 2.6.
It's a mod_perl server, but has a few CGI scripts that it handles, and the
spider