RE: Throttling, once again

2002-04-19 Thread Drew Wymore

I ran into the very problem you're having.  I use mod_bandwidth; it's
actively maintained and lets you limit bandwidth usage by IP, by
directory, or in any number of other ways:
http://www.cohprog.com/mod_bandwidth.html
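
To give a rough idea, a per-directory limit looks something like the
sketch below.  I'm writing the directive names from memory of the docs at
the URL above, so treat this as a sketch rather than a drop-in config and
double-check it against the version you install; the rates, paths, and
directory are placeholders.

# Sketch only: verify directive names and semantics against the
# mod_bandwidth documentation.
BandWidthModule On
# scratch directory the module uses for its accounting (create it first)
BandWidthDataDir /tmp/apachebw

<Directory /usr/local/apache/htdocs/busy>
    # limit bandwidth for requests under this directory to about 16 KB/s
    BandWidth all 16384
    # but keep each connection above about 2 KB/s
    MinBandWidth all 2048
</Directory>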

Although it's not mod_perl related, I hope this helps.
Drew
-Original Message-
From: Jeremy Rusnak [mailto:[EMAIL PROTECTED]] 
Sent: Friday, April 19, 2002 12:06 PM
To: Christian Gilmore; [EMAIL PROTECTED]
Subject: RE: Throttling, once again

Hi,

I looked at the page you mentioned below.  It wasn't really clear from
the page what happens when requests get above the maximum allowed: are
the remaining requests queued, or are they simply given some kind of
error message?

There seem to be a number of different modules for this kind of thing,
but most of them are fairly old.  We could use a more current throttling
module that combines what the others have come up with.

For example, the snert.com mod_throttle is nice because it throttles by
IP, but in that mode it applies site-wide.  The mod_throttle_access you
mention seems nice because it can be set for an individual URI, but
that's a pain for sites like mine that have 50 or more intensive scripts
(throttling by directory would be nice).  And neither approach uses
cookies, as some of the others do, to make sure that legitimate proxies
aren't blocked.
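
To make the per-directory, per-IP idea concrete, here's a rough mod_perl
1.x access-handler sketch.  It isn't any of the modules above; the package
name and limits are made up, and the counters live in each child's own
memory, so a real module would use a shared store (a dbm file or shared
memory segment) instead.

# Sketch only: per-IP request-rate limiting, scoped to a directory of
# heavy scripts via a PerlAccessHandler.
package My::Throttle;

use strict;
use Apache::Constants qw(OK FORBIDDEN);

my %hits;                   # per-child: ip => [window start time, hit count]
my $WINDOW = 60;            # length of the counting window, in seconds
my $MAX    = 30;            # requests allowed per IP per window

sub handler {
    my $r  = shift;
    my $ip = $r->connection->remote_ip;

    my $now   = time;
    my $entry = $hits{$ip} ||= [$now, 0];

    # start a fresh window once the old one has expired
    @$entry = ($now, 0) if $now - $entry->[0] > $WINDOW;
    $entry->[1]++;

    if ($entry->[1] > $MAX) {
        $r->log_error("throttling $ip: $entry->[1] hits in ${WINDOW}s");
        return FORBIDDEN;   # abusive client gets a 403 before the script runs
    }
    return OK;
}

1;

# httpd.conf hookup, limited to the directory of intensive scripts:
#
#   <Location /cgi-bin>
#       PerlAccessHandler My::Throttle
#   </Location>

Returning FORBIDDEN keeps the heavy scripts from ever running for an
abusive client; a cookie check or a proxy whitelist could be layered into
the same handler so legitimate proxies don't get blocked.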

Jeremy

-Original Message-
From: Christian Gilmore [mailto:[EMAIL PROTECTED]]
Sent: Friday, April 19, 2002 8:31 AM
To: 'Bill Moseley'; [EMAIL PROTECTED]
Subject: RE: Throttling, once again


Bill,

If you're looking to throttle access to a particular URI (or set of
URIs), give mod_throttle_access a look. It is available via the Apache
Module Registry and at http://www.fremen.org/apache/mod_throttle_access.html .

Regards,
Christian

-
Christian Gilmore
Technology Leader
GeT WW Global Applications Development
IBM Software Group


-Original Message-
From: Bill Moseley [mailto:[EMAIL PROTECTED]]
Sent: Friday, April 19, 2002 12:56 AM
To: [EMAIL PROTECTED]
Subject: Throttling, once again


Hi,

Wasn't there just a thread on throttling a few weeks ago?

I had a machine hit hard yesterday by a spider that ignored robots.txt.

Load average was over 90 on a dual-CPU Enterprise 3500 running Solaris
2.6.  It's a mod_perl server, but it also handles a few CGI scripts, and
the spider was hitting one of the CGI scripts over and over.  They were
valid requests, but they were coming in faster than they were going out.

Under normal usage the CGI scripts are only accessed a few times a day,
so it's not much of a problem to have them served by mod_perl.  And
under normal peak loads RAM is not a problem.

The machine also has a bandwidth limitation (a packet shaper is used to
share the bandwidth), which combined with the spider didn't help things.
Luckily there's 4GB of RAM, so even at a load average of 90 it wasn't
really swapping much.  (Well, not when I caught it, anyway.)  This
spider was using the same IP for all requests.

Anyway, I remember Randal's Stonehenge::Throttle being discussed not too
long ago.  That seems to address this kind of problem.  Is there
anything else to look into?  Since the front end is mod_perl, that means
I can use a mod_perl throttling solution too, which is cool.

I realize there are some fundamental hardware issues to solve, but if I
can just keep the spiders from flooding the machine, then the machine
gets by okay.

Also, does anyone have suggestions for testing once throttling is in
place?  I don't want to start cutting off the good customers, but I do
want to get an idea of how it behaves under load.  ab to the rescue, I
suppose.
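
Something along these lines, maybe (the host, path, and numbers are made
up, just to show the shape of the test):

  ab -n 500 -c 30 http://www.example.com/cgi-bin/heavy_script

where -n is the total number of requests and -c is the concurrency.
Running it once from an unthrottled IP and once from an IP that should
trip the limit ought to show whether legitimate clients still get through.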

Thanks much,


--
Bill Moseley
mailto:[EMAIL PROTECTED]




[ModPerl causing segfaults]

2002-04-11 Thread Drew Wymore

I built Apache with mod_perl as a DSO module.  When I start Apache and
someone attempts to access my website, it works only for an instant and
then they receive "page cannot be displayed", but only with HTML
documents; PHP docs show up fine.  When I looked into my logs, this is
what I found:

[Wed Apr 10 18:05:34 2002] [notice] child pid 27804 exit
signal Segmentation fault (11)

In tracing back and asking a few questions on IRC, I commented out the
loading of the mod_perl module, and the behavior stopped.  Any good
pointers as to where I can find a solution to this problem?

Reference:

Apache 1.3.19
mod_perl 1.26
PHP 4.1.2

Thanks,
Drew

RE: [ModPerl causing segfaults]

2002-04-11 Thread Drew Wymore

Thank you, Doug, this appears to have cleaned up the situation very
nicely. :)

Drew

-Original Message-
From: Doug MacEachern [mailto:[EMAIL PROTECTED]] 
Sent: Thursday, April 11, 2002 10:26 AM
To: Drew Wymore
Cc: [EMAIL PROTECTED]
Subject: Re: [ModPerl causing segfaults]

sounds like the largefiles issue; you should have seen this warning
during the build:

Your Perl is uselargefiles enabled, but Apache is not, suggestions:
*) Rebuild mod_perl with Makefile.PL PERL_USELARGEFILES=0
*) Rebuild Apache with CFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64
*) Rebuild Perl with Configure -Uuselargefiles
*) Let mod_perl build Apache (USE_DSO=1 instead of USE_APXS=1)

easiest fix is the 1st option.
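
For a DSO build like yours, that first option amounts to something like
the following (the apxs path here is just an example, and you should
re-add whatever other build options you used originally):

  cd mod_perl-1.26
  perl Makefile.PL USE_APXS=1 WITH_APXS=/usr/local/apache/bin/apxs \
       EVERYTHING=1 PERL_USELARGEFILES=0
  make && make install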