cvs commit: modperl-2.0/lib/Apache Build.pm

2002-05-20 Thread dougm

dougm   02/05/20 15:27:30

  Modified:lib/Apache Build.pm
  Log:
  use Apache::TestConfig::which instead of duplicating
  
  Revision  ChangesPath
  1.85  +2 -7  modperl-2.0/lib/Apache/Build.pm
  
  Index: Build.pm
  ===
  RCS file: /home/cvs/modperl-2.0/lib/Apache/Build.pm,v
  retrieving revision 1.84
  retrieving revision 1.85
  diff -u -r1.84 -r1.85
  --- Build.pm  19 May 2002 19:46:08 -  1.84
  +++ Build.pm  20 May 2002 22:27:30 -  1.85
   @@ -13,6 +13,7 @@
    use ModPerl::Code ();
    use ModPerl::BuildOptions ();
    use Apache::TestTrace;
   +use Apache::TestConfig ();
    
    use constant REQUIRE_ITHREADS => grep { $^O eq $_ } qw(MSWin32);
    use constant HAS_ITHREADS =>
   @@ -96,7 +97,7 @@
    #these extra tries are for things built outside of mod_perl
    #e.g. libapreq
    push @trys,
   -which('apxs'),
   +Apache::TestConfig::which('apxs'),
    '/usr/local/apache/bin/apxs';
    }
    
   @@ -133,12 +134,6 @@
    my $cflags = __PACKAGE__->apxs('-q' => 'CFLAGS');
    $cflags =~ s/\"/\\\"/g;
    $cflags;
   -}
   -
   -sub which {
   -foreach (map { File::Spec->catfile($_, $_[0]) } File::Spec->path) {
   - return $_ if -x;
   -}
    }
   
   #--- Perl Config stuff ---
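For readers outside the mod_perl tree, the removed helper boils down to a scan of the directories in PATH. A standalone sketch (plain Perl, no Apache::Test dependency; the loop variable is made explicit here, unlike the terser original):

```perl
use File::Spec ();

# Search the directories in PATH for an executable file named $program,
# returning its full path, as the removed Apache::Build::which() did
# before the commit delegated this to Apache::TestConfig::which().
sub which {
    my $program = shift;
    for my $dir (File::Spec->path) {
        my $candidate = File::Spec->catfile($dir, $program);
        return $candidate if -x $candidate;
    }
    return undef;
}
```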
  
  
  



cvs commit: modperl-2.0/xs/Apache/RequestRec Apache__RequestRec.h

2002-05-20 Thread dougm

dougm   02/05/20 17:07:01

  Modified:xs/Apache/RequestRec Apache__RequestRec.h
  Log:
  include string terminator when calling ap_set_content_type
  
  Revision  ChangesPath
  1.2   +1 -1  modperl-2.0/xs/Apache/RequestRec/Apache__RequestRec.h
  
  Index: Apache__RequestRec.h
  ===
  RCS file: /home/cvs/modperl-2.0/xs/Apache/RequestRec/Apache__RequestRec.h,v
  retrieving revision 1.1
  retrieving revision 1.2
  diff -u -r1.1 -r1.2
  --- Apache__RequestRec.h  19 May 2002 20:09:27 -  1.1
  +++ Apache__RequestRec.h  21 May 2002 00:07:00 -  1.2
   @@ -7,7 +7,7 @@
    if (type) {
    STRLEN len;
    const char *val = SvPV(type, len);
   -ap_set_content_type(r, apr_pmemdup(r->pool, val, len));
   +ap_set_content_type(r, apr_pmemdup(r->pool, val, len+1));
   }
   
   return retval;
  
  
  



cvs commit: modperl-2.0/t/response/TestModperl subenv.pm

2002-05-20 Thread dougm

dougm   02/05/20 19:50:39

  Added:   t/response/TestModperl subenv.pm
  Log:
  tests for $r-subprocess_env
  
  Revision  ChangesPath
  1.1  modperl-2.0/t/response/TestModperl/subenv.pm
  
  Index: subenv.pm
  ===
  package TestModperl::subenv;
  
   use strict;
   use warnings FATAL => 'all';
   
   use Apache::RequestRec ();
   use APR::Table ();
   
   use Apache::Test;
   
   use Apache::Const -compile => 'OK';
   
   sub handler {
       my $r = shift;
   
       plan $r, tests => 16;
   
       my $env = $r->subprocess_env;
   
       ok $env;
   
       ok_false($r, 'REMOTE_ADDR');
   
       $r->subprocess_env; #void context populates
   
       $env = $r->subprocess_env; #table may have been overlayed
   
       ok_true($r, 'REMOTE_ADDR');
   
       $env->set(FOO => 1);
   
       ok_true($r, 'FOO');
   
       $r->subprocess_env(FOO => undef);
   
       ok_false($r, 'FOO');
   
       $r->subprocess_env(FOO => 1);
   
       ok_true($r, 'FOO');
   
       Apache::OK;
   }
   
   sub ok_true {
       my ($r, $key) = @_;
   
       my $env = $r->subprocess_env;
   
       ok $env->get($key);
   
       ok $env->{$key};
   
       ok $r->subprocess_env($key);
   }
   
   sub ok_false {
       my ($r, $key) = @_;
   
       my $env = $r->subprocess_env;
   
       ok ! $env->get($key);
   
       ok ! $env->{$key};
   
       ok ! $r->subprocess_env($key);
   }
  
  1;
  __END__
  PerlOptions -SetupEnv
  
  
  
  



Apache::DBI debugging (was: Re: Modifying @INC via startup.pl)

2002-05-20 Thread Per Einar Ellefsen


At 23:36 19.05.2002, Gregory Matthews wrote:
# Initialize the database connections for each child
Apache::DBI->connect_on_init
    ("DBI:mysql:database=test;host=localhost", "user", "password",
     { PrintError => 1,   # warn() on errors
       RaiseError => 0,   # don't die on error
       AutoCommit => 1,   # commit executes immediately
     });

I am also using the $Apache::DBI::DEBUG = 2; flag to ensure it is working 
properly.

I am NOT seeing the entries in the error_log, either when Apache::DBI 
initializes a connection or when it returns one from its cache.

Shouldn't I be able to see a reference to the connection in my error_log 
file?

I checked both my virtual host error_log file and the server error_log 
file. Nothing in either.

When/Where are you setting this flag? I don't have much experience with 
Apache::DBI, so I won't be able to help you much.

You could also try
 DBI->trace(1); # or any other level mentioned in the DBI docs.
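For reference, the debug flag is normally set in startup.pl before any connect_on_init() calls, so it is in effect when the children make their first connection. A minimal sketch (paths and credentials hypothetical):

```perl
# startup.pl -- loaded via "PerlRequire /path/to/startup.pl" in httpd.conf
use Apache::DBI ();

$Apache::DBI::DEBUG = 2;    # log connect/cache activity to the error_log
DBI->trace(1);              # optional: low-level DBI tracing as well

Apache::DBI->connect_on_init(
    "DBI:mysql:database=test;host=localhost", "user", "password",
    { PrintError => 1, RaiseError => 0, AutoCommit => 1 },
);

1;
```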


-- 
Per Einar Ellefsen
[EMAIL PROTECTED]





Re: Memory Leaks

2002-05-20 Thread F . Xavier Noria

On Sun, 19 May 2002 23:34:24 -0400
Perrin Harkins [EMAIL PROTECTED] wrote:

: Leaks are caused by circular references, the string form of eval (at
: least it used to leak a little), nested closures (sometimes created
: accidentally with the Error module)

I am using the Error module in my current project, what kind of
constructs should one avoid? Is this safe?

my $um = UserManager->new;
# ...
try {
    $um->write_user($user);
    $um->dbh->commit;
} catch Exception::DB with {
    my $e = shift;
    debug "Exception: $e";
    $um->dbh->rollback;
};

-- fxn




Re: Scripts and passwd

2002-05-20 Thread Thomas Klausner

Hi!

On Sun, May 19, 2002 at 10:34:17AM +0200, Per Einar Ellefsen wrote:
 At 10:22 19.05.2002, [EMAIL PROTECTED] wrote:
   I have written scripts to add a user to the passwd and shadow files as well
 as sendmail user files. When I run this script from the command line for
 testing all runs and completes fine. But when I run the script from apache
 via the web interface I designed it for, I get file permission errors on the
 additions to passwd and the rest of the scripts. How can I get the script to
 access those files?
 You're doing something pretty risky there. the passwd/shadow files are only 
 writable by root. So I suppose that when running them from the command line 

You could let the CGI (or mod_perl) script write the new user info to a normal
file writable by Apache, and then run a root-owned script (via a cron job)
that reads this file and modifies the passwd/shadow files.

Depending on the frequency of the updates, you might want to add file
locking, and depending on the security of your whole system, you could add an
MD5 checksum to each entry, so that nobody with access to your filesystem
can add entries to the Apache-writable file.
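That queue-plus-checksum idea fits in a few lines of Perl. A sketch (the field layout and shared secret are invented for illustration; for real use an HMAC from Digest::HMAC would be preferable to a bare secret-prefix MD5):

```perl
use Digest::MD5 qw(md5_hex);

# A secret known only to the web app and the root-owned cron job; an
# attacker who can append to the queue file cannot forge the checksum
# without it.
my $secret = 'swap-me-for-a-real-secret';

# Build one queue line: the user fields plus an MD5 over secret + fields.
sub queue_line {
    my ($login, $shell) = @_;
    my $payload = join ':', $login, $shell;
    return join(':', $payload, md5_hex($secret . $payload)) . "\n";
}

# The root-owned cron job re-derives the checksum and skips tampered entries.
sub verify_line {
    my $line = shift;
    chomp $line;
    my ($login, $shell, $sum) = split /:/, $line;
    my $payload = join ':', $login, $shell;
    return md5_hex($secret . $payload) eq $sum;
}
```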


-- 
 D_OMM  +  http://domm.zsi.at -+
 O_xyderkes |   neu:  Arbeitsplatz   |   
 M_echanen  | http://domm.zsi.at/d/d162.html |
 M_asteuei  ++





Re: Memory Leaks

2002-05-20 Thread Matt Sergeant

On Mon, 20 May 2002, F. Xavier Noria wrote:

 On Sun, 19 May 2002 23:34:24 -0400
 Perrin Harkins [EMAIL PROTECTED] wrote:

 : Leaks are caused by circular references, the string form of eval (at
 : least it used to leak a little), nested closures (sometimes created
 : accidentally with the Error module)

 I am using the Error module in my current project, what kind of
 constructs should one avoid? Is this safe?

 my $um = UserManager->new;
 # ...
 try {
     $um->write_user($user);
     $um->dbh->commit;
 } catch Exception::DB with {
     my $e = shift;
     debug "Exception: $e";
     $um->dbh->rollback;
 };

No. $um is caught in a closure, which could potentially leak.

-- 
<!-- Matt -->
<:->Get a smart net</:->




Re: Memory Leaks

2002-05-20 Thread F . Xavier Noria

On Mon, 20 May 2002 10:15:02 +0100 (BST)
Matt Sergeant [EMAIL PROTECTED] wrote:

:  my $um = UserManager->new;
:  # ...
:  try {
:      $um->write_user($user);
:      $um->dbh->commit;
:  } catch Exception::DB with {
:      my $e = shift;
:      debug "Exception: $e";
:      $um->dbh->rollback;
:  };
: 
: No. $um is caught in a closure, which could potentially leak.

Wow, thank you, I have that pattern repeated in the code many times.

That is the way I would write that try/catch in Java, where you need to
have $um in the scope of both the try and catch blocks. What is the right
way to write that in Perl/Error.pm?

-- fxn




Re: Setting require in Authentication handler?

2002-05-20 Thread Geoffrey Young



Todd Chapman wrote:

 Can dir_config be used to set 'require' in an authentication handler?


no.  dir_config() provides access to a mod_perl specific table of variables, not 
generic Apache configuration directives.

there is no API for setting the Require directive - it needs to be in your httpd.conf.


 I would then return DECLINED so that Apache's Basic auth handler would do
 the heavy lifting of checking the password.

if you're looking to do conditional authentication, what you really need to do is a bit 
backward - turn on all authentication hooks using the Require directive, then use your 
handler to return OK when you don't want Apache to check the password.  See recipe 13.5 
in the cookbook for more information.

the Satisfy any Apache directive may be able to help as well if you're using 
host-based criteria to determine whether you want to require a login.

HTH

--Geoff




Re: building mod_perl_1.99_01,

2002-05-20 Thread Per Einar Ellefsen

At 13:47 20.05.2002, H Jayakumar wrote:
Hello anyone,

I am building mod_perl for NetWare.

The new mod_perl ( 1.99_01 ) has extensions, under the wrapxs and the
xs directories.

I have built mod_perl.so in the src/modules/perl directory.

What should I do next to get the complete mod_perl?

You should follow the instructions at 
http://perl.apache.org/release/docs/2.0/user/install/install.html

It would be interesting to see mod_perl work on NetWare. If there are any 
platform-specific steps you come across, please share them!


-- 
Per Einar Ellefsen
[EMAIL PROTECTED]





Re: Memory Leaks

2002-05-20 Thread Matt Sergeant

On Mon, 20 May 2002, F. Xavier Noria wrote:

 On Mon, 20 May 2002 10:15:02 +0100 (BST)
 Matt Sergeant [EMAIL PROTECTED] wrote:

 :  my $um = UserManager-new;
 :  # ...
 :  try {
 :  $um-write_user($user);
 :$um-dbh-commit;
 :  } catch Exception::DB with {
 :  my $e = shift;
 :  debug Exception: $e;
 :  $um-dbh-rollback;
 :  };
 :
 : No. $um is caught in a closure, which could potentially leak.

 Wow, thank you, I have that pattern repeated in the code many times.

 That is the way I would write that try/catch in Java, where you need to
 have $um in the scope of the try and the catch blocks, what is the right
 way to write that in Perl/Error.pm?

I gave up on Error.pm's try/catch syntax a long time ago - I think its
hidden closure system, combined with perl bugs, is just too broken for
production use. Instead I use good old eval:

my $um = UserManager->new;
...
eval {
    $um->write_user($user);
    $um->dbh->commit;
};
if ($@ && $@->isa('Exception::DB')) {
    debug "Exception: $@";
    $um->dbh->rollback;
}

(note: if you expect all exceptions to be references like this, you had
better have a $SIG{__DIE__} handler installed to bless non-blessed
exceptions before re-throwing them - ask me if you need an example of
that)
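Such a $SIG{__DIE__} handler can be as small as the sketch below (the Exception::Generic class is invented here; in practice Error.pm or Exception::Class would supply the exception base class, and you would likely use `local $SIG{__DIE__}` to limit its scope):

```perl
package Exception::Generic;
sub new     { my ($class, $msg) = @_; return bless { message => $msg }, $class }
sub message { return $_[0]->{message} }

package main;

# Wrap plain-string deaths in an object, so code that does
# $@->isa(...) never calls a method on an unblessed scalar.
# Perl disables the __DIE__ hook while the handler itself runs,
# so the inner die() calls do not recurse.
$SIG{__DIE__} = sub {
    my $err = shift;
    die $err if ref $err;                 # already an object: rethrow as-is
    die Exception::Generic->new($err);    # bless plain strings
};
```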

-- 
<!-- Matt -->
<:->Get a smart net</:->




Re: Memory Leaks

2002-05-20 Thread Mark Fowler

On Mon, 20 May 2002, Matt Sergeant wrote:

 if ($@ && $@->isa('Exception::DB')) {
     debug "Exception: $@";
     $um->dbh->rollback;
 }
 
 (note: if you expect all exceptions to be references like this, you had
 better have a $SIG{__DIE__} handler installed to bless non-blessed
 exceptions before re-throwing them

Can't you just use UNIVERSAL's ISA method directly?  

   if (UNIVERSAL::isa($@,'Exception::DB')) {

This of course might fail if you got the string Exception::DB or 
likewise back as an error message.

Alternatively, check if it's blessed:

   use Scalar::Util qw(blessed);

   if (blessed($@) && $@->isa('Exception::DB')) {

Later.

Mark.

-- 
s''  Mark Fowler London.pm   Bath.pm
 http://www.twoshortplanks.com/  [EMAIL PROTECTED]
';use Term'Cap;$t=Tgetent Term'Cap{};print$t-Tputs(cl);for$w(split/  +/
){for(0..30){$|=print$t-Tgoto(cm,$_,$y). $w;select$k,$k,$k,.03}$y+=2}




Re: Reloading Library Files

2002-05-20 Thread Ted Prah



Stas Bekman wrote:

 Ted Prah wrote:
  Thanks Drew, I tried that, but it did not work.

 What happends if you add:

 PerlWarn On

 in httpd.conf

 or

 start the script with perl -w?

 any warnings?


I had PerlWarn On and added -w anyway, but there were no errors.


 do you test only this script alone? What happens if you add the package
 declaration and then call it using the full name? e.g.:


Yes, this is the only script (and corresponding library file) that I use
for this test.  When I use the package declaration and make the call
using the full name, the reloads work fine.

The reason I am using the library file is to follow your recommendation
in the mod_perl guide where a script is split into two files to avoid
the nested subroutine problem.  Out of curiosity I tested the sample
code (counter.pl and mylib.pl) and they too did not reload properly
when mylib.pl was modified.  Does the reloading of a modification
of mylib.pl work for you?  I would prefer to use the library file
approach as opposed to the package approach as a lot of our code
uses libraries that are not in packages, but will move to packages if
that is a necessity.

Thank you, I really appreciate your help.


 z.pl - test script which calls entry_point in z_lib.pl
 ---
 #!/usr/local/bin/perl -w
 use strict;

 require '/home/cgi-bin/z_lib.pl';

 My::Z::entry_point();
 ---

 z_lib.pl
 --
 package My::Z;
 use strict;

 sub entry_point {

     my $r = Apache->request;
     $r->content_type('text/html');
     $r->send_http_header;

     print "HERE 1";
     #print "HERE 2";

 }

  Ted
 
  Drew Taylor wrote:
 
 
 Have you tried moving the PerlInitHandler & PerlSetVar up and out of the
 <Location> directive, making it global for the server? I'm not sure that
 would fix it, but it's worth a try.
 
 Drew
 
 At 02:37 PM 5/17/02 -0400, Ted Prah wrote:
 
 I have tried Apache::Reload as well, but I get the same results.
 
 Ted
 
 Drew Taylor wrote:
 
 
 Take a look at Apache::Reload or Apache::StatINC. Reload is more flexible,
 but StatINC has been around a little longer. Both have worked well for me.
 But be sure that you don't use these modules on a production server. :-)
 
 httpd.conf
 ==
 PerlInitHandler Apache::Reload
 
 Drew
 
 At 01:38 PM 5/17/02 -0400, Ted Prah wrote:
 
 Hi,
 
 I am new to mod_perl and am having problems seeing the
 changes made to library files.  Must I restart the server every
 time I change a library file in order to see my changes?  My
 test code and environment is below.
 
 ==
 Drew Taylor  |  Freelance web development using
 http://www.drewtaylor.com/   |  perl/mod_perl/MySQL/postgresql/DBI
 mailto:[EMAIL PROTECTED]   |  Email jobs at drewtaylor.com
 --
 Speakeasy.net: A DSL provider with a clue. Sign up today.
 http://www.speakeasy.net/refer/29655
 ==
 
 
  --
  Ted Prah
  NetCasters, Inc.
  Phone:  978.887.2100 x44
  Fax:  978.887.6750
  [EMAIL PROTECTED]
 

 --

 __
 Stas BekmanJAm_pH -- Just Another mod_perl Hacker
 http://stason.org/ mod_perl Guide --- http://perl.apache.org
 mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
 http://modperlbook.org http://apache.org   http://ticketmaster.com

--
Ted Prah
NetCasters, Inc.
Phone:  978.887.2100 x44
Fax:  978.887.6750
[EMAIL PROTECTED]




Re: Memory Leaks

2002-05-20 Thread Matt Sergeant

On Mon, 20 May 2002, Mark Fowler wrote:

 On Mon, 20 May 2002, Matt Sergeant wrote:

  if ($@ && $@->isa('Exception::DB')) {
      debug "Exception: $@";
      $um->dbh->rollback;
  }
 
  (note: if you expect all exceptions to be references like this, you had
  better have a $SIG{__DIE__} handler installed to bless non-blessed
  exceptions before re-throwing them

 Can't you just use UNIVERSAL's ISA method directly?

    if (UNIVERSAL::isa($@, 'Exception::DB')) {

 This of course might fail if you got the string Exception::DB or
 likewise back as an error message.

 Alternativly, check if it's blessed

use Scalar::Util qw(blessed);

    if (blessed($@) && $@->isa('Exception::DB')) {

Yeah, I know all the tricks. Ultimately it's a matter of how ugly you want
your code to get, and how many external modules you want to rely on (I
believe Scalar::Util is going to be part of 5.8 though).

-- 
<!-- Matt -->
<:->Get a smart net</:->




Re: Setting require in Authentication handler?

2002-05-20 Thread Geoffrey Young



Todd Chapman wrote:

 I need to decide who has access based on the URI. I guess this means I
 can't use Apache's Basic auth module, since I can't dynamically set
 require. 


as I was saying, go ahead and set the Require directive on the <Location> (or whatever) 
that you want to protect.  if a URI comes in that you want to allow _without_ checking 
the password, just call

$r->set_handlers(PerlAuthenHandler => [\&OK]);

which will essentially short-circuit Apache's default authentication mechanism before 
mod_auth gets the chance to step in.  you could do this from a PerlAccessHandler or (I 
suppose) a PerlTransHandler.  you could probably even just return OK from a 
PerlAuthenHandler if $r->uri =~ m/some_ok_uri/ and skip the previous code (though if you 
use something other than Require valid-user you'll have to skip the Authorization phase 
as well using a similar measure).

basically, mod_perl gives you a hook into authentication that lets you do whatever you 
want - returning OK says that you have validated the user using your own criteria, and 
mod_auth need not run.  returning DECLINED (as you mentioned earlier) allows mod_auth 
to run.
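A minimal PerlAccessHandler along those lines might look like the sketch below (mod_perl 1 API; the module name and URI pattern are invented, and the OK constant is inlined here for the sake of a self-contained example - under mod_perl it would come from Apache::Constants):

```perl
package My::SkipAuth;

use strict;

# Under mod_perl 1 this would be: use Apache::Constants qw(OK);
use constant OK => 0;

# Let /public/* through without a password check by replacing the
# authentication handler list with one that always returns OK;
# everything else falls through to mod_auth via the Require directive.
sub handler {
    my $r = shift;

    if ($r->uri =~ m!^/public/!) {
        $r->set_handlers(PerlAuthenHandler => [ sub { OK } ]);
    }

    return OK;
}

1;
```

Configured with something like `PerlAccessHandler My::SkipAuth` inside the protected container.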

 Does the cookbook have a code sample of checking the password for
 basic authentication?


well, not via .htpasswd files, no.  in general, it doesn't make much sense to use 
mod_perl to duplicate the same things that Apache already does for you, since the 
Apache code is faster, has had more eyeballs looking at it for longer, etc.  in that 
sense you wouldn't want to write your own routine to just check a flat file.  where 
mod_perl really shines wrt authentication is with all the other things Perl does well, 
such as using DBI to authenticate against a database, or working with other schemes 
like SMB or Radius - see the 25+ Apache::Auth* modules on CPAN for just about anything 
you could think of.

however, we do describe how to use the mod_perl API to interact with Apache the same 
way mod_auth does, using $r->get_basic_auth_pw() and $r->note_basic_auth_failure(), in 
a few different ways.  you will also find those two methods in the eagle book if you 
have it.

make sense?

--Geoff






New mod_perl website [Was Re: Modifying @INC via startup.pl]

2002-05-20 Thread Drew Taylor

This is a little OT, but I really love the new look of the website you 
mention below. Major kudos to all those who helped put together the new 
look-n-feel  content.

Drew

At 11:53 PM 5/19/2002 +0200, you wrote:
Thank you very much Gregory, I have patched the online docs.

By the way, the release-ready (almost) site is now at 
http://perl.apache.org/release/

==
Drew Taylor  |  Freelance web development using
http://www.drewtaylor.com/   |  perl/mod_perl/MySQL/postgresql/DBI
mailto:[EMAIL PROTECTED]   |  Email jobs at drewtaylor.com
--
Speakeasy.net: A DSL provider with a clue. Sign up today.
http://www.speakeasy.net/refer/29655
==




Re: Setting require in Authentication handler?

2002-05-20 Thread Geoffrey Young



Todd Chapman wrote:

 That makes sense. I can't use mod_auth because I can't set Require. 


well, if you're saying that you don't have the ability to set the Require directive at 
all (as in you don't have access to edit httpd.conf), then you can't run any 
authentication handler - mod_auth, mod_perl, or otherwise.  Apache core requires the 
Require directive to be set to something before it will even try to run the 
authen/authz phases of the request.

so, you may be out of luck and need to resort to the CGI tricks of yore where 
everything is clumped in the content-generation phase (and of which I'm not that 
familiar).

 I'm
 using Basic authentication and text based password files. Unfortunately, I
 can't find an Apache::Auth* module that handles basic authentication
 against text files. Did I miss it somewhere?


I'm not sure, but it may not exist for the reason I stated earlier about mod_perl not 
duplicating default Apache behavior.  IIRC, there is one that authenticates against 
/etc/passwd, so maybe you can use that as an example of flat file based processing.

in general, though, the steps are pretty much the same no matter which authentication 
method you choose.  see

   http://www.modperlcookbook.org/code/ch13/Cookbook/Authenticate.pm

for an example - all you need to do is replace the authenticate_user() subroutine with 
calls that validate the user based on your own criteria.
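The flat-file check itself is small: htpasswd-style files store `user:hash` pairs, so a hypothetical authenticate_user() replacement could look like the sketch below (file name and format are assumptions; only traditional crypt() hashes are handled, not the MD5 variant):

```perl
use strict;

# Return true if $password matches the crypt() hash stored for $user
# in an htpasswd-style "user:hash" flat file.
sub check_flat_file {
    my ($file, $user, $password) = @_;

    open my $fh, '<', $file or return 0;
    while (my $line = <$fh>) {
        chomp $line;
        my ($name, $hash) = split /:/, $line, 2;
        next unless defined $hash && $name eq $user;
        # crypt() takes the stored hash as its own salt
        return crypt($password, $hash) eq $hash;
    }
    return 0;
}
```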

HTH

--Geoff







Re: Reloading Library Files

2002-05-20 Thread Stas Bekman

Ted Prah wrote:

do you test only this script alone? What happens if you add the package
declaration and then call it using the full name? e.g.:

 
 
 Yes, this is the only script (and corresponding library file) that I use
 for this test.  When I use the package declaration and make the call
 using the full name, the reloads work fine.
 
 The reason I am using the library file is to follow your recommendation
 in the mod_perl guide where a script is split into two files to avoid
 the nested subroutine problem.  Out of curiosity I tested the sample
 code (counter.pl and mylib.pl) and they too did not reload properly
 when mylib.pl was modified.  Does the reloading of a modification
 of mylib.pl work for you?  I would prefer to use the library file
 approach as opposed to the package approach as a lot of our code
 uses libraries that are not in packages, but will move to packages if
 that is a necessity.

Well, that explains everything.

When you require() a library that doesn't declare a package for the 
first time from the registry script, all its global symbols (subs, 
vars, etc.) get imported into the namespace of the caller, i.e. the 
registry script (Apache::ROOT::...).

When Apache::Reload require()s that library that doesn't declare a 
package, all the global symbols end up in the Apache::Reload namespace! 
So the library does get reloaded and you see the compile time errors if 
there are any, but the symbols don't get imported into the right 
namespace. So the old code is running. Moreover, this leads to a 
pollution of the Apache::Reload namespace, which may cause problems 
if you happen to overwrite some of its symbols (subs, vars, etc.).

I suppose if you want to use the cheap workaround, you have to 
s/require/do/. Remember that the guide suggests the lib.pl trick as a 
workaround, not a solution you go with during normal development.
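The practical difference behind that s/require/do/ advice: require() records the file in %INC and never re-reads it, while do() re-compiles the file on every call. A self-contained demonstration (temp file generated at run time; the counter variable is invented for the test):

```perl
use strict;
use File::Temp qw(tempfile);

# A tiny "library" that bumps a counter each time it is compiled.
my ($fh, $lib) = tempfile(SUFFIX => '.pl', UNLINK => 1);
print {$fh} '$main::compiled++; 1;';
close $fh;

our $compiled = 0;

require $lib;   # first require: file is read, $compiled becomes 1
require $lib;   # cached via %INC: file is NOT re-read, still 1
do $lib;        # do() always re-reads and re-compiles: $compiled becomes 2
```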

Was this explanation clear enough? We need to add it to the 
Apache::Reload manpage to avoid this kind of question in the future.

Funny though, that it's the first time this problem has been reported. 
Which shows that most people don't use workarounds when they do 
real development :)

__
Stas BekmanJAm_pH -- Just Another mod_perl Hacker
http://stason.org/ mod_perl Guide --- http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com





Re: New mod_perl website [Was Re: Modifying @INC via startup.pl]

2002-05-20 Thread Stas Bekman

Drew Taylor wrote:
 This is a little OT, but I really love the new look of the website you 
 mention below. Major kudos to all those who helped put together the new 
 look-n-feel  content.

Thanks Drew, but please hold off on any comments, since we are still 
tuning the design to work better in various browsers. Once we are 
satisfied with it, we will make an announcement and then ask you to 
check if you have any problems with your favorite browsers.

Meanwhile if you are willing to help or want to comment on things, 
please join the [EMAIL PROTECTED] list.  We do need 
your help.

BTW, the final site will be at http://perl.apache.org/.

The http://perl.apache.org/release/ URL is only temporary.


__
Stas BekmanJAm_pH -- Just Another mod_perl Hacker
http://stason.org/ mod_perl Guide --- http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com




Re: Reloading Library Files

2002-05-20 Thread Ted Prah



Stas Bekman wrote:

 Ted Prah wrote:

 do you test only this script alone? What happens if you add the package
 declaration and then call it using the full name? e.g.:
 
 
 
  Yes, this is the only script (and corresponding library file) that I use
  for this test.  When I use the package declaration and make the call
  using the full name, the reloads work fine.
 
  The reason I am using the library file is to follow your recommendation
  in the mod_perl guide where a script is split into two files to avoid
  the nested subroutine problem.  Out of curiosity I tested the sample
  code (counter.pl and mylib.pl) and they too did not reload properly
  when mylib.pl was modified.  Does the reloading of a modification
  of mylib.pl work for you?  I would prefer to use the library file
  approach as opposed to the package approach as a lot of our code
  uses libraries that are not in packages, but will move to packages if
  that is a necessity.

 Well, that explains everything.

  When you require() a library that doesn't declare a package for
  the first time from the registry script, all its global symbols (subs,
  vars, etc.) get imported into the namespace of the caller, i.e. the
  registry script (Apache::ROOT::...).

 When Apache::Reload require()s that library that doesn't declare a
 package, all the global symbols end up in the Apache::Reload namespace!
 So the library does get reloaded and you see the compile time errors if
 there are any, but the symbols don't get imported to the right
 namespace. So the old code is running. Moreover this leads to a
 pollution of the Apache::Reload namespace, which may cause to problems
 if you happen to overwrite some of its symbols (subs, vars, etc).


That explains the library files not reloading - Thanks!


 I suppose if you want to use the cheap workaround, you have to
 s/require/do/. Remember that the guide suggests the lib.pl trick as a
 workaround, not a solution you go with during the normal development.

I didn't realize that using the library wrapper was a cheap workaround
for the nested subroutine problem.  The effects of the nested subroutine
problem are my biggest concern with writing code for mod_perl.  What
I liked about the lib.pl trick is that it completely eliminates this problem.
I won't lose sleep wondering if somehow the code went into production
with a nested subroutine.  I realize that there are other ways to eliminate
the nested subroutine problem, but what are the preferred techniques
used by mod_perl developers?  Check the error_log file for problems
and fix as needed?



 Was this explanation clear enough? We need to add it to the
 Apache::Reload manpage to avoid this kind of questions in the future.


Makes sense.


 Funny though, that it's the first time this problem has been reported.
 Which shows that most of the people don't use workarounds when they do
 real developments :)



 __
 Stas BekmanJAm_pH -- Just Another mod_perl Hacker
 http://stason.org/ mod_perl Guide --- http://perl.apache.org
 mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
 http://modperlbook.org http://apache.org   http://ticketmaster.com

--
Ted Prah
NetCasters, Inc.
[EMAIL PROTECTED]





Re: Reloading Library Files

2002-05-20 Thread Stas Bekman

Ted Prah wrote:

 That explains the library files not reloading - Thanks!

Great!

I suppose if you want to use the cheap workaround, you have to
s/require/do/. Remember that the guide suggests the lib.pl trick as a
workaround, not a solution you go with during the normal development.
 
 
 I didn't realize that using the library wrapper was a cheap workaround
 to the nested subroutine problem.  The effects of the nested subroutine
 problem is my biggest concern with writing code for mod_perl.  What
 I liked about the lib.pl trick is that it completely eliminates this problem.
 I won't lose sleep wondering if somehow the code went into production
 with a nested subroutine.  I realize that there are other ways to eliminate
 the nested subroutine problem, but what are the preferred techniques
 used by mod_perl developers?  Check the error_log file for problems
 and fix as needed?

The majority are discussed here:
http://perl.apache.org/release/docs/general/perl_reference.html#Remedies_for_Inner_Subroutines

Declaring the package in your libs is a good idea. If you don't, 
most likely you are going to encounter this problem:
http://perl.apache.org/release/docs/1.0/guide/porting.html#Name_collisions_with_Modules_and_libs
The solutions are discussed there as well.

  Check the error_log file for problems and fix as needed?

Ted, this is not an option; it's a must. If you don't constantly 
monitor the error_log file while developing, you are looking 
for trouble.
http://perl.apache.org/release/docs/1.0/guide/debug.html#Warning_and_Errors_Explained


__
Stas BekmanJAm_pH -- Just Another mod_perl Hacker
http://stason.org/ mod_perl Guide --- http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com




undef Upload object

2002-05-20 Thread Mike Melillo


I am trying to have a user upload an image, but I am getting an undef
$apr->upload object.

Here is the code:


<form method="post" enctype="multipart/form-data" action="/join">
<input type="hidden" name="action" value="upload">
<br>
your picture (size limit: 30k)<br>
<input type="file" name="userpic" size="20">
<p>
<input type="submit" name="submit" value="ADD PROFILE">
</form>




sub upload {

    my ($r) = shift;
    my $apr = Apache::Request->new($r);
    my $status = $apr->parse;
    my $upload = $apr->upload;
    print STDERR Dumper($upload);
    my $size = $upload->size;
    if ($size > 3) {
        # file is too big
        warn("File is too big\n");
        return SERVER_ERROR;
    }

    my $type = $upload->type;
    if ($type ne 'image/jpeg') {
        # not a jpeg
        warn("Not a jpeg\n");
        return SERVER_ERROR;
    }
    my $fh = $upload->fh;
    my $filename = $upload->filename;

    $apr->send_http_header('text/html');
    $apr->print("hello file\n");
    return OK;

} # end of upload


The upload subroutine is still in the debugging stages, hence the return
SERVER_ERROR and warn()ings.  The Data::Dumper prints $VAR1 = undef;

Ideas?




Re: undef Upload object

2002-05-20 Thread Geoffrey Young



Mike Melillo wrote:

 I am trying to have a user upload an image and I am getting an undef
 $apr-upload object.
 


[snip]



 sub upload {
 
 my ($r) = shift;
 my $apr = Apache::Request-new($r);
 my $status = $apr-parse;


you might want to check the status of that parse here first

   return $status unless $status == OK;

and look into

   $apr->notes('error-notes');

if it fails.


 my $upload = $apr->upload;
 print STDERR Dumper($upload);
 my $size = $upload->size;
 if ($size > 3) {


you can handle that condition with the POST_MAX parameter in new().


other than that, nothing jumps out at me.

we have a working example that may be able to help you some:

   http://www.modperlcookbook.org/code/ch03/Cookbook/PrintUploads.pm

HTH

--Geoff




Re: undef Upload object

2002-05-20 Thread Stas Bekman

Geoffrey Young wrote:
 
 
 Mike Melillo wrote:
 
 I am trying to have a user upload an image and I am getting an undef
 $apr-upload object.

 we have a working example that may be able to help you some:
 
   http://www.modperlcookbook.org/code/ch03/Cookbook/PrintUploads.pm

also see the new addition contributed by Rich Bowen:
http://perl.apache.org/release/docs/1.0/guide/snippets.html#File_Upload_with_Apache__Request

__
Stas BekmanJAm_pH -- Just Another mod_perl Hacker
http://stason.org/ mod_perl Guide --- http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com




Re: Setting require in Authentication handler?

2002-05-20 Thread Peter Bi

A remark: in many cases, authentication against the password file can be
replaced by verifying a valid FTP/Telnet login to localhost, not only
because the password (shadow) file is usually not available to the Apache
account, but also for security. In a ticketing system, the FTP/Telnet
authentication runs only at the first login, and follow-up accesses
can go through without re-FTP and so are pretty fast. Check this:
http://modperl.home.att.net
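As a rough illustration of that idea (the module choice, host, and helper name are assumptions, not from the post), a credential check could be done with Net::FTP:

```perl
# Hypothetical helper: treat a successful FTP login to localhost as
# proof that the username/password pair is valid.
use Net::FTP ();

sub check_credentials {
    my ($user, $pass) = @_;
    my $ftp = Net::FTP->new('localhost', Timeout => 10) or return 0;
    my $ok  = $ftp->login($user, $pass);
    $ftp->quit;
    return $ok ? 1 : 0;
}
```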


Peter Bi

- Original Message -
From: Geoffrey Young [EMAIL PROTECTED]
To: Todd Chapman [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Sent: Monday, May 20, 2002 6:50 AM
Subject: Re: Setting require in Authentication handler?




 Todd Chapman wrote:

  That makes sense. I can't use mod_auth because I can't set Require.


 well, if you're saying that you don't have the ability to set the Require
directive at all
 (as in you don't have access to edit httpd.conf), then you can't run any
authentication
 handler - mod_auth, mod_perl, or otherwise.  Apache core requires the
Require directive to
 be set to something before it will even try to run the authen/authz phases
of the request.

 so, you may be out of luck and need to resort to the CGI tricks of yore
where everything
 is clumped in the content-generation phase (and of which I'm not that
familiar).

  I'm
  using Basic authentication and text based password files. Unfortunately,
I
  can't find an Apache::Auth* module that handles basic authentication
  against text files. Did I miss it somewhere?


 I'm not sure, but it may not exist for the reason I stated earlier about
mod_perl not
 duplicating default Apache behavior.  IIRC, there is one that
authenticates against
 /etc/passwd, so maybe you can use that as an example of flat file based
processing.

 in general, though, the steps are pretty much the same no matter which
authentication
 method you choose.  see

http://www.modperlcookbook.org/code/ch13/Cookbook/Authenticate.pm

 for an example - all you need to do is replace the authenticate_user()
subroutine with
 calls that validate the user based on your own criteria.
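The general shape of such a handler looks like the following sketch for mod_perl 1.x; the verify_user() routine is a hypothetical placeholder where the flat-file lookup would go:

```perl
# Minimal Basic-auth handler skeleton; replace verify_user() with a
# check against your own text-based password file.
package My::Authenticate;
use Apache::Constants qw(OK AUTH_REQUIRED);

sub handler {
    my $r = shift;

    # Returns OK only when the browser actually sent Basic credentials.
    my ($status, $password) = $r->get_basic_auth_pw;
    return $status unless $status == OK;

    my $user = $r->connection->user;
    return OK if verify_user($user, $password);   # hypothetical lookup

    # Wrong credentials: re-challenge the browser.
    $r->note_basic_auth_failure;
    return AUTH_REQUIRED;
}
1;
```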

 HTH

 --Geoff









Re: Memory Leaks

2002-05-20 Thread Gregory Matthews

I too thought of setting a cron job to restart the server once per day in 
order to keep the memory fresh.

In a production environment, are there any downsides to doing this, i.e., 
server inaccessibility, etc..?

Thanks.

Gregory

At 08:25 AM 5/20/2002 -0400, you wrote:

It is more an issue of it being worth tracking down a small memory
leak vs a large memory leak.  Our software still has some very small
leaks, on the order of 10k every hour...  it would probably take us a
month to track down and solve all these problems.  I find it easier to
restart the web servers daily.

We did have some enormous leaks as well, based on circular references,
and those ate up 1 GB of memory in about 30 minutes...  It took us
about three weeks to find it.

Gregory Matthews writes:
  So am I being overly paranoid concerning the leak potential of mod_perl
  programming?
 
  If I start with strict code to begin with and try my best to stay away
  from the problems you mentioned, then any potential memory leak/drain
  issues will be avoided?
 
  Keep in mind, although my application is not designed to launch the space
  shuttle, I do want it to be solid/stable/performance-packed from the 
 ground up.
 
  I will be also be using MySql with the Apache::DBI module.
 
  Thanks in advance.
 
  Gregory
 
 
  At 11:34 PM 5/19/2002 -0400, you wrote:
I have a couple of questions regarding leaking memory in mod_perl:
   
1.  What are the main culprits, in order of severity, of memory leaks,
  i.e.:
   
a.  global variables (NOT lexically scoped via my)
b.  ...
c.  ...
   
2.  When writing code from scratch (a new application), what is the
  best
way to avoid creating leaks to begin with, i.e., use strict;, PerlWarn
  On,
etc.. ?
  
  There are actually not very many ways you can leak memory in Perl (and
  thus mod_perl).  Most people confuse memory growth with memory leakage.
  If you want to know how to avoid memory growth, look at the performance
  tuning stuff in the Guide, like passing references, avoiding slurping of
  large files, controlling the buffering of DBI result sets, etc.
  
  Leaks are caused by circular references, the string form of eval (at
  least it used to leak a little), nested closures (sometimes created
  accidentally with the Error module), and one or two obscure syntax
  problems.  I think one of them involved code like "my $x = 7 if $y;".
  Matt Sergeant got bitten by this in the early stages of AxKit
  development, and the details are in the mailing list archive.
  
  Global variables by themselves are not a source of leaks or growth.  If
  you slurp a large file into a global, your process will grow, but the
  same is true for a lexical.
  
  - Perrin
 
 

--
C Wayne Huling [EMAIL PROTECTED]





Re: Memory Leaks

2002-05-20 Thread Perrin Harkins

Gregory Matthews wrote:
 I too thought of setting a cron job to restart the server once per day 
 in order to keep the memory fresh.
 
 In a production environment, are there any downsides to doing this, 
 i.e., server inaccessibility, etc..?

There has been some discussion on the list about this in the past.  The 
ideal situation is to have a cluster that you can do a rolling restart 
on without making the service totally unavailable.

- Perrin




Monitoring the processes

2002-05-20 Thread Gregory Matthews

Thanks to everyone for the great input on Memory Leaks.  Now that I have a 
good starting point for tracking down the problem, when I TEST for leaks, 
or simply check for a continued increase in server memory usage, how do I 
go about monitoring the processes growth?

For example, is there a command line tool to use that will allow me to see 
the process growth upon request reload?  I know that I should run the 
server with httpd -X, but I don't know how to actually see the memory being 
used/increased/decreased when the prog is being executed. As I understand 
it, this is a good indication that there might be a problem.

Thanks in advance.

Gregory 





Re: [modperl2] Note on the win32 docs

2002-05-20 Thread Peter Rothermel

I've run into a problem with mod_perl configuration instructions
for Registry scripts.  I've built mod_perl and copied the
blib directly under my Apache2 (server root) directory.

Here's the errors I get run I start apache:

C:\WGTI\Apache2\bin>apache
Using C:\WGTI\Apache2/blib
[Mon May 20 13:42:35 2002] [error] Attempt to free unreferenced scalar at C:\WGT
I\Apache2/blib/lib/Apache2/ModPerl/RegistryCooker.pm line 45.
BEGIN failed--compilation aborted at C:\WGTI\Apache2/blib/lib/Apache2/ModPerl/Re
gistryCooker.pm line 48.
Compilation failed in require at C:\WGTI\Apache2/blib/lib/Apache2/ModPerl/Regist
ry.pm line 11.
BEGIN failed--compilation aborted at C:\WGTI\Apache2/blib/lib/Apache2/ModPerl/Re
gistry.pm line 11.
Compilation failed in require at C:/WGTI/Apache2/conf/extra.pl line 15.
BEGIN failed--compilation aborted at C:/WGTI/Apache2/conf/extra.pl line 15.
Compilation failed in require at (eval 1) line 1.

[Mon May 20 13:42:35 2002] [error] Can't load Perl file: C:/WGTI/Apache2/conf/ex
tra.pl for server spider.inside.sealabs.com:80, exiting...


Here's a snippet of my httpd.conf file:

   LoadModule perl_module modules/mod_perl.so

   PerlSwitches -Mblib=C:\WGTI\Apache2

   PerlModule Apache2
   PerlModule Apache::compat

   PerlRequire C:/WGTI/Apache2/conf/extra.pl


Here's my extra.pl

 use Apache2 ();
 use ModPerl::Util ();
 use Apache::RequestRec ();
 use Apache::RequestIO ();
 use Apache::RequestUtil ();
 use Apache::Server ();
 use Apache::ServerUtil ();
 use Apache::Connection ();
 use Apache::Log ();
 use Apache::Const -compile => ':common';
 use APR::Const -compile => ':common';
 use APR::Table ();
 use Apache::compat ();
 use ModPerl::Registry ();
 use CGI ();
1;




Re: Monitoring the processes

2002-05-20 Thread Per Einar Ellefsen

At 22:50 20.05.2002, Gregory Matthews wrote:
Thanks to everyone for the great input on Memory Leaks.  Now that I have a 
good starting point for tracking down the problem, when I TEST for leaks, 
or simply check for a continued increase in server memory usage, how do I 
go about monitoring the processes growth?

For example, is there a command line tool to use that will allow me to see 
the process growth upon request reload?  I know that I should run the 
server with httpd -X, but I don't know how to actually see the memory 
being used/increased/decreased when the prog is being executed. As I 
understand it, this is a good indication that there might be a problem.

What about using top(1) on Unix systems?


-- 
Per Einar Ellefsen
[EMAIL PROTECTED]





Re: Memory Leaks

2002-05-20 Thread Gregory Matthews

Unfortunately, we only have one machine.  If we did employ the cron job as 
a clean-up utility once per day, wouldn't the potential impact of a site 
being unavailable only be for a few seconds (until Apache restarted)?

Gregory

At 05:12 PM 5/20/2002 -0400, you wrote:

Like another suggestion, we have a cluster of machines and roll the
restarts every hour.  Each machine is offset by 10 minutes.

Gregory Matthews writes:
  I too thought of setting a cron job to restart the server once per day in
  order to keep the memory fresh.
 
  In a production environment, are there any downsides to doing this, i.e.,
  server inaccessibility, etc..?
 
  Thanks.
 
  Gregory
 
  At 08:25 AM 5/20/2002 -0400, you wrote:
 
  It is more an issue of it being worth tracking down a small memory
  leak vs a large memory leak.  Our software still has some very small
  leaks, on the order of 10k every hour...  it would probably take us a
  month to track down and solve all these problems.  I find it easier to
  restart the web servers daily.
  
  We did have some enormous leaks as well, based on circular references,
  and those ate up 1 GB of memory in about 30 minutes...  It took us
  about three weeks to find it.
  
  Gregory Matthews writes:
So am I being overly paranoid concerning the leak potential of 
 mod_perl
programming?
   
If I start with strict code to begin with and try my best to stay 
 away
from the problems you mentioned, then any potential memory leak/drain
issues will be avoided?
   
Keep in mind, although my application is not designed to launch the 
 space
 shuttle, I do want it to be solid/stable/performance-packed from the
   ground up.
   
I will be also be using MySql with the Apache::DBI module.
   
Thanks in advance.
   
Gregory
   
   
At 11:34 PM 5/19/2002 -0400, you wrote:
  I have a couple of questions regarding leaking memory in mod_perl:
 
  1.  What are the main culprits, in order of severity, of memory 
 leaks,
i.e.:
 
  a.  global variables (NOT lexically scoped via my)
  b.  ...
  c.  ...
 
  2.  When writing code from scratch (a new application), what is the
best
  way to avoid creating leaks to begin with, i.e., use strict;, 
 PerlWarn
On,
  etc.. ?

There are actually not very many ways you can leak memory in Perl (and
thus mod_perl).  Most people confuse memory growth with memory 
 leakage.
If you want to know how to avoid memory growth, look at the 
 performance
tuning stuff in the Guide, like passing references, avoiding 
 slurping of
large files, controlling the buffering of DBI result sets, etc.

Leaks are caused by circular references, the string form of eval (at
least it used to leak a little), nested closures (sometimes created
accidentally with the Error module), and one or two obscure syntax
 problems.  I think one of them involved code like "my $x = 7 if $y;".
Matt Sergeant got bitten by this in the early stages of AxKit
development, and the details are in the mailing list archive.

Global variables by themselves are not a source of leaks or 
 growth.  If
you slurp a large file into a global, your process will grow, but the
same is true for a lexical.

- Perrin
   
   
  
  --
  C Wayne Huling [EMAIL PROTECTED]
 
 

--
C Wayne Huling [EMAIL PROTECTED]





Re: problems on OS X

2002-05-20 Thread Doug MacEachern

On Sun, 28 Apr 2002, Ken Williams wrote:

 
 Insecure dependency in eval while running with -T switch.
 Callback called exit.
 

this has been fixed in modperl cvs, just remove the 'use 
ExtUtils::testlib;' line in t/docs/startup.pl





Re: Memory Leaks

2002-05-20 Thread Per Einar Ellefsen

At 23:23 20.05.2002, Gregory Matthews wrote:
Unfortunately, we only have one machine.  If we did employ the cron job as 
a clean-up utility once per day, wouldn't the potential impact of a site 
being unavailable only be for a few seconds (until Apache restarted)?

And if something goes wrong? You'd be having a server offline with no one 
knowing about it.

At 05:12 PM 5/20/2002 -0400, you wrote:

Like another suggestion, we have a cluster of machines and roll the
restarts every hour.  Each machine is offset by 10 minutes.

Gregory Matthews writes:
  I too thought of setting a cron job to restart the server once per day in
  order to keep the memory fresh.
 
  In a production environment, are there any downsides to doing this, i.e.,
  server inaccessibility, etc..?
 
  Thanks.
 
  Gregory
 
  At 08:25 AM 5/20/2002 -0400, you wrote:
 
  It is more an issue of it being worth tracking down a small memory
  leak vs a large memory leak.  Our software still has some very small
  leaks, on the order of 10k every hour...  it would probably take us a
  month to track down and solve all these problems.  I find it easier to
  restart the web servers daily.
  
  We did have some enormous leaks as well, based on circular references,
  and those ate up 1 GB of memory in about 30 minutes...  It took us
  about three weeks to find it.
  
  Gregory Matthews writes:
So am I being overly paranoid concerning the leak potential of 
 mod_perl
programming?
   
If I start with strict code to begin with and try my best to 
 stay away
from the problems you mentioned, then any potential memory leak/drain
issues will be avoided?
   
Keep in mind, although my application is not designed to launch 
 the space
 shuttle, I do want it to be solid/stable/performance-packed from the
   ground up.
   
I will be also be using MySql with the Apache::DBI module.
   
Thanks in advance.
   
Gregory
   
   
At 11:34 PM 5/19/2002 -0400, you wrote:
  I have a couple of questions regarding leaking memory in mod_perl:
 
  1.  What are the main culprits, in order of severity, of 
 memory leaks,
i.e.:
 
  a.  global variables (NOT lexically scoped via my)
  b.  ...
  c.  ...
 
  2.  When writing code from scratch (a new application), what 
 is the
best
  way to avoid creating leaks to begin with, i.e., use strict;, 
 PerlWarn
On,
  etc.. ?

There are actually not very many ways you can leak memory in Perl 
 (and
thus mod_perl).  Most people confuse memory growth with memory 
 leakage.
If you want to know how to avoid memory growth, look at the 
 performance
tuning stuff in the Guide, like passing references, avoiding 
 slurping of
large files, controlling the buffering of DBI result sets, etc.

Leaks are caused by circular references, the string form of eval (at
least it used to leak a little), nested closures (sometimes created
accidentally with the Error module), and one or two obscure syntax
 problems.  I think one of them involved code like "my $x = 7 if $y;".
Matt Sergeant got bitten by this in the early stages of AxKit
development, and the details are in the mailing list archive.

Global variables by themselves are not a source of leaks or 
 growth.  If
you slurp a large file into a global, your process will grow, but the
same is true for a lexical.

- Perrin
   
   
  
  --
  C Wayne Huling [EMAIL PROTECTED]
 
 

--
C Wayne Huling [EMAIL PROTECTED]



-- 
Per Einar Ellefsen
[EMAIL PROTECTED]





Re: Memory Leaks

2002-05-20 Thread Jason

If you don't want to restart the server, then do this instead; it should help 
prevent small leaks from being a problem.
http://httpd.apache.org/docs-2.0/mod/mpm_common.html#maxrequestsperchild
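In httpd.conf that is a single directive; the value below is only illustrative and should be tuned to how quickly your processes grow:

```apache
# Retire each child after N requests so slow leaks can't accumulate.
MaxRequestsPerChild 1000
```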


- Original Message - 
From: Per Einar Ellefsen [EMAIL PROTECTED]
To: Gregory Matthews [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Sent: Monday, May 20, 2002 3:30 PM
Subject: Re: Memory Leaks


 At 23:23 20.05.2002, Gregory Matthews wrote:
 Unfortunately, we only have one machine.  If we did employ the cron job as 
 a clean-up utility once per day, wouldn't the potential impact of a site 
 being unavailable only be for a few seconds (until Apache restarted)?
 
 And if something goes wrong? You'd be having a server offline with no one 
 knowing about it.
 
 At 05:12 PM 5/20/2002 -0400, you wrote:
 
 Like another suggestion, we have a cluster of machines and roll the
  restarts every hour.  Each machine is offset by 10 minutes.
 
 Gregory Matthews writes:
   I too thought of setting a cron job to restart the server once per day in
   order to keep the memory fresh.
  
   In a production environment, are there any downsides to doing this, i.e.,
   server inaccessibility, etc..?
  
   Thanks.
  
   Gregory
  
   At 08:25 AM 5/20/2002 -0400, you wrote:
  
   It is more an issue of it being worth tracking down a small memory
   leak vs a large memory leak.  Our software still has some very small
    leaks, on the order of 10k every hour...  it would probably take us a
   month to track down and solve all these problems.  I find it easier to
   restart the web servers daily.
   
    We did have some enormous leaks as well, based on circular references,
   and those ate up 1 GB of memory in about 30 minutes...  It took us
   about three weeks to find it.
   
   Gregory Matthews writes:
 So am I being overly paranoid concerning the leak potential of 
  mod_perl
 programming?

 If I start with strict code to begin with and try my best to 
  stay away
 from the problems you mentioned, then any potential memory leak/drain
 issues will be avoided?

 Keep in mind, although my application is not designed to launch 
  the space
  shuttle, I do want it to be solid/stable/performance-packed from the
ground up.

 I will be also be using MySql with the Apache::DBI module.

 Thanks in advance.

 Gregory


 At 11:34 PM 5/19/2002 -0400, you wrote:
   I have a couple of questions regarding leaking memory in mod_perl:
  
   1.  What are the main culprits, in order of severity, of 
  memory leaks,
 i.e.:
  
   a.  global variables (NOT lexically scoped via my)
   b.  ...
   c.  ...
  
   2.  When writing code from scratch (a new application), what 
  is the
 best
   way to avoid creating leaks to begin with, i.e., use strict;, 
  PerlWarn
 On,
   etc.. ?
 
 There are actually not very many ways you can leak memory in Perl 
  (and
 thus mod_perl).  Most people confuse memory growth with memory 
  leakage.
 If you want to know how to avoid memory growth, look at the 
  performance
 tuning stuff in the Guide, like passing references, avoiding 
  slurping of
 large files, controlling the buffering of DBI result sets, etc.
 
 Leaks are caused by circular references, the string form of eval (at
 least it used to leak a little), nested closures (sometimes created
 accidentally with the Error module), and one or two obscure syntax
  problems.  I think one of them involved code like "my $x = 7 if $y;".
 Matt Sergeant got bitten by this in the early stages of AxKit
 development, and the details are in the mailing list archive.
 
 Global variables by themselves are not a source of leaks or 
  growth.  If
 you slurp a large file into a global, your process will grow, but the
 same is true for a lexical.
 
 - Perrin


   
   --
   C Wayne Huling [EMAIL PROTECTED]
  
  
 
 --
 C Wayne Huling [EMAIL PROTECTED]
 
 
 
 -- 
 Per Einar Ellefsen
 [EMAIL PROTECTED]
 




Re: Memory Leaks

2002-05-20 Thread Matt Sergeant

On Monday 20 May 2002 9:30 pm, Gregory Matthews wrote:
 I too thought of setting a cron job to restart the server once per day in
 order to keep the memory fresh.

 In a production environment, are there any downsides to doing this, i.e.,
 server inaccessibility, etc..?

It's very rare to have a site that can't cope with just a few seconds 
downtime. Most users won't even notice, save for some slight delay in getting 
their request through. Users tend to be pretty used to trying again in this 
world of reliable computing.

Matt.



Re: Memory Leaks

2002-05-20 Thread Allen Day

I've noticed that if I restart apache while I'm in the middle of a
download (MP3 stream), after the buffer in my MP3 player runs out, it
skips to the next track -- presumably because the connection was closed.

This might cause a problem for you if your users are downloading big
files.  They might have to restart from the beginning if they didn't cache
the partial download somewhere.

-Allen


On Mon, 20 May 2002, Matt Sergeant wrote:

 On Monday 20 May 2002 9:30 pm, Gregory Matthews wrote:
  I too thought of setting a cron job to restart the server once per day in
  order to keep the memory fresh.
 
  In a production environment, are there any downsides to doing this, i.e.,
  server inaccessibility, etc..?

 It's very rare to have a site that can't cope with just a few seconds
 downtime. Most users won't even notice, save for some slight delay in getting
 their request through. Users tend to be pretty used to trying again in this
 world of reliable computing.

 Matt.





Re: Memory Leaks

2002-05-20 Thread Perrin Harkins

Per Einar Ellefsen wrote:
 And if something goes wrong? You'd be having a server offline with no one 
 knowing about it.

You can easily set up mon (http://www.kernel.org/software/mon/) to page 
you if the server doesn't come back up within a certain amount of time.

- Perrin




Re: Memory Leaks

2002-05-20 Thread Per Einar Ellefsen

At 23:54 20.05.2002, Allen Day wrote:
I've noticed that if I restart apache while I'm in the middle of a
download (MP3 stream), after the buffer in my MP3 player runs out, it
skips to the next track -- presumably because the connection was closed.

This might cause a problem for you if your users are downloading big
files.  They might have to restart from the beginning if they didn't cache
the partial download somewhere.

Hmm, if you are serving big files off of mod_perl, memory leaks are the 
least of your problems :) That doesn't apply to Apache::MP3 of course, for 
which it's normal, but in no case should your mod_perl server be serving 
your big files.

On Mon, 20 May 2002, Matt Sergeant wrote:

  On Monday 20 May 2002 9:30 pm, Gregory Matthews wrote:
   I too thought of setting a cron job to restart the server once per day in
   order to keep the memory fresh.
  
   In a production environment, are there any downsides to doing this, i.e.,
   server inaccessibility, etc..?
 
  It's very rare to have a site that can't cope with just a few seconds
  downtime. Most users won't even notice, save for some slight delay in 
 getting
  their request through. Users tend to be pretty used to trying again in this
  world of reliable computing.

-- 
Per Einar Ellefsen
[EMAIL PROTECTED]





Re: Memory Leaks

2002-05-20 Thread Perrin Harkins

Jason wrote:
 If you don't want to restart the server then don't do this instead, it should 
help prevent small leaks from being a problem.
 http://httpd.apache.org/docs-2.0/mod/mpm_common.html#maxrequestsperchild

Apache::SizeLimit or Apache::GTopLimit is a better way to do it, since 
it results in fewer unnecessary restarts.  However, it's still a good 
idea to restart periodically, because some of the shared memory seems to 
become unshared over time no matter what you do, and restarting fixes that.
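A sketch of the Apache::SizeLimit approach for a mod_perl 1.x server; the thresholds are illustrative, and the variable names should be checked against the module's own documentation:

```perl
# In startup.pl: kill off any child whose total size exceeds the cap,
# checking only every few requests to keep overhead low.
# Installed with:  PerlFixupHandler Apache::SizeLimit  in httpd.conf.
use Apache::SizeLimit ();
$Apache::SizeLimit::MAX_PROCESS_SIZE       = 64 * 1024;  # in KB, i.e. ~64MB
$Apache::SizeLimit::CHECK_EVERY_N_REQUESTS = 10;
```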

- Perrin




Re: Monitoring the processes

2002-05-20 Thread Perrin Harkins

Gregory Matthews wrote:
 For example, is there a command line tool to use that will allow me to 
 see the process growth upon request reload?  I know that I should run 
 the server with httpd -X, but I don't know how to actually see the 
 memory being used/increased/decreased when the prog is being executed. 
 As I understand it, this is a good indication that there might be a 
 problem.

You can steal some code from Apache::SizeLimit or Apache::GTopLimit to 
print the current size in the error log.  Otherwise, just use top.  Keep 
in mind that sometimes a leaky piece of code will have to be run several 
times before you see the leak, because of the way that Perl allocates 
memory in chunks.
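A minimal sketch of that, assuming the GTop module (and libgtop) is available; the package name and log format here are made up for illustration:

```perl
# Log the current process size to the error log after each request,
# e.g. installed with:  PerlCleanupHandler My::SizeLog
package My::SizeLog;
use GTop ();
use Apache::Constants qw(OK);

sub handler {
    my $r = shift;
    my $size = GTop->new->proc_mem($$)->size;   # bytes
    $r->log_error("pid $$ size: $size bytes");
    return OK;
}
1;
```

Watching those log lines across repeated requests to the same URL shows whether the process keeps growing.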

- Perrin




Re: Memory Leaks

2002-05-20 Thread Allen Day

I mentioned the connection closing as a drawback of restarting the server
-- it was slightly OT for the thread.

Yes, it is a subclass of Apache::MP3 that can stream video and audio.
There is an old version called Apache::Jukebox in the Apache::MP3 CVS at
namp.sourceforge.net in case anyone is interested.

-Allen

On Mon, 20 May 2002, Per Einar Ellefsen wrote:

 At 23:54 20.05.2002, Allen Day wrote:
 I've noticed that if I restart apache while I'm in the middle of a
 download (MP3 stream), after the buffer in my MP3 player runs out, it
 skips to the next track -- presumably because the connection was closed.
 
 This might cause a problem for you if your users are downloading big
 files.  They might have to restart from the beginning if they didn't cache
 the partial download somewhere.

 Hmm, if you are serving big files off of mod_perl, memory leaks are the
 least of your problems :) That doesn't apply to Apache::MP3 of course, for
 which it's normal, but in no case should your mod_perl server be serving
 your big files.

 On Mon, 20 May 2002, Matt Sergeant wrote:
 
   On Monday 20 May 2002 9:30 pm, Gregory Matthews wrote:
I too thought of setting a cron job to restart the server once per day in
order to keep the memory fresh.
   
In a production environment, are there any downsides to doing this, i.e.,
server inaccessibility, etc..?
  
   It's very rare to have a site that can't cope with just a few seconds
   downtime. Most users won't even notice, save for some slight delay in
  getting
   their request through. Users tend to be pretty used to trying again in this
   world of reliable computing.






Re: Memory Leaks

2002-05-20 Thread Doug MacEachern

On Mon, 20 May 2002, Perrin Harkins wrote:
 
 Apache::SizeLimit or Apache::GTopLimit is a better way to do it, since 
 it results in fewer unnecessary restarts.  However, it's still a good 
 idea to restart periodically, because some of the shared memory seems to 
 become unshared over time no matter what you do, and restarting fixes that.

that reminds me, i wrote a C version of Apache::GTopLimit 2+ years ago (at 
some point in the 5 months i worked at backflip.com), but never released 
it.   if somebody wants to maintain/release it, the package is here:
http://perl.apache.org/~dougm/mod_gtop-0.02.tar.gz




Re: How to configure mod_perl to get Connection.so, Connection.bsand so on...

2002-05-20 Thread Doug MacEachern

On Sat, 27 Apr 2002, sagar wrote:

 
 Hi
 I have installed apache-1.3.12, openssl-0.9.5a and apache-1.3.12+ssl-
 1.40 and configured mod_perl-1.26 on freeBSD 4.1 with apache by giving 
 the following:
 
 %perl Makefile.PL APACHE_SRC=../apache_1.3.12/src DO_HTTPD=1 
 USE_APACI=1 EVERYTHING=1 APACHE_PREFIX=/usr/local/apache
... 
 But, the following directories with their relevant files ( .so, .bs 
 files ) have not been created in the above path

modperl links those modules statically so there won't be any .so's, unless you 
build with DYNAMIC=1
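With the configure line from the original post, that would look something like the following (DYNAMIC=1 added; everything else as posted):

```shell
# Rebuild so the API modules are compiled as loadable .so files:
perl Makefile.PL APACHE_SRC=../apache_1.3.12/src DO_HTTPD=1 \
    USE_APACI=1 EVERYTHING=1 DYNAMIC=1 APACHE_PREFIX=/usr/local/apache
make && make install
```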





Re: [modperl2] Note on the win32 docs

2002-05-20 Thread Doug MacEachern

On Mon, 20 May 2002, Peter Rothermel wrote:

 I've run into a problem with mod_perl configuration instructions
 for Registry scripts.  I've built mod_perl and copied the
 blib directly under my Apache2 (server root) directory.

sounds like a bug that has been fixed in cvs.  try the cvs version or wait 
for _02 or try the patch below.

Index: ModPerl-Registry/lib/ModPerl/RegistryCooker.pm
===
RCS file: /home/cvs/modperl-2.0/ModPerl-Registry/lib/ModPerl/RegistryCooker.pm,v
retrieving revision 1.5
retrieving revision 1.6
diff -u -r1.5 -r1.6
--- ModPerl-Registry/lib/ModPerl/RegistryCooker.pm  13 Nov 2001 04:34:31 - 
 1.5
+++ ModPerl-Registry/lib/ModPerl/RegistryCooker.pm  16 Apr 2002 17:14:16 - 
+ 1.6
@@ -42,10 +42,11 @@
 # httpd.conf with:
 #   PerlSetVar ModPerl::RegistryCooker::DEBUG 4
 use Apache::ServerUtil ();
-use constant DEBUG =>
-defined Apache->server->dir_config('ModPerl::RegistryCooker::DEBUG')
-? Apache->server->dir_config('ModPerl::RegistryCooker::DEBUG')
-: D_NONE;
+use constant DEBUG => 0;
+#XXX: below currently crashes the server on win32
+#defined Apache->server->dir_config('ModPerl::RegistryCooker::DEBUG')
+#? Apache->server->dir_config('ModPerl::RegistryCooker::DEBUG')
+#: D_NONE;
 
 #
 # object's array index's access constants




Re: [modperl2] Note on the win32 docs

2002-05-20 Thread Peter Rothermel

Thanks for the info. Latest from cvs works fine.
Any idea how close _02 might be to release?

-pete

Doug MacEachern wrote:

 On Mon, 20 May 2002, Peter Rothermel wrote:

  I've run into a problem with mod_perl configuration instructions
  for Registry scripts.  I've built mod_perl and copied the
  blib directly under my Apache2 (server root) directory.

 sounds like a bug that has been fixed in cvs.  try the cvs version or wait
 for _02 or try the patch below.

 Index: ModPerl-Registry/lib/ModPerl/RegistryCooker.pm
 ===
 RCS file: /home/cvs/modperl-2.0/ModPerl-Registry/lib/ModPerl/RegistryCooker.pm,v
 retrieving revision 1.5
 retrieving revision 1.6
 diff -u -r1.5 -r1.6
 --- ModPerl-Registry/lib/ModPerl/RegistryCooker.pm  13 Nov 2001 04:34:31 -   
   1.5
 +++ ModPerl-Registry/lib/ModPerl/RegistryCooker.pm  16 Apr 2002 17:14:16 -   
   1.6
 @@ -42,10 +42,11 @@
  # httpd.conf with:
  #   PerlSetVar ModPerl::RegistryCooker::DEBUG 4
  use Apache::ServerUtil ();
  -use constant DEBUG =>
  -defined Apache->server->dir_config('ModPerl::RegistryCooker::DEBUG')
  -? Apache->server->dir_config('ModPerl::RegistryCooker::DEBUG')
  -: D_NONE;
  +use constant DEBUG => 0;
 +#XXX: below currently crashes the server on win32
  +#defined Apache->server->dir_config('ModPerl::RegistryCooker::DEBUG')
  +#? Apache->server->dir_config('ModPerl::RegistryCooker::DEBUG')
  +#: D_NONE;

  #
  # object's array index's access constants



Re: Memory Leaks

2002-05-20 Thread Issac Goldstand

I'd like to try to disagree here.  I have built several file-related 
webapps where I have implemented virtual filesystems which require 
special perl modules to access the files at all.  mod_perl takes care of 
serving the requests.  If I need a restart, then I can still safely use 
graceful.  Admittedly there are times when something could very well get 
screwed up, but my solution to that is to develop a better front-end 
server with its own buffer so that the back-end can swiftly serve the 
files, leaving much more idle time (in comparison to directly connecting the 
remote client to the fileserver) for backend restarts if needed.

  Issac

Per Einar Ellefsen wrote:

 At 23:54 20.05.2002, Allen Day wrote:

 I've noticed that if I restart apache while I'm in the middle of a
 download (MP3 stream), after the buffer in my MP3 player runs out, it
 skips to the next track -- presumably because the connection was closed.

 This might cause a problem for you if your users are downloading big
 files.  They might have to restart from the beginning if they didn't 
 cache
 the partial download somewhere.


 Hmm, if you are serving big files off of mod_perl, memory leaks are 
 the least of your problems :) That doesn't apply to Apache::MP3 of 
 course, for which it's normal, but in no case should your mod_perl 
 server be serving your big files.

 On Mon, 20 May 2002, Matt Sergeant wrote:

  On Monday 20 May 2002 9:30 pm, Gregory Matthews wrote:
   I too thought of setting a cron job to restart the server once 
 per day in
   order to keep the memory fresh.
  
   In a production environment, are there any downsides to doing 
 this, i.e.,
   server inaccessibility, etc..?
 
  It's very rare to have a site that can't cope with just a few seconds
  downtime. Most users won't even notice, save for some slight delay 
 in getting
  their request through. Users tend to be pretty used to trying again 
 in this
  world of reliable computing.








Re: Memory Leaks

2002-05-20 Thread Per Einar Ellefsen

At 00:45 21.05.2002, Issac Goldstand wrote:
I'd like to try to disagree here.  I have built several file-related 
webapps where I have implemented virtual filesystems which require 
special perl modules to access the files at all.  mod_perl takes care of 
serving the requests.  If I need a restart, then I can still safely use 
graceful.  Admittedly there are times when something could very well get 
screwed up, but my solution to that is to develop a better front-end 
server with its own buffer, so that the back-end can swiftly serve the 
files, leaving much more idle time (in comparison to directly connecting 
the remote client to the fileserver) for backend restarts if needed.

In the case that you need advanced logic such as that, I clearly agree with 
both you and Allen. And a proxy server is very much needed in such a case :)

Per Einar Ellefsen wrote:

At 23:54 20.05.2002, Allen Day wrote:

I've noticed that if I restart apache while I'm in the middle of a
download (MP3 stream), after the buffer in my MP3 player runs out, it
skips to the next track -- presumably because the connection was closed.

This might cause a problem for you if your users are downloading big
files.  They might have to restart from the beginning if they didn't cache
the partial download somewhere.


Hmm, if you are serving big files off of mod_perl, memory leaks are the 
least of your problems :) That doesn't apply to Apache::MP3 of course, 
for which it's normal, but in no case should your mod_perl server be 
serving your big files.

On Mon, 20 May 2002, Matt Sergeant wrote:

  On Monday 20 May 2002 9:30 pm, Gregory Matthews wrote:
   I too thought of setting a cron job to restart the server once per 
 day in
   order to keep the memory fresh.
  
   In a production environment, are there any downsides to doing this, 
 i.e.,
   server inaccessibility, etc..?
 
  It's very rare to have a site that can't cope with just a few seconds
  downtime. Most users won't even notice, save for some slight delay in 
 getting
  their request through. Users tend to be pretty used to trying again 
 in this
  world of reliable computing.

--
Per Einar Ellefsen
[EMAIL PROTECTED]





Re: Help with Method Handlers in mod_perl 1.99

2002-05-20 Thread Doug MacEachern

On Fri, 3 May 2002, Peter Rothermel wrote:

 I tried the method attribute and now I get this error:
 
 Error message:
Can't locate object method  via package Apache::AuthDerivedHandler.

method handlers were broken in _01, this has been fixed in cvs and will be 
in 1.99_02





RE: mod_perl2: nmake test crashes apache

2002-05-20 Thread Doug MacEachern

On Tue, 14 May 2002, Alessandro Forghieri wrote:

 ii) It does however crash on my testbed app (which runs as standard CGI, 
 FastCGI and mod_perl 1). The crash itself appears to happen when a number 
 of nearly-simultaneous requests arrive at the server, and is fatal to 
 mod_perl (but the static-serving part of apache appears to survive).

do you have a simple test case to reproduce the problem?
 
 iib) I then set out to build a debug version. That ain't easy I finally

this has been fixed in cvs, MP_DEBUG=1 should do the right things now.





Re: make test problem

2002-05-20 Thread Doug MacEachern

On Mon, 20 May 2002, Jie Gao wrote:
 
 Just got one from cvs and 'make test' hangs on apr/util:
... 
 apr/util

likely the call to APR::generate_random_bytes, could be blocking on 
/dev/random or similar (strace would tell you).  i've disabled the test 
in cvs for the moment, as i've seen problems with it in the past on other 
platforms (hpux).




Re: Seg fault on apache start

2002-05-20 Thread Doug MacEachern

On Sat, 18 May 2002, Jaberwocky wrote:

 I'm having some problems with this. Apache seg faults on the call to parse...
..
 #1  0x80c5ad8 in XML_GetBuffer ()

did you build apache with --disable-rule=EXPAT ?





Re: problems on OS X

2002-05-20 Thread Ken Williams

Great, the CVS version passes all tests for me now when built 
under 'perl Makefile.PL EVERYTHING=1' using apache 1.3.24.

On Tuesday, May 21, 2002, at 07:19  AM, Doug MacEachern wrote:
 On Sun, 28 Apr 2002, Ken Williams wrote:

 
 Insecure dependency in eval while running with -T switch.
 Callback called exit.
 

 this has been fixed in modperl cvs, just remove the 'use
 ExtUtils::testlib;' line in t/docs/startup.pl




  -Ken




Apache::GTopLimit

2002-05-20 Thread Gregory Matthews

Does using the Apache::GTopLimit module have the same net effect as 
restarting the server itself by simply killing off the actual processes 
which are growing beyond the set threshold, and thereby causing new 
processes to be born?

If so, this sounds like a good alternative to setting a cron task or 
manually restarting the server each day/week.

Gregory





Re: [modperl2] Note on the win32 docs

2002-05-20 Thread Doug MacEachern

On Mon, 20 May 2002, Peter Rothermel wrote:

 Thanks for the info. Latest from cvs works fine.
 Any idea how close _02 might be to release?

hopefully in a day or three.





Re: Apache::GTopLimit

2002-05-20 Thread Perrin Harkins

 Does using the Apache::GTopLimit module have the same net effect as
 restarting the server itself by simply killing off the actual processes
 which are growing beyond the set threshold, and thereby causing new
 processes to be born?

It does kill off processes that are getting too big, and you definitely
should use either GtopLimit or SizeLimit to get the most out of your
server.  However, it's not quite the same thing as a restart.  Over
time, some of the shared memory from the parent process appears to
become unshared, and new processes that are spawned start out with less
shared memory because of this.  Restarting now and then takes care of
this problem.
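
A minimal startup.pl sketch of the setup described above (the thresholds 
are illustrative, not recommendations; assumes mod_perl 1.x with 
Apache::SizeLimit installed):

```perl
# startup.pl -- configure Apache::SizeLimit to reap oversized children
use Apache::SizeLimit ();

# kill a child (after it finishes the current request) once its total
# process size exceeds ~10MB
$Apache::SizeLimit::MAX_PROCESS_SIZE = 10000;       # in KB

# only check the size every 5th request, to keep overhead down
$Apache::SizeLimit::CHECK_EVERY_N_REQUESTS = 5;
```

It is hooked into the request cycle in httpd.conf with 
PerlFixupHandler Apache::SizeLimit.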

- Perrin




Re: Apache::GTopLimit

2002-05-20 Thread Gregory Matthews

So to modify my previous question: other than the loss of some shared 
memory over time, GTopLimit will have the same effect as restarting the 
server?

On a side note, how are you tracking/discovering this minimal loss over 
time?

Gregory


At 08:38 PM 5/20/2002 -0400, you wrote:
  Does using the Apache::GTopLimit module have the same net effect as
  restarting the server itself by simply killing off the actual processes
  which are growing beyond the set threshold, and thereby causing new
  processes to be born?

It does kill off processes that are getting too big, and you definitely
should use either GtopLimit or SizeLimit to get the most out of your
server.  However, it's not quite the same thing as a restart.  Over
time, some of the shared memory from the parent process appears to
become unshared, and new processes that are spawned start out with less
shared memory because of this.  Restarting now and then takes care of
this problem.

- Perrin





Re: mod_perl 2.0 - writing a proxy handler

2002-05-20 Thread Doug MacEachern

On Tue, 14 May 2002, Douglas Younger wrote:

 Hello,
   Has anyone written a proxy handler in 2.0 similar to example 7-12 of the 
 O'Reilly book? I've tried converting it without much luck. I don't need the 
 ad-blocker stuff, just a generic proxy handler that I can add some 
 additional lines to parse the output.

you'll need modperl from cvs (or wait for _02) for $r->proxyreq to 
auto-detect a proxy request.  with modperl-cvs and Apache::compat loaded, 
i have run Apache::AdBlocker without any modperl api changes.  however, i 
did need the patch below because my GD install does not have gif support.

--- lib/Apache/AdBlocker.pm~    Fri Mar  3 21:08:35 2000
+++ lib/Apache/AdBlocker.pm     Mon May 20 17:31:22 2002
@@ -61,7 +61,7 @@
     my $content = \$response->content;
     if($r->content_type =~ /^image/ and $r->uri =~ /\b($Ad)\b/i) {
        block_ad($content);
-       $r->content_type("image/gif");
+       $r->content_type("image/png");
     }
 
     $r->content_type('text/html') unless $$content;
@@ -85,7 +85,7 @@
     $im->string(GD::gdLargeFont(),5,5,"Blocked Ad",$red);
     $im->rectangle(0,0,$x-1,$y-1,$black);
 
-    $$data = $im->gif;
+    $$data = $im->png;
 }
 
 1;




Re: Apache::GTopLimit

2002-05-20 Thread Perrin Harkins

 So to modify my previous question: other than the loss of some shared
 memory over time, GTopLimit will have the same effect as restarting the
 server?

Yes.  That shared memory is important though.

 On a side note, how are you tracking/discovering this minimal loss over
 time?

Apache::SizeLimit prints statistics to the error log when it kills a
process.  Also, you can just look at top.
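
If GTop (the libgtop bindings, which Apache::GTopLimit itself uses) is 
installed, the numbers top shows can also be read programmatically; a 
rough sketch:

```perl
# sketch -- requires the GTop module (libgtop bindings)
use GTop ();

my $gtop = GTop->new;
my $mem  = $gtop->proc_mem($$);    # memory stats for this process

# size/share are reported in bytes; print them in KB, as top does
printf "size=%dK shared=%dK\n", $mem->size / 1024, $mem->share / 1024;
```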

- Perrin




Re: compatibility problem

2002-05-20 Thread Doug MacEachern

On Fri, 17 May 2002, Jie Gao wrote:

 use Apache::Constants qw(:common :response M_GET M_POST AUTH_REQUIRED REDIRECT);

the :response group in 1.x consists of names which apache has deprecated 
in 1.3.x and removed in 2.0, for which there are HTTP_* names that replace 
the old names.  so for example, if you had imported the :response group to 
use 'BAD_GATEWAY', you should instead explicitly import HTTP_BAD_GATEWAY, 
which will work with both 1.x and 2.x.

 If I take out response, it croaks at REDIRECT.

i've added REDIRECT to the list of shortcut names which apache had 
deprecated, but are common/handy enough to carry forward with modperl.
the full list of shortcut names supported in modperl2 that are deprecated 
in apache (in favor of the long-winded HTTP_ names):

NOT_FOUND(HTTP_NOT_FOUND)
FORBIDDEN(HTTP_FORBIDDEN)
AUTH_REQUIRED(HTTP_UNAUTHORIZED)
SERVER_ERROR (HTTP_INTERNAL_SERVER_ERROR)
REDIRECT (HTTP_MOVED_TEMPORARILY)
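
In code, the portable spelling looks something like this (a sketch; the 
package name and URL are made up):

```perl
package My::Redirector;    # hypothetical handler
use strict;

# import the HTTP_* names explicitly -- these work in both 1.x and 2.x
use Apache::Constants qw(HTTP_MOVED_TEMPORARILY);

sub handler {
    my $r = shift;
    $r->header_out(Location => 'http://example.com/elsewhere');
    return HTTP_MOVED_TEMPORARILY;    # instead of the deprecated REDIRECT
}

1;
```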





Re: Memory Leaks

2002-05-20 Thread Stas Bekman

Per Einar Ellefsen wrote:
 At 23:54 20.05.2002, Allen Day wrote:
 
 I've noticed that if I restart apache while I'm in the middle of a
 download (MP3 stream), after the buffer in my MP3 player runs out, it
 skips to the next track -- presumably because the connection was closed.

 This might cause a problem for you if your users are downloading big
 files.  They might have to restart from the beginning if they didn't 
 cache
 the partial download somewhere.
 
 
 Hmm, if you are serving big files off of mod_perl, memory leaks are the 
 least of your problems :) 

Well, you can serve big files without reading them into memory at 
once. Why would there be memory leaks?

 That doesn't apply to Apache::MP3 of course, 
 for which it's normal, but in no case should your mod_perl server be 
 serving your big files.

The reason for not serving big files with mod_perl is that you don't 
want to tie up the heavy (but snappy) mod_perl servers waiting 
indefinitely for the client to grab their data. If you have plenty of 
memory or just a few clients (intranet?) that's just fine. This is all 
discussed here:
http://perl.apache.org/release/docs/1.0/guide/strategy.html#Adding_a_Proxy_Server_in_http_Accelerator_Mode


__
Stas BekmanJAm_pH -- Just Another mod_perl Hacker
http://stason.org/ mod_perl Guide --- http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com




Re: Memory Leaks

2002-05-20 Thread Stas Bekman

Gregory Matthews wrote:
 Does using the Apache::GTopLimit module have the same net effect as 
 restarting the server itself by simply killing off the actual processes 
 which are growing beyond the set threshold, and thereby causing new 
 processes to be born?

It's not exactly the same, since it won't pick up any changes in 
Perl modules on disk. And that's one of the main reasons for doing 
restarts. Otherwise, yes.


__
Stas BekmanJAm_pH -- Just Another mod_perl Hacker
http://stason.org/ mod_perl Guide --- http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com




Re: Monitoring the processes

2002-05-20 Thread Stas Bekman

Gregory Matthews wrote:
 Thanks to everyone for the great input on Memory Leaks.  Now that I have 
 a good starting point for tracking down the problem, when I TEST for 
 leaks, or simply check for a continued increase in server memory usage, 
 how do I go about monitoring the processes growth?
 
 For example, is there a command line tool to use that will allow me to 
 see the process growth upon request reload?  I know that I should run 
 the server with httpd -X, but I don't know how to actually see the 
 memory being used/increased/decreased when the prog is being executed. 
 As I understand it, this is a good indication that there might be a 
 problem.

Apache::VMonitor is great! (well, I wrote it :)

Gregory, before you continue asking more questions... it's all in the guide:
http://perl.apache.org/release/docs/1.0/guide/performance.html#Measuring_the_Memory_of_the_Process

so before you ask, check the guide. Use the search if you don't know 
where to look.
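
Hooking Apache::VMonitor up takes only a few lines of httpd.conf (a 
sketch; the /system/vmonitor location is arbitrary, and you'll want to 
restrict access to it):

```apache
<Location /system/vmonitor>
    SetHandler perl-script
    PerlHandler Apache::VMonitor
</Location>
```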
__
Stas BekmanJAm_pH -- Just Another mod_perl Hacker
http://stason.org/ mod_perl Guide --- http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com




Re: Apache::GTopLimit

2002-05-20 Thread Stas Bekman

Perrin Harkins wrote:
Does using the Apache::GTopLimit module have the same net effect as
restarting the server itself by simply killing off the actual processes
which are growing beyond the set threshold, and thereby causing new
processes to be born?
 
 
 It does kill off processes that are getting too big, and you definitely
 should use either GtopLimit or SizeLimit to get the most out of your
 server.  However, it's not quite the same thing as a restart.  Over
 time, some of the shared memory from the parent process appears to
 become unshared, and new processes that are spawned start out with less
 shared memory because of this.  

Hmm, when a new process starts it shares *everything* with the parent. 
Why do you say that it's not?

It doesn't matter if the process gets killed because of 
MaxRequestsPerChild or FooLimit thresholds.

__
Stas BekmanJAm_pH -- Just Another mod_perl Hacker
http://stason.org/ mod_perl Guide --- http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com




Re: Monitoring the processes

2002-05-20 Thread Gregory Matthews

Sorry for being lazy! I will read the guide all the way through...promise!

Thanks though for everyone's help up to this point!

Gregory

At 12:04 PM 5/21/2002 +0800, you wrote:
Gregory Matthews wrote:
Thanks to everyone for the great input on Memory Leaks.  Now that I have 
a good starting point for tracking down the problem, when I TEST for 
leaks, or simply check for a continued increase in server memory usage, 
how do I go about monitoring the processes growth?
For example, is there a command line tool to use that will allow me to 
see the process growth upon request reload?  I know that I should run the 
server with httpd -X, but I don't know how to actually see the memory 
being used/increased/decreased when the prog is being executed. As I 
understand it, this is a good indication that there might be a problem.

Apache::VMonitor is great! (well, I wrote it :)

Gregory, before you continue asking more questions... it's all in the guide:
http://perl.apache.org/release/docs/1.0/guide/performance.html#Measuring_the_Memory_of_the_Process

so before you ask, check the guide. Use the search if you don't know where 
to look.
__
Stas BekmanJAm_pH -- Just Another mod_perl Hacker
http://stason.org/ mod_perl Guide --- http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com