Re: mod_perl security on a shared web server

2000-09-06 Thread Matt Sergeant

On Wed, 6 Sep 2000, Félix C.Courtemanche wrote:

 Hello,
 
 I couldn't find any occurrence of this question in the archives, but if it
 does exist, please forward me to it.
 
 I have been working on a set of Administration Tools for commercial web
 hosting companies for quite some time.  Lately I have been trying to figure
 out the MOST secure way to host multiple accounts on the same server, with
 mod_perl enabled AS FAST AS POSSIBLE.
 
 In the best world, I would have the possibility of:
 - Restricting the opened files by any .pl script to the user's base
 directory.
 - Allowing custom shell commands or not
 - Setting a maximum execution time for a script
 
 The first directive would be used to prevent anyone from reading the source
 of another program, which would allow someone to grab the sensitive data
 stored in configuration files, such as Database Passwords, etc.  It is the
 MOST important of all and I really must find a solution.  I previously saw
 some perl wrapper that would only allow files owned by the script's owner to
 be read.  However, that wrapper greatly reduced the execution speed of .pl
 and it was not that effective.  Any suggestions?

The _only_ way I see you being able to do this securely is to use a Safe
compartment with a Safe::Hole through to your custom open() function which
does all the checking.
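A very rough sketch of that idea (untested; the base-directory check, the choice to return file contents rather than a handle, and the assumption that Safe::Hole's wrap() can install a sub into a compartment are mine, not something stated in this thread):

    use Safe ();
    use Safe::Hole ();

    my $base = '/home/someuser';        # hypothetical per-user base directory

    # Runs with full privileges *outside* the compartment, but refuses
    # anything outside the user's own tree.  Returns file contents rather
    # than a handle, to keep the hole as narrow as possible.
    sub checked_open {
        my ($path) = @_;
        die "access denied: $path\n" unless $path =~ m{^\Q$base\E/};
        local *FH;
        open FH, "< $path" or die "open $path: $!\n";
        local $/;
        my $data = <FH>;
        close FH;
        return $data;
    }

    my $cpt  = Safe->new;
    my $hole = Safe::Hole->new({});

    # Install the wrapped sub inside the compartment under the same name,
    # so untrusted code can call checked_open() but not open() itself.
    $hole->wrap(\&checked_open, $cpt, '&checked_open');

    my $result = $cpt->reval(q{ checked_open("/home/someuser/data.txt") });
    die "script error: $@" if $@;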

The problem then becomes enabling something like DBI support. You'd need
to provide a safe hole through to DBI (not sure if you'd have to write a
wrapper or what - never tried it personally). And then the same goes for
something like CGI.pm, probably.

The other stuff can be done with the resource limiting modules.
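One module of that kind is Apache::Resource, which sets per-child rlimits through BSD::Resource; a minimal httpd.conf sketch, with made-up numbers:

    PerlSetEnv PERL_RLIMIT_CPU  120      # soft limit of 120 CPU seconds per child
    PerlSetEnv PERL_RLIMIT_DATA 32:48    # 32 MB soft / 48 MB hard data segment
    PerlChildInitHandler Apache::Resource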

If you come up with something it would be great if you could share it. I
started working on something like it a while back (even had an
Apache::SafeRegistry module built, but it didn't work because Safe::Hole
didn't exist back then).

-- 
Matt/

Fastnet Software Ltd. High Performance Web Specialists
Providing mod_perl, XML, Sybase and Oracle solutions
Email for training and consultancy availability.
http://sergeant.org | AxKit: http://axkit.org




Re: mod_perl security on a shared web server

2000-09-06 Thread Jonathan Leto


I would suggest www.freevsd.org, because what you need is complete
compartmentalization. 



 Félix C.Courtemanche ([EMAIL PROTECTED]) was saying:

 Hello,
 
 I couldn't find any occurrence of this question in the archives, but if it
 does exist, please forward me to it.
 
 I have been working on a set of Administration Tools for commercial web
 hosting companies for quite some time.  Lately I have been trying to figure
 out the MOST secure way to host multiple accounts on the same server, with
 mod_perl enabled AS FAST AS POSSIBLE.
 
 In the best world, I would have the possibility of:
 - Restricting the opened files by any .pl script to the user's base
 directory.
 - Allowing custom shell commands or not
 - Setting a maximum execution time for a script
 
 The first directive would be used to prevent anyone from reading the source
 of another program, which would allow someone to grab the sensitive data
 stored in configuration files, such as Database Passwords, etc.  It is the
 MOST important of all and I really must find a solution.  I previously saw
 some perl wrapper that would only allow files owned by the script's owner to
 be read.  However, that wrapper greatly reduced the execution speed of .pl
 and it was not that effective.  Any suggestions?
 
 The second directive would allow me to specify whether or not a user can run
 commands that would be passed as shell OR specify what paths are available
 (only /usr/bin for example)
 
 Finally, the third directive would allow me to kill any script running for
 too long or using too much CPU.
 
 I understand that there is probably no tool to do all of it, but if I can
 gather the tools to make it as effective as possible, it would be really
 useful for me and others.
 
 Please don't tell me to monitor the user's scripts, since that is almost
 impossible to do when you have more than 10 sites to monitor, which will
 happen quickly :)
 
 Any other tips and tricks to improve the security of mod_perl is greatly
 appreciated as well.
 
 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
 Félix C.Courtemanche . Head Designer
 Co-Administrator . Can-Host Networks
 http://www.can-host.com
 [EMAIL PROTECTED]
 
 

-- 
[EMAIL PROTECTED] 
"With pain comes clarity."





calling C extensions to Apache through mod_perl

2000-09-06 Thread Dave DeMaagd

Here's what I'm looking at: 

- Writing mod_foo as a dso for apache
- Have perl running under mod_perl
- Want to make calls to functions in mod_foo from the perl scripts

I know how to do the first two, but I'm not getting anywhere on how to
do the third.  Am I just missing something really obvious?

Thanks!

-- 
Dave DeMaagd - [EMAIL PROTECTED]  - http://www.spinynorm.net
Beware THE MERCILESS PEPPERS OF QUETZLZACATENANGO!



Re: calling C extensions to Apache through mod_perl

2000-09-06 Thread Matt Sergeant

On Wed, 6 Sep 2000, Dave DeMaagd wrote:

 Here's what I'm looking at: 
 
 - Writing mod_foo as a dso for apache
 - Have perl running under mod_perl
 - Want to make calls to functions in mod_foo from the perl scripts
 
 I know how to do the first two, but I'm not getting anywhere on how to
 do the third.  Am I just missing something really obvious?

You could either use notes as a message passing mechanism, or write some
XS code and call the functions directly.
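For the notes route, a rough sketch of the Perl side (the handler and key names are invented; the C module would read and write the same per-request notes table through r->notes):

package Apache::FooBridge;
use strict;
use Apache::Constants qw(OK);

# Hypothetical PerlHandler: hand a value to mod_foo via the notes table and
# read back whatever mod_foo stored under another key.
sub handler {
    my $r = shift;
    $r->notes('foo-input' => 'some value');        # mod_foo reads r->notes
    my $answer = $r->notes('foo-output') || 'n/a'; # set by mod_foo, if at all
    $r->content_type('text/plain');
    $r->send_http_header;
    $r->print("mod_foo said: $answer\n");
    return OK;
}
1;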

-- 
Matt/

Fastnet Software Ltd. High Performance Web Specialists
Providing mod_perl, XML, Sybase and Oracle solutions
Email for training and consultancy availability.
http://sergeant.org | AxKit: http://axkit.org




Bug? in mod_perl when POST request yields REDIRECT

2000-09-06 Thread Reif Peter

I am using a self-written mod_perl module that does proxy requests. It acts
as a content handler and fetches the requested documents via LWP::UserAgent.
The program works fine but when the request is a POST request and the
response is a redirection (301, 302, ...) with a Location: header, no data
is sent to the browser.

If I don't read the posted data, everything works. So my suspicion is the
following:
for some reason, if the module returns a redirecting result code (301, 302,
...), mod_perl tries to read the posted data again and waits forever.

My solution is simple: just set the Content-length header to undef:

$r->header_in('Content-length' => undef);

Is this a bug or a feature?

I include my module and the part of my server config.

Peter

Server config:
==

<Directory proxy:>

SetHandler perl-script
PerlHandler Apache::Proxy_test

</Directory>

Proxy_test.pm:
==


package Apache::Proxy_test;

use strict;

use Apache::Constants qw(:response :methods :http);
use Apache::File ();
use Apache::Log ();
use Apache::ModuleConfig ();
use Apache::Table;
use Apache::URI ();

use LWP::UserAgent ();

my $UA = LWP::UserAgent->new;

sub handler {
    my $r = shift;

    my $not_modified = $r->meets_conditions == HTTP_NOT_MODIFIED;

    #
    # create request
    #
    my $filename = $r->filename();
    $filename =~ s/^proxy://;
    my $parsed_uri = $r->parsed_uri;
    my $query = $parsed_uri->query;
    $filename .= "?$query" if $query;

    $r->log->debug ("filename: $filename");

    my $request = HTTP::Request->new($r->method, $filename);
    $UA->agent ($r->header_in ('User-Agent'));

    # copy POST data, if any
    if ($r->method eq 'POST') {
        my $len = $r->header_in('Content-length');
        my $buf;
        my $ret = read(STDIN, $buf, $len);
        $request->content($buf);
        # next line prevents bug !!!
        $r->header_in('Content-length' => undef);
    }

    $r->log->debug ("subrequest:\n\n", $request->as_string);

    #
    # evaluate response
    #

    my $response = $UA->simple_request($request);

    if ($response->code != 200) {
        $r->log->debug ("response not OK:\n\n",
                        $response->as_string);
        $response->scan(sub {
            my ($header, $value) = @_;

            $r->log->debug ("Header-out: $header $value");
            $r->header_out($header, $value);
        });
    } else {
        $r->content_type($response->header('Content-type'));
        $r->status($response->code);
        $r->status_line(join " ", $response->code,
                        $response->message);
        $r->send_http_header();
        unless ($r->header_only) {
            print $response->content;
        }
    }

    $r->log->debug("send:\n\n", $r->as_string);
    $r->log->debug("return ", $response->code);
    return $response->code;
}


1;

__END__



reload

2000-09-06 Thread test




Hi

I thought Apache::Reload would be quite helpful because of not having to restart
the server, but now I am quite stuck. Only the first form shows all the
changes immediately; the second form, after pushing a button, does not.

Is this explainable?

Arnold





perl.conf starting:
PerlRequire conf/startup.pl
PerlFreshRestart On
PerlPostReadRequestHandler 'sub { Apache->request(shift) }'

PerlTransHandler Apache::StripSession
PerlInitHandler Apache::Reload
PerlSetVar  ReloadAll off



package Myapp;   # Myapp.pm
use strict;
use Apache::Constants qw(:common);
use Apache::File ();
use CGI qw(:standard :html3 :netscape);
use Image::Magick ();

use DBI ();

use Apache::Reload;


sub handler {

    CASE: {
        /^/i and do { some_other_form(); last CASE };

        first_form();
    }

}

sub first_form {
    # This form does show changes for all children.

    # Push a submit button to go to some_other_form.
}

sub some_other_form {
    # This form does not show changes at all..

}




[REQ] Looking for mod_perl based Counter

2000-09-06 Thread heddy Boubaker


 hi, 
 
 Before reinventing the wheel I'm looking for a mod_perl based access counter;
 the idea is to use it with SSI served files, a little bit like mod_cntr works.
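 A rough sketch of the kind of handler that could do this (the module name, counter file and SSI hookup are invented for the example, not taken from the post):

 package Apache::SimpleCounter;
 use strict;
 use Apache::Constants qw(OK SERVER_ERROR);
 use Fcntl qw(:DEFAULT :flock);

 my $file = '/var/tmp/hit-counter';   # counter file, also invented

 sub handler {
     my $r = shift;
     local *FH;
     sysopen(FH, $file, O_RDWR | O_CREAT) or return SERVER_ERROR;
     flock(FH, LOCK_EX);
     my $count = <FH> || 0;
     $count++;
     seek(FH, 0, 0);
     truncate(FH, 0);
     print FH $count;
     close FH;                        # releases the lock

     $r->content_type('text/plain');
     $r->send_http_header;
     $r->print($count);
     return OK;
 }
 1;

 It could then be mapped to a URL with SetHandler perl-script / PerlHandler Apache::SimpleCounter and pulled into pages with an SSI include such as <!--#include virtual="/counter" -->.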
 
 regards
 
-- 
 /   From the last station before the end of the neT   \
 \ Centre d'Etudes de la Navigation Aerienne,  div SSS /
 / 7, av edouard Belin - 31055 Toulouse CEDEX - France \
 \ Tel:(+33|0)5.62.25.95.22 | Fax:(+33|0)5.62.25.95.99 /



[ANN] Alpha v of Apache::NNTPGateway

2000-09-06 Thread heddy Boubaker


 Apache::NNTPGateway is a simple gateway between an Apache http server
 (mod_perl enabled) and an NNTP server, allowing users to read, post, follow up
 ... articles to newsgroups through a simple web browser.
 
 The current version is 0.6 and it is the first public release, intended for
 debugging help only ... that is, if you're ready to cope with its current
 imperfections, help debug them, and eventually submit patches or
 anything helpful, this is it! If not you'll have to wait for a further
 release, sorry ;-(
 
 For now there is no standard Perl module installation procedure, only the raw
 .pm file: 
 
 http://www.tls.cena.fr/~boubaker/distrib/NNTPGateway-0.6.pm
 
 sincerely
 
-- 
 /   From the last station before the end of the neT   \
 \ Centre d'Etudes de la Navigation Aerienne,  div SSS /
 / 7, av edouard Belin - 31055 Toulouse CEDEX - France \
 \ Tel:(+33|0)5.62.25.95.22 | Fax:(+33|0)5.62.25.95.99 /



Using plugins in modules.

2000-09-06 Thread Alexei V. Alexandrov

Hello modperl,

  There is a module I have written for site management and I would like to extend
  it with support for plugins, and I don't know where to start. I would like
  it, upon startup, to scan a directory for plugins, then import functions from the
  found plugins, and also provide a list of available modules when calling a
  specific function. Can anyone help me solve this problem, or point me
  somewhere where I can read about it?
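One possible shape for this, sketched with invented names (MyApp::Plugins, MyApp::Plugin::*) and no error handling beyond die:

package MyApp::Plugins;    # namespace and layout invented for this sketch
use strict;

my %plugins;               # short name => package name

# Scan a directory for *.pm files at startup and load each one.  This
# assumes every plugin file declares a package under MyApp::Plugin::.
sub load_all {
    my ($dir) = @_;
    opendir PLUGINS, $dir or die "opendir $dir: $!";
    my @files = grep { /\.pm$/ } readdir PLUGINS;
    closedir PLUGINS;
    for my $file (@files) {
        (my $name = $file) =~ s/\.pm$//;
        require "$dir/$file";                    # compile the plugin
        $plugins{$name} = "MyApp::Plugin::$name";
    }
}

# List the plugins that were found.
sub available { return sort keys %plugins }

# Call a named function in a named plugin.
sub call {
    my ($name, $func, @args) = @_;
    my $pkg = $plugins{$name} or die "no such plugin: $name";
    no strict 'refs';
    return &{"${pkg}::${func}"}(@args);
}

1;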

Best regards,
Alexei V. Alexandrov   [AA4460, AVA32-RIPN, AA1829-RIPE]

*** Alexei V. Alexandrov -- www.elcomsoft.com  [EMAIL PROTECTED] ***
*** PGP Fingerprint:9F23 7153 51D4 FD8F  4E7F D4DF E0FA E400 ***






ErrorDocument problem

2000-09-06 Thread BeerBong

Hello all!

I have a two apache server model (front-end - simple, back-end - power)

I want to return a custom page on 404 status.
.asp, .pl files are passed to back-end via mod_rewrite on simple Apache (I'm
using Apache::ASP).

When I try to access
1. /not_existing_file - the cgi script on simple runs, works fine
2. /not_existing_file.asp - I get the standard not-found message, generated by
the browser!
although
3. /cgi-bin/error.pl - returns normal output generated by the power apache.

It seems that ErrorDocument for the power Apache doesn't work...
How can I fix this problem? :(


Part of httpd.conf
---
<IfDefine simple>
  ScriptAlias /cgi-bin/ /usr/web/cgi-bin/
  <Location /cgi-bin>
    SetHandler cgi-script
  </Location>
  ErrorDocument 404 /cgi-bin/error.cgi
</IfDefine>
<IfDefine power>
  Alias /cgi-bin /usr/web/cgi-bin
  <Location /cgi-bin>
    SetHandler perl-script
    PerlHandler Apache::Registry
    Options ExecCGI
    PerlSendHeader On
  </Location>
  ErrorDocument 404 /cgi-bin/error.pl (error.pl is a symbolic link to
error.cgi)
</IfDefine>
---



--
Sergey Polyakov - chief of WebZavod
http://www.webzavod.ru




Re: ErrorDocument problem

2000-09-06 Thread Nouguier

BeerBong wrote:

 Hello all!

 I have a two apache server model (front-end - simple, back-end - power)

 I want to return a custom page on 404 status.
 .asp, .pl files are passed to back-end via mod_rewrite on simple Apache (I'm
 using Apache::ASP).

 When I try to access
 1. /not_existing_file - the cgi script on simple runs, works fine
 2. /not_existing_file.asp - I get the standard not-found message, generated by
 the browser!
 although
 3. /cgi-bin/error.pl - returns normal output generated by the power apache.

 It seems that ErrorDocument for the power Apache doesn't work...
 How can I fix this problem? :(

 Part of httpd.conf
 ---
 <IfDefine simple>
   ScriptAlias /cgi-bin/ /usr/web/cgi-bin/
   <Location /cgi-bin>
     SetHandler cgi-script
   </Location>
   ErrorDocument 404 /cgi-bin/error.cgi
 </IfDefine>
 <IfDefine power>
   Alias /cgi-bin /usr/web/cgi-bin
   <Location /cgi-bin>
     SetHandler perl-script
     PerlHandler Apache::Registry
     Options ExecCGI
     PerlSendHeader On
   </Location>
   ErrorDocument 404 /cgi-bin/error.pl (error.pl is a symbolic link to
 error.cgi)
 </IfDefine>
 ---

 --
 Sergey Polyakov - chief of WebZavod
 http://www.webzavod.ru

Hi Sergey,
Perhaps you should add FollowSymLinks to your Options...





RE: ErrorDocument problem

2000-09-06 Thread BeerBong



 -Original Message-
 From: BeerBong [mailto:[EMAIL PROTECTED]]
 Sent: Wednesday, September 06, 2000 5:37 PM
 To: Nouguier
 Subject: RE: ErrorDocument problem
 
 
 Nope, I tried to copy error.cgi to error.pl - result the same :(
 --
 Sergey Polyakov - chief of WebZavod
 http://www.webzavod.ru
 
  -Original Message-
  From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]On 
  Behalf Of Nouguier
  Sent: Wednesday, September 06, 2000 5:28 PM
  To: BeerBong
  Cc: ModPerl
  Subject: Re: ErrorDocument problem
  
  
  BeerBong wrote:
  
   Hello all!
  
   I have a two apache server model (front-end - simple, 
 back-end - power)
  
   I want to return a custom page on 404 status.
   .asp, .pl files are passed to back-end via mod_rewrite on 
  simple Apache (I'm
   using Apache::ASP).
  
   When I try to access
   1. /not_existing_file - the cgi script on simple runs, works fine
   2. /not_existing_file.asp - I get the standard not-found message, 
  generated by
   the browser!
   although
   3. /cgi-bin/error.pl - returns normal output generated by 
 the power apache.
  
   It seems that ErrorDocument for the power Apache doesn't work...
   How can I fix this problem? :(
  
   Part of httpd.conf
   ---
   <IfDefine simple>
 ScriptAlias /cgi-bin/ /usr/web/cgi-bin/
 <Location /cgi-bin>
   SetHandler cgi-script
 </Location>
  ErrorDocument 404 /cgi-bin/error.cgi
   </IfDefine>
   <IfDefine power>
 Alias /cgi-bin /usr/web/cgi-bin
 <Location /cgi-bin>
   SetHandler perl-script
   PerlHandler Apache::Registry
   Options ExecCGI
   PerlSendHeader On
 </Location>
 ErrorDocument 404 /cgi-bin/error.pl (error.pl is a symbolic link to
   error.cgi)
   </IfDefine>
   ---
  
   --
   Sergey Polyakov - chief of WebZavod
   http://www.webzavod.ru
  
  Hi Sergey,
  Perhaps you should add FollowSymLinks to your Options...
  
  
  



upgrading mod_perl on production machine

2000-09-06 Thread Bill Moseley

Hi,

Some basic questions here:

I hope I didn't miss anything in the Guide at install.html and in
control.html, but I was looking for any suggestions on upgrading mod_perl
and Perl on a running production machine to limit the amount of down time.

Is it typical to just do a make install with mod_perl on a running
production server, then restart httpd?  Or do people tend to take down the
server, make install to update the Apache::* files, copy the httpd binary
and then restart?

And what about perl5.6?  Have most people been installing it on existing
5.00503, so that @INC also includes the site_perl/5.005 directories or
building a new 5.6 tree and using CPAN autobundle to move and update
modules into the new version?


Thanks,



Bill Moseley
mailto:[EMAIL PROTECTED]



IPC::Shareable problems

2000-09-06 Thread Steven Cotton

Hi,

I've been having some problems delete()'ing elements from a tied
IPC::Shareable hash. The example from the pod works fine (but that's not
running under mod_perl) so I'm wondering if there are any lifetime/scope
issues with using IPC::Shareable 0.51 under mod_perl 1.24. Has anyone had
any "Munged shared memory segment (size exceeded?)" errors when trying to
access (I'm using `exists()') a previously deleted hash element? I see no
examples in the Eagle of deleting tied and shared hash elements (only
under Apache::Registry), and Deja and various newsgroups and web searches
haven't turned up anything. I'm running Apache 1.3.12 and Perl 5.6.0.

Thanks,

-- 
Steven Cotton
[EMAIL PROTECTED]




Re: Embedded Perl/Resource Limits

2000-09-06 Thread Gerald Richter


-
Gerald Richterecos electronic communication services gmbh
Internetconnect * Webserver/-design/-datenbanken * Consulting

Post:   Tulpenstrasse 5 D-55276 Dienheim b. Mainz
E-Mail: [EMAIL PROTECTED] Voice:+49 6133 925131
WWW:http://www.ecos.de  Fax:  +49 6133 925152
-

- Original Message -
From: "Bill Mustdie" [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Wednesday, September 06, 2000 1:40 AM
Subject: Embedded Perl/Resource Limits



 Hi,

  I have a question pertaining to Embedded Perl.
 (But it may be Apache or mod_perl in nature)

 From the example upload script on the Apache Embedded
 Perl page I am implementing a small file upload system
 however I have noticed files cut out when uploading at
 around 1 meg. (Reports "Network Error" with no message
 logged to the log files - anything under a meg works
 no problems)


Maybe you have set (or compiled in) a LimitRequestBody:

http://www.apache.org/docs/mod/core.html#limitrequestbody

Also, this creates a temp file, so it may be a limit set by your OS (for the
user Apache is running as) on the maximum file size.
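If a compiled-in or configured limit is the cause, the fix is a single directive in httpd.conf; the 10 MB figure below is only an example (0 means unlimited):

# allow request bodies (e.g. file uploads) of up to 10 MB
LimitRequestBody 10485760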

Gerald


 Is this an Apache or mod_perl limitation?

 And whats the best way of getting around it? Is there
 a simple Apache directive i can put in the config file
 or is there a hard coded patch required?

 thanks in advance!

 Bill

 ps Yes i do know of other methods such as a meta-ftp
 client for files this large but this violates our
 firewall policies etc etc.. :)


 -

 [$ if !defined $fdat{ImageName} $]<br>

 <FORM METHOD="POST" ENCTYPE="multipart/form-data">
   <INPUT TYPE="FILE" NAME="ImageName">
   <INPUT TYPE="SUBMIT" NAME="Submit"
 VALUE="Upload file">
 </FORM>

 [$else$]<p>

  <br>
 [-  open FILE, "> /tmp/file.$$";
 print FILE $buffer while
 read($fdat{ImageName}, $buffer, 32768);
 close FILE;

 -]
 Your file has been saved to [+ "/tmp/file.$$" +]<br>


 __
 Do You Yahoo!?
 Yahoo! Mail - Free email you can access from anywhere!
 http://mail.yahoo.com/






Re: IPC::Shareable problems

2000-09-06 Thread Nouguier

Steven Cotton wrote:

 Hi,

 I've been having some problems delete()'ing elements from a tied
 IPC::Shareable hash. The example from the pod works fine (but that's not
 running under mod_perl) so I'm wondering if there are any lifetime/scope
 issues with using IPC::Shareable 0.51 under mod_perl 1.24. Has anyone had
 any "Munged shared memory segment (size exceeded?)" errors when trying to
 access (I'm using `exists()') a previously deleted hash element? I see no
 examples in the Eagle of deleting tied and shared hash elements (only
 under Apache::Registry), and Deja and various newsgroups and web searches
 haven't turned up anything. I'm running Apache 1.3.12 and Perl 5.6.0.

 Thanks,

 --
 Steven Cotton
 [EMAIL PROTECTED]

Hi,
 you should try IPC::ShareLite, which provides the same things but seems
better maintained...
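A sketch of how that might look (the key and function names are invented); since IPC::ShareLite stores a single string, a hash has to be serialized, e.g. with Storable, on every store/fetch:

use IPC::ShareLite ();
use Storable qw(freeze thaw);

my $share = IPC::ShareLite->new(
    -key     => 1971,
    -create  => 'yes',
    -destroy => 'no',
) or die "IPC::ShareLite: $!";

sub save_hash {
    my ($href) = @_;
    $share->lock;
    $share->store( freeze($href) );
    $share->unlock;
}

sub load_hash {
    my $frozen = $share->fetch;
    return $frozen ? thaw($frozen) : {};
}

# deleting an element: load, delete, save again
my $h = load_hash();
delete $h->{stale_key};
save_hash($h);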





Re: HTML Template Comparison Sheet ETA

2000-09-06 Thread Andy Wardley

On Sep 4,  2:46pm, Sam Tregar wrote:
  [% FOREACH thing = list %]
a href="[% thing.url %]"b[% thing.name %]/b/a
  [% END %]

 That isn't really much better, in my opinion.  It's still too much of a
 departure from the HTML around it.

That's the point.  It's not HTML markup so you should make it look
distinctive, IMHO.

Contrast the above to HTML::Template's
   <TMPL_LOOP list>
  <a href="<TMPL_VAR url>"><b><TMPL_VAR name></b></A>
   </TMPL_LOOP>

Barring a little syntax, over which we can agree to differ, that's the
same thing as the simpler example Perrin posted:

[% FOREACH list %]
   a href="[% url %]"b[% name %]/a/a
[% END %]

 With a little education an HTML designer can learn to manipulate the
 template syntax.  You'll have to teach them to program before they can
 deal with a full "foreach" no matter how you dress it up.

Nope, I don't agree, FOREACH and TMPL_LOOP are semantically identical
in these examples.

A



-- 
Andy Wardley [EMAIL PROTECTED]   Signature regenerating.  Please remain seated.
 [EMAIL PROTECTED]   For a good time: http://www.kfs.org/~abw/



Re: Poor man's connection pooling

2000-09-06 Thread Barrie Slaymaker

Michael Peppler wrote:
 
 Based on preliminary tests I was able to use a 1 in 10 ratio of
 database handles per httpd child processes, which, on a large site
 would cut down on the number of connections that the database server
 needs to handle.

I'd be interested to see how this compares with Apache::DBI performance with
MaxRequestsPerChild = 100.  I suspect it's negligible unless you're using
a database like Oracle (with lots of connection establishment overhead) over
a *slow* link.

I don't think there's any significant memory savings here, is there?  Things
you think might share physical memory probably don't after a SQL statement
or two and some perl code gets run.  Oracle certainly changes the RAM in
their connection and cursor handles, for instance.

- Barrie



Re: Poor man's connection pooling

2000-09-06 Thread Perrin Harkins

On Wed, 6 Sep 2000, Jay Strauss wrote:

 Being a database guy but new to Mod_Perl (disclaimer: If these aspects have
 already been implemented and/or talked about please excuse me).

Before going down this road again, I suggest reading the definitive work
on the subject, which is a post from Jeffrey Baker:
http:[EMAIL PROTECTED]

- Perrin




Re: Poor man's connection pooling

2000-09-06 Thread Barrie Slaymaker

Michael Peppler wrote:
 
 The back-end is Sybase. The actual connect time isn't the issue here
 (for me.) It's the sheer number of connections, and the potential
 issue with the number of sockets in CLOSE_WAIT or TIME_WAIT state on
 the database server. We're looking at a farm of 40 front-end servers,
 each running ~150 modperl procs. If each of the modperl procs opens
 one connection that's 6000 connections on the database side.

Thanks, that makes more sense.  I was thrown off by the stated advantage being
that connections don't go down when the apache child dies.  I thought it was
a performance-enhancer.

 I'm not worried about RAM usage on the web servers.

Cool, thanks.

- Barrie



Re: upgrading mod_perl on production machine

2000-09-06 Thread Stas Bekman

On Wed, 6 Sep 2000, Bill Moseley wrote:

 Hi,
 
 Some basic questions here:
 
 I hope I didn't miss anything in the Guide at install.html and in
 control.html, but I was looking for any suggestions on upgrading mod_perl
 and Perl on a running production machine to limit the amount of down time.
 
 Is it typical to just do a make install with mod_perl on a running
 production server, then restart httpd?  Or do people tend to take down the
 server, make install to update the Apache::* files, copy the httpd binary
 and then restart?
 
 And what about perl5.6?  Have most people been installing it on existing
 5.00503, so that @INC also includes the site_perl/5.005 directories or
 building a new 5.6 tree and using CPAN autobundle to move and update
 modules into the new version?

I wouldn't do anything on the live server, but rather install an RPM or a similar
package tested on another machine. It takes a few seconds to install the new
stuff (and remove the old) and a few more seconds to restart the server -- you
are done.

Another approach is to use CVS, if you have a very fast connection, so you
can ensure that all the data will be updated very fast.


_
Stas Bekman  JAm_pH --   Just Another mod_perl Hacker
http://stason.org/   mod_perl Guide  http://perl.apache.org/guide 
mailto:[EMAIL PROTECTED]   http://apachetoday.com http://jazzvalley.com
http://singlesheaven.com http://perlmonth.com   perl.org   apache.org





PUT handling (somewhat off-topic)

2000-09-06 Thread Mark-Jason Dominus


I apologize in advance, because this isn't directly related to
mod_perl.  But I really wasn't sure where to ask.  Posting to
comp.infosystems.www.servers.unix didn't produce any result.  There
doesn't seem to be a mailing list for discussion of Apache generally.

I am trying to get apache to invoke a CGI program in response to PUT
requests.  This is a FAQ.  The FAQ instructions are very clear and
straightforward and don't work for me.

I have the following in the <VirtualHost> section in my httpd.conf:

   Script PUT /cgi-bin/Put

/cgi-bin is ScriptAliased correctly.  /cgi-bin/Put has permissions set
properly and runs correctly from the shell and also when I send
Apache a GET request for it.

When I send Apache a PUT request using 'telnet', the request is
received.  However, my PUT script does not run.  Instead, Apache
fabricates a 200 response that looks like this:

HTTP/1.1 200 OK
Date: Tue, 05 Sep 2000 08:57:12 GMT
Server: Apache/1.3.6 (Unix) mod_perl/1.19
Connection: close
Content-Type: text/html

the body of the response is empty.

I know that /cgi-bin/Put isn't being run because it would have
produced a 206 response, not a 200 response,  because it would have produced
a nonempty body, and because it would have written a log to
/tmp/Put.err, which it didn't do.

The access log entry looks like this:

209.152.205.5 - - [05/Sep/2000:04:57:12 -0400] "PUT /~mjd/p/index.html 
HTTP/1.0" 200 0 "-" "-"

There is no entry in the error log.

I get the same behavior when I put the 'Script' directive into a
<Directory> section and send a PUT request for a file in the
directory.  

I don't want Apache to respond to the PUT request itself.  I want it
to run /cgi-bin/Put and have /cgi-bin/Put generate the response.  The
on-line manual and the FAQ all say that the

   Script PUT /cgi-bin/Put

directive that I have should do that, but it isn't doing it.  Does
anyone have any suggestions about what might be wrong, or about a more
appropriate forum in which to ask?





Redirect Load Balancing Can't Connect

2000-09-06 Thread Kent, Mr. John

Adam,

Thanks for responding.  Yes, that seems to be the problem exactly.  Squid
says it cannot access the URL.  But clicking on the link in the error page
generated by Squid works fine?

Checking the load of the other server, 199.9.2.16, shows that it never sees
the request.

To see this problem in action click on 

http://mako.nrlmry.navy.mil/tc-bin/tc_home a couple of times or so.

What's in the logs?

For a failed request access log shows

192.160.159.24 - - [06/Sep/2000:10:22:40 -0700] "GET
http://127.0.0.1:/tc-bin/tc_home HTTP/1.1" 503 1079 TCP_MISS:DIRECT
(Here I assume it is being redirected to 199.9.2.16)

The only difference is that when it works (connecting to 127.0.0.1) it says
TCP_IMS_HIT:NONE

store log:

968260960.303 RELEASE   503-1-1-1 unknown
-1/1019 GET http://127.0.0.1:/tc-bin/tc_home

which doesn't seem too helpful

When it does work (again assume connecting to 127.0.0.1) store log looks
like:
968261980.789 RELEASE   200 968261980-1 968261980 text/html
-1/1205 GET http://127.0.0.1:/tc-bin/tc_home
968261981.518 RELEASE   200 968261981-1 968261981 text/html
-1/4265 GET http://127.0.0.1:/tc-bin/tc_list_storms?
968261981.565 RELEASE   304 968189274 968120284-1 image/gif
-1/0 GET http://127.0.0.1:/my_icons/ball.red.gif
968261982.211 RELEASE   304 968188361 966296432-1
application/x-javascript -1/0 GET http://127.0.0.1:/java_scripts/menu.js
968261983.587 RELEASE   200 968261981-1 968262282 text/html
-1/31859 GET http://127.0.0.1:/tc-bin/tc_display?
968261983.587 RELEASE   304 968188798 968171103-1 image/gif
-1/0 GET http://127.0.0.1:/tc_thumbs/smal922000.00090512.gif
968261983.661 RELEASE   200 968261983-1 968262103 image/gif
-1/264 GET http://127.0.0.1:/cgi-bin/Count4.cgi?
968261983.684 RELEASE   304 968261747 968261695-1 image/jpeg
-1/0 GET
http://127.0.0.1:/tc_thumbs/sm2906.1715.goes-8.vis.x.INVEST.92L.jpg
968261983.704 RELEASE   200 968261983-1 968262103 image/gif
-1/277 GET http://127.0.0.1:/cgi-bin/Count4.cgi?
968261983.851 RELEASE   304 968188809 933271467-1 image/gif
-1/0 GET http://127.0.0.1:/images/hbar.gif



And nothing new in the cache log

When it fails the page generated by Squid says:


ERROR
The requested URL could not be retrieved




While trying to retrieve the URL: http://199.9.2.16:/tc-bin/tc_home 

The following error was encountered: 

Connection Failed 
The system returned: 

(126) Cannot assign requested address
The remote host or network may be down. Please try the request again. 

Your cache administrator is [EMAIL PROTECTED] 





Generated Wed, 06 Sep 2000 17:25:53 GMT by mako.nrlmry.navy.mil
(Squid/2.3.STABLE4

Clicking on the URL that Squid says it couldn't connect to works!!!?

So I really don't know how to proceed at this point.  I wish there was
something
that would show me what exactly Squid tried to do which generated this
error.
The exact call it attempted to make. 



Thanks,
John

Here is my redirect.pl:
#!/users/webuser/perl/bin/perl
$| = 1;

# Testing switching between two servers
my(@HEAVY) = qw{199.9.2.16 127.0.0.1};

my($i,$j) = 0;

while(<>){
   
if ($_ =~ /sat-bin|tc-bin/){

my($value) = $HEAVY[rand(@HEAVY)];
# Rotate servers
s%127\.0\.0\.1(:\d+)?%$value%;

# Specify heavy mod_perl server for cgi scripts
s%(:\d+)?/sat-bin%:/sat-bin%i && next;
s%(\:\d+)?/tc-bin%:/tc-bin%i && next;
}

} continue {
print LOG "$$  $_\n";

print;    # falls through to apache_light
}

###
Here is a sample of the before and after effects of the redirector
for a failed request

URL Before redirector
25154 
http://127.0.0.1:/tc-bin/tc_list_storms?DISPLAY=ActivePROD=track_vi
sTYPE=ssmiYEAR=2000MO=SepACTIVES=00-EPAC-14E.LANE,00-WPAC-22W.SAOMAI,00-
WPAC
-23W.WUKONG,00-WPAC-24W.BOPHA,00-EPAC-90E.INVEST,00-ATL-92L.INVEST,
192.160.159.
24/- - GET

value = 199.9.2.16

URL after redirector, its working as intended.
25154 
http://199.9.2.16:/tc-bin/tc_list_storms?DISPLAY=ActivePROD=track_v
isTYPE=ssmiYEAR=2000MO=SepACTIVES=00-EPAC-14E.LANE,00-WPAC-22W.SAOMAI,00
-WPA
C-23W.WUKONG,00-WPAC-24W.BOPHA,00-EPAC-90E.INVEST,00-ATL-92L.INVEST,
192.160.159
.24/- - GET



Perhaps your desktop machine can access the URL, but the squid server
can't ? What do the squid logs report ?

 - Adam



Re: Poor man's connection pooling

2000-09-06 Thread Jeff Horn

One thing I've looked at doing is adding an LRU mechanism (or some such
discipline) to the existing Apache::DBI.

I've already modified Apache::DBI to use 'reauthenticate' if a DBD
implements it so that only 1 connection needs to be maintained per child.
This greatly improves performance with a 1:1 ratio between Apache children
and database connections.

However, some (most) DBDs don't implement this method, and in fact some
databases cannot even do a 'reauthenticate' on an existing connection.
Also, there are situations where performance can be greatly improved by a
judicious choice of the # of connections maintained per child.  An example
of this is where one user is used to verify authentication info stored in a
database and then if authenticated a new user id is used to actually do the
work.  In this case keeping 2 connections per Apache child, pins the common
"authenticating" connection and rotates the other connection, while still
keeping the total number of connections quite low.

There are other disciplines besides LRU which might work well.  Keeping a
"Hot Set" of connections by keeping a count of how often they're accessed
might a useful alternative.  I'm sure there's all sorts of interesting
disciplines that could be dreamed up!

I'm looking at potential changes to Apache::DBI which would allow a choice
of discipline (LRU, HotSet, etc.) along with whether or not to use
'reauthenticate' DBD function all configurable from apache config files.
I'd be interested in any input on this course of action!

-- Jeff Horn


 - Original Message -
 From: "Michael Peppler" [EMAIL PROTECTED]
 To: "Barrie Slaymaker" [EMAIL PROTECTED]
 Cc: "Michael Peppler" [EMAIL PROTECTED]; [EMAIL PROTECTED];
 [EMAIL PROTECTED]
 Sent: Wednesday, September 06, 2000 12:24 PM
 Subject: Re: Poor man's connection pooling


  Barrie Slaymaker writes:
Michael Peppler wrote:

 Based on preliminary tests I was able to use a 1 in 10 ratio of
 database handles per httpd child processes, which, on a large site
 would cut down on the number of connections that the database
server
 needs to handle.
   
I'd be interested to see how this compares with Apache::DBI
performance
 with
  MaxRequestsPerChild = 100.  I suspect it's negligible unless you're
 using
a database like Oracle (with lots of connection establishment
overhead)
 over
a *slow* link.
 
  The back-end is Sybase. The actual connect time isn't the issue here
  (for me.) It's the sheer number of connections, and the potential
  issue with the number of sockets in CLOSE_WAIT or TIME_WAIT state on
  the database server. We're looking at a farm of 40 front-end servers,
  each running ~150 modperl procs. If each of the modperl procs opens
  one connection that's 6000 connections on the database side.
 
  Sybase can handle this, but I'd rather use a lower number, hence the
  pooling.
 
I don't think there's any significant memory savings here, is there?
 Things
you think might share physical memory probably don't after a SQL
 statement
or two and some perl code gets run.  Oracle certainly changes the RAM
 in
their connection and cursor handles, for instance.
 
  I'm not worried about RAM usage on the web servers.
 
  Michael  --
  Michael Peppler -||-  Data Migrations Inc.
  [EMAIL PROTECTED]-||-  http://www.mbay.net/~mpeppler
  Int. Sybase User Group  -||-  http://www.isug.com
  Sybase on Linux mailing list: [EMAIL PROTECTED]
 





Re: Poor man's connection pooling

2000-09-06 Thread Stas Bekman

On Wed, 6 Sep 2000, Perrin Harkins wrote:

 On Wed, 6 Sep 2000, Stas Bekman wrote:
  Just a small correction: 
  
  You can cause pages to become unshared in perl just by writing a variable,
 ^^^
  so it's almost certain to happen sooner or later.
  
  Or for example calling pos() which modifies the variable internals:
  http://perl.apache.org/guide/performance.html#Are_My_Variables_Shared_
 
 If you read a variable in a way that causes it to be converted between a
 numerical value and a string and it hasn't happened before, that will
 change the internal structure and unshare the memory on one or more
 pages.  I'm no perlguts hacker, but I think this is correct.

You are right. I've looked into it with the help of all mighty
Devel::Peek. So what actually happens is this:

Consider this script:
  
  use Devel::Peek;
  my $numerical = 10;
  my $string= "10";
  $|=1;
  
  dump_numerical();
  read_numerical_as_numerical();
  dump_numerical();
  read_numerical_as_string();
  dump_numerical();
  
  dump_string();
  read_string_as_numerical();
  dump_string();
  read_string_as_string();
  dump_string();
  
  sub read_numerical_as_numerical {
  print "\nReading numerical as numerical: ",
  int($numerical), "\n";
  }
  sub read_numerical_as_string {
  print "\nReading numerical as string: ",
  $numerical, "\n";
  }
  sub read_string_as_numerical {
  print "\nReading string as numerical: ",
  int($string), "\n";
  }
  sub read_string_as_string {
  print "\nReading string as string: ",
  $string, "\n";
  }
  sub dump_numerical {
  print "\nDumping a numerical variable\n";
  Dump($numerical);
  }
  sub dump_string {
  print "\nDumping a string variable\n";
  Dump($string);
  }

When running it:

  Dumping a numerical variable
  SV = IV(0x80e74c0) at 0x80e482c
REFCNT = 4
FLAGS = (PADBUSY,PADMY,IOK,pIOK)
IV = 10
  
  Reading numerical as numerical: 10
  
  Dumping a numerical variable
  SV = PVNV(0x810f960) at 0x80e482c
REFCNT = 4
FLAGS = (PADBUSY,PADMY,IOK,NOK,pIOK,pNOK)
IV = 10
NV = 10
PV = 0
  
  Reading numerical as string: 10
  
  Dumping a numerical variable
  SV = PVNV(0x810f960) at 0x80e482c
REFCNT = 4
FLAGS = (PADBUSY,PADMY,IOK,NOK,POK,pIOK,pNOK,pPOK)
IV = 10
NV = 10
PV = 0x80e78b0 "10"\0
CUR = 2
LEN = 28
  
  Dumping a string variable
  SV = PV(0x80cb87c) at 0x80e8190
REFCNT = 4
FLAGS = (PADBUSY,PADMY,POK,pPOK)
PV = 0x810f518 "10"\0
CUR = 2
LEN = 3
  
  Reading string as numerical: 10
  
  Dumping a string variable
  SV = PVNV(0x80e78d0) at 0x80e8190
REFCNT = 4
FLAGS = (PADBUSY,PADMY,NOK,POK,pNOK,pPOK)
IV = 0
NV = 10
PV = 0x810f518 "10"\0
CUR = 2
LEN = 3
  
  Reading string as string: 10
  
  Dumping a string variable
  SV = PVNV(0x80e78d0) at 0x80e8190
REFCNT = 4
FLAGS = (PADBUSY,PADMY,NOK,POK,pNOK,pPOK)
IV = 0
NV = 10
PV = 0x810f518 "10"\0
CUR = 2
LEN = 3

So you can clearly see that if you want the data to be shared (unless
some other variable happens to change its value, and thus causes the
whole page to get dirty) and there is a chance that the same variable
will be accessed both as a string and as a numerical value, you have to
access this variable in both ways, as in the above example, before
the fork happens.

_
Stas Bekman  JAm_pH --   Just Another mod_perl Hacker
http://stason.org/   mod_perl Guide  http://perl.apache.org/guide 
mailto:[EMAIL PROTECTED]   http://apachetoday.com http://jazzvalley.com
http://singlesheaven.com http://perlmonth.com   perl.org   apache.org





Re: mod_perl security on a shared web server

2000-09-06 Thread Félix C.Courtemanche

In fact, I would like to see something similar to what you sent, but that
would only apply to mod_perl (or any other way to execute perl scripts in
apache) since I am also using other languages, databases, etc. that would be
somewhat harder to install with such a compartmentalization.

I am currently taking a look at the Safe Perl module to see if it can do the
job for me.
I had someone mention resource restricting modules, especially for the
amount of cpu, ram and time of execution used.  Can anyone direct me
specifically to any of these (or all of them)?  I can't seem to find one
that is complete and working well.

Please keep in mind that security and optimization are the top 2 priorities
in this adventure :)
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
Félix C.Courtemanche . Head Designer
Co-Administrator . Can-Host Networks
http://www.can-host.com
[EMAIL PROTECTED]
-Original Message-
From: Jonathan Leto [EMAIL PROTECTED]
To: Félix C.Courtemanche [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED] [EMAIL PROTECTED]
Date: September 6, 2000 03:05
Subject: Re: mod_perl security on a shared web server



I would suggest www.freevsd.org, because what you need is complete
compartmentalization.



 Félix C.Courtemanche ([EMAIL PROTECTED]) was saying:

 Hello,

 I couldn't find any occurrence of this question in the archives, but if it
 does exist, please forward me to it.

 I have been working on a set of Administration Tools for commercial web
 hosting companies for quite some time.  Lately I have been trying to
figure
 out the MOST secure way to host multiple accounts on the same server,
with
 mod_perl enabled AS FAST AS POSSIBLE.

 In the best world, I would have the possibility of:
 - Restricting the opened files by any .pl script to the user's base
 directory.
 - Allowing custom shell commands or not
 - Setting a maximum execution time for a script

 The first directive would be used to prevent anyone from reading the
source
 of another program, which would allow someone to grab the sensitive data
 stored in configuration files, such as Database Passwords, etc.  It is
the
 MOST important of all and I really must find a solution.  I previously
saw
 some perl wrapper that would only allow files owned by the script's owner
to
 be read.  However, that wrapper greatly reduced the execution speed of
.pl
 and it was not that effective.  Any suggestions?

 The second directive would allow me to specify whether or not a user can
run
 commands that would be passed as shell OR specify what paths are
available
 (only /usr/bin for example)

 Finally, the third directive would allow me to kill any script running
for
 too long or using too much CPU.

 I understand that there is probably no tool to do all of it, but if I can
 gather the tools to make it as effective as possible, it would be really
 useful for me and others.

 Please don't tell me to monitor the user's scripts, since that is almost
 impossible to do when you have more than 10 sites to monitor, which will
 happen quickly :)

 Any other tips and tricks to improve the security of mod_perl is greatly
 appreciated as well.

 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
 Félix C.Courtemanche . Head Designer
 Co-Administrator . Can-Host Networks
 http://www.can-host.com
 [EMAIL PROTECTED]



--
[EMAIL PROTECTED]
"With pain comes clarity."






RE: mod_perl security on a shared web server

2000-09-06 Thread Christian Gilmore

Felix,

There's not much available that I know of that is efficient and does
per-resource throttling based upon CPU, RAM, and time. I looked around
for such things about 8 months ago.

I instead decided that, for my needs, limiting simultaneous client access
to resource hogs was good enough. I wrote mod_throttle_access to serve
this purpose. It is available through the Apache Module Registry or
directly here:

http://www.fremen.org/apache/

Regards,
Christian

From: Félix C.Courtemanche [mailto:[EMAIL PROTECTED]]
 I had someone mention resource restricting modules,
 especially for the
 amount of cpu, ram and time of execution used.  Anyone can direct me
 specifically to any of these (or all of them)?  I can't seem
 to find one
 that is completed and working well.





@INC variable

2000-09-06 Thread Roee Rubin

Hello,

I have written a sample package (from a book). The
package uses an Apache package - Constants.pm - whose
path is not included in the @INC variable. I would
like to add the path to the variable by default and
not at runtime as people have suggested.

Where are the default values of @INC stored ??
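The compiled-in defaults come from how perl itself was built (perl -V prints them) and are normally left alone; the usual way to get an extra path in "by default" for the server is a startup file or the environment. A small sketch with placeholder paths:

# in startup.pl, pulled in from httpd.conf with:  PerlRequire conf/startup.pl
use lib '/usr/local/apache/lib/perl';

# or, for command-line checks such as "perl -c Hello.pm":
#   PERL5LIB=/usr/local/apache/lib/perl perl -c Hello.pm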

Thanks in advance.
[EMAIL PROTECTED]

=
The error I received ...


Can't locate Apache/Constants.pm in @INC (@INC
contains: (...)



hello.pm
==

package Apache::Hello;
use strict;
use Apache::Constants qw(:common);

sub handler {

my $r = shift;
$r->content_type('text/html');
$r->send_http_header;
my $host = $r->get_remote_host;
$r->print(<<END);
<HTML>
<HEAD>
<TITLE>Hello There</TITLE>
</HEAD>
<BODY>
<H1>Hello $host</H1>
Testing 123
</BODY>
</HTML>
END
return OK;
}

1;

__
Do You Yahoo!?
Yahoo! Mail - Free email you can access from anywhere!
http://mail.yahoo.com/



Re: PUT handling (somewhat off-topic)

2000-09-06 Thread Steve van der Burg

When I send Apache a PUT request using 'telnet', the request is
received.  However, my PUT script does not run.  Instead, Apache
fabricates a 200 response that looks like this:

I just added
   Script PUT /cgi-bin/put-handler
to my Apache config (apache 1.3.12 and mod_perl 1.24 on Solaris 8 SPARC),
copied http://www.apacheweek.com/issues/put1 to put-handler, added
some more logging code, and tried uploading something from
Netscape Composer.

It worked like a charm, the first time, and the request was handled by
the script (the script's own log says what I expected it to say) which
means I've been of almost no help!

If it hadn't worked, I probably would've trussed Apache while I made the
request to see what was going on.

...Steve


-- 
Steve van der Burg
Information Services
London Health Sciences Centre
(519) 685-8300 ext 35559
[EMAIL PROTECTED]




PUT handling (somewhat off-topic)

2000-09-06 Thread mjd-perl-modperl


I apologize in advance, because this isn't directly related to
mod_perl.  But I really wasn't sure where to ask.  Posting to
comp.infosystems.www.servers.unix didn't produce any result.  There
doesn't seem to be a mailing list for discussion of Apache generally.

I am trying to get apache to invoke a CGI program in response to PUT
requests.  This is a FAQ.  The FAQ instructions are very clear and
straightforward and don't work for me.

I have the following in the <VirtualHost> section in my httpd.conf:

   Script PUT /cgi-bin/Put

/cgi-bin is ScriptAliased correctly.  /cgi-bin/Put has permissions set
properly and runs correctly from the shell and also when I send
Apache a GET request for it.

When I send Apache a PUT request using 'telnet', the request is
received.  However, my PUT script does not run.  Instead, Apache
fabricates a 200 response that looks like this:

HTTP/1.1 200 OK
Date: Tue, 05 Sep 2000 08:57:12 GMT
Server: Apache/1.3.6 (Unix) mod_perl/1.19
Connection: close
Content-Type: text/html

the body of the response is empty.

I know that /cgi-bin/Put isn't being run because it would have
produced a 206 response, not a 200 response,  because it would have produced
a nonempty body, and because it would have written a log to
/tmp/Put.err, which it didn't do.

The access log entry looks like this:

209.152.205.5 - - [05/Sep/2000:04:57:12 -0400] "PUT /~mjd/p/index.html 
HTTP/1.0" 200 0 "-" "-"

There is no entry in the error log.

I get the same behavior when I put the 'Script' directive into a
<Directory> section and send a PUT request for a file in the
directory.  

I don't want Apache to respond to the PUT request itself.  I want it
to run /cgi-bin/Put and have /cgi-bin/Put generate the response.  The
on-line manual and the FAQ all say that the

   Script PUT /cgi-bin/Put

directive that I have should do that, but it isn't doing it.  Does
anyone have any suggestions about what might be wrong, or about a more
appropriate forum in which to ask?






Re: PUT handling (somewhat off-topic)

2000-09-06 Thread Mark-Jason Dominus


  If it hadn't worked, I probably would've trussed Apache while I made the
  request to see what was going on.
 
 I guess I'll try that, but I'm not expecting much.

That was the right thing to do.  The problem became apparent right
away: I had another handler installed for a parent directory of the
one I was trying to enable 'PUT' for.

Thanks very much.  Also thanks to Frank Wiles, who pointed my
attention at the relevant part of the manual.




beginner mod_perl error

2000-09-06 Thread Roee Rubin

I have been able to correct the @INC path issue and
have now run into the following error that is
displayed in the error_log:


null: Undefined subroutine Apache::Hello::handler
called


Any help will be appreciated.


[EMAIL PROTECTED]

Hello,

I have written a sample package (from a book). The
package uses an Apache package - Constants.pm - whose
path is not included in the @INC variable. I would
like to add the path to the variable by default and
not at runtime as people have suggested.

Where are the default values of @INC stored ??

Thanks in advance.
[EMAIL PROTECTED]

=
The error I received ...


Can't locate Apache/Constants.pm in @INC (@INC
contains: (...)



hello.pm
==

package Apache::Hello;
use strict;
use Apache::Constants qw(:common);

sub handler {

my $r = shift;
$r->content_type('text/html');
$r->send_http_header;
my $host = $r->get_remote_host;
$r->print(<<END);
<HTML>
<HEAD>
<TITLE>Hello There</TITLE>
</HEAD>
<BODY>
<H1>Hello $host</H1>
Testing 123
</BODY>
</HTML>
END
return OK;
}

1;

__
Do You Yahoo!?
Yahoo! Mail - Free email you can access from anywhere!
http://mail.yahoo.com/



[OT] FYI: RSA Released Into The Public Domain

2000-09-06 Thread Stas Bekman


This info is relevant to those mod_perl users who run mod_ssl and
equivalent products in USA.

The RSA algorithm was released into the public domain today (September
6th, 2000). For more info see:

http://www.rsasecurity.com/news/pr/000906-1.html
http://www.rsasecurity.com/developers/total-solution/faq.html

As of this moment the above site is slashdotted (i.e. unreachable), so I
guess you want to try later :)

P.S. Please refrain from creating a thread from this post, be patient and
read the info at the above links and more info at ./ :
http://slashdot.org/articles/00/09/06/1252204.shtml 
-- Thanks a lot!

_
Stas Bekman  JAm_pH --   Just Another mod_perl Hacker
http://stason.org/   mod_perl Guide  http://perl.apache.org/guide 
mailto:[EMAIL PROTECTED]   http://apachetoday.com http://jazzvalley.com
http://singlesheaven.com http://perlmonth.com   perl.org   apache.org





Help for installation....

2000-09-06 Thread Derrick

Dear all,
Does anybody know how to fix the following error?
I am sure I already have the license file directive in the configuration
file.  Everything runs fine except when I was trying to install this
mod_perl.  I am using FreeBSD 4 and Stronghold.  I got the following error
when I ran "make test" after "make" for mod_perl.



Skip blib/arch/auto/Apache/include/include/compat.h (unchanged)
/usr/local/stronghold/src/httpsd -f `pwd`/t/conf/httpd.conf -X -d `pwd`/t 
httpd listening on port 8529
will write error_log to: t/logs/error_log
LICENSE: No StrongholdLicenseFile directive
Add a StrongholdLicenseFile directive into your configuration file
giving the file containing your license block.
letting apache warm up...\c
To get a license block, see http://www.int.c2.net/stronghold/lbfail
done
/usr/bin/perl t/TEST 0
still waiting for server to warm up...not ok
server failed to start! (please examine t/logs/error_log) at t/TEST line 95.
*** Error code 61

Stop in /usr/derrick/modperl.

===
Thanks,

Derrick




Re: Help for installation....

2000-09-06 Thread Pramod Sokke

Derrick,

We've all had the same problem with running 'make test' under Stronghold.
All my queries to the Stronghold folk about this went unanswered.
Carl Hansen detailed for me the steps to go through for a successful
installation and advised not to run 'make test'. I followed exactly that and
everything's working fine for me.

Here's what I followed:

install stronghold via their script into /export/home/strongholdtest
then
cd /export/home
zcat mod_perl-1.24.tar.gz | tar xvf -
cd mod_perl-1.24
perl Makefile.PL APACHE_SRC=../strongholdtest/src/ DO_HTTPD=1 USE_APACI=0
EVERYTHING=1
make

Note well USE_APACI=0, i.e. don't use it. Stronghold seems to be set up to
build using the older system. You may have to go into ../strongholdtest/src/
and make sure Configuration is the way you want it.

The other complication is that Stronghold comes with an installation of perl
in its own directories. I didn't want to use theirs. Look in bin. They have a
wrapper script 'shperl' which just points to their version. I edited it to
point to the right version. That may have something to do with the
'make test' problem.
I suggest you do 'httpd -l' to see that the modules you want are
indeed compiled in. Then I just skipped 'make test', moved the httpd to
the right directory, and just started using it.  If it works, that is a
sufficient test.

You could debug their test script, but it is a waste of time.


Hope this helps!
-Pramod



- Original Message -
From: Derrick [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Wednesday, September 06, 2000 2:28 PM
Subject: Help for installation


 Dear all,
 Does anybody know how to fix the following error?
 I am sure I already have the license file directive in the configuration
 file.  Everything runs fine except when I was trying to install this
 mod_perl.  I am using FreeBSD 4 and Stronghold.  I got the following error
 when I ran "make test" after "make" for mod_perl.



 
 Skip blib/arch/auto/Apache/include/include/compat.h (unchanged)
 /usr/local/stronghold/src/httpsd -f `pwd`/t/conf/httpd.conf -X -d `pwd`/t

 httpd listening on port 8529
 will write error_log to: t/logs/error_log
 LICENSE: No StrongholdLicenseFile directive
 Add a StrongholdLicenseFile directive into your configuration file
 giving the file containing your license block.
 letting apache warm up...\c
 To get a license block, see http://www.int.c2.net/stronghold/lbfail
 done
 /usr/bin/perl t/TEST 0
 still waiting for server to warm up...not ok
 server failed to start! (please examine t/logs/error_log) at t/TEST line
95.
 *** Error code 61

 Stop in /usr/derrick/modperl.


 ===
 Thanks,

 Derrick





Re: Embedded Perl/Resource Limits

2000-09-06 Thread Bill Mustdie


Gerald,


 But isn't the LimitRequestBody directive only for
files being received by the client and not for files
being sent the other way?

I thought this would be an arbitrary limit that could
be changed either in the source code or via an apache config
file change?

Any ideas?

Bill


--- Gerald Richter [EMAIL PROTECTED] wrote:
 

-
 Gerald Richterecos electronic communication
 services gmbh
 Internetconnect * Webserver/-design/-datenbanken *
 Consulting
 
 Post:   Tulpenstrasse 5 D-55276 Dienheim
 b. Mainz
 E-Mail: [EMAIL PROTECTED] Voice:+49
 6133 925131
 WWW:http://www.ecos.de  Fax:  +49
 6133 925152

-
 
 - Original Message -
 From: "Bill Mustdie" [EMAIL PROTECTED]
 To: [EMAIL PROTECTED]
 Sent: Wednesday, September 06, 2000 1:40 AM
 Subject: Embedded Perl/Resource Limits
 
 
 
  Hi,
 
   I have a question pertaining to Embedded Perl.
  (But it may be Apache or mod_perl in nature)
 
  From the example upload script on the Apache
 Embedded
  Perl page I am implementing a small file upload
 system
  however I have noticed files cut out when
 uploading at
  around 1 meg. (Reports "Network Error" with no
 message
  logged to the log files - anything under a meg
 works
  no problems)
 
 
 Maybe you have set (or compiled in) a
 LimitRequestBody:
 

http://www.apache.org/docs/mod/core.html#limitrequestbody
 
 Also this creates a temp file, so it maybe a limit,
 set by your os for the
 user Apache is running as, about the max filesize
 
 Gerald
 
 
  Is this an Apache or mod_perl limitation?
 
  And whats the best way of getting around it? Is
 there
  a simple Apache directive i can put in the config
 file
  or is there a hard coded patch required?
 
  thanks in advance!
 
  Bill
 
  ps Yes i do know of other methods such as a
 meta-ftp
  client for files this large but this violates our
  firewall policies etc etc.. :)
 
 
  -
 
  [$ if !defined $fdat{ImageName} $]<br>
 
  <FORM METHOD="POST"
 ENCTYPE="multipart/form-data">
    <INPUT TYPE="FILE" NAME="ImageName">
    <INPUT TYPE="SUBMIT" NAME="Submit"
  VALUE="Upload file">
  </FORM>
 
  [$else$]<p>
 
   <br>
  [-  open FILE, "> /tmp/file.$$";
  print FILE $buffer while
  read($fdat{ImageName}, $buffer, 32768);
  close FILE;
 
  -]
  Your file has been saved to [+ "/tmp/file.$$"
 +]<br>
 
 
  __
  Do You Yahoo!?
  Yahoo! Mail - Free email you can access from
 anywhere!
  http://mail.yahoo.com/
 
 
 


__
Do You Yahoo!?
Yahoo! Mail - Free email you can access from anywhere!
http://mail.yahoo.com/



[ANNOUNCE]: aphid 0.10a

2000-09-06 Thread eScout Corporation


aphid -- apache/perl http installation daemon

 - first public alpha release
 - Aphid-specific code is licensed under the same terms 
   as Perl itself
 - see the distribution doc for more information

from the README:

  Aphid provides a quick facility for bootstrapping SSL-enabled 
  Apache web servers (mod_ssl) with an embedded Perl interpreter 
  (mod_perl).  Source is downloaded from the Internet, compiled, 
  and the resulting software is installed in the directory you 
  specify.  Aphid's installation methodology strays from standard 
  practice with emphases on providing an intuitive, accessible 
  interface and keeping a tiny distribution footprint.

from the INSTALL:

  To date Aphid has been tested on Red Hat Linux 6 and 6.2, FreeBSD 
  4.0, and Solaris 2.6 and 7.  In theory the target machine just 
  needs a connection to the Internet, Perl 5, and the gcc or equivalent
  compiler + the usual tools.

distribution, documentation, etc. available at:
  
  http://sourceforge.net/projects/aphid/

Your feedback is appreciated.  Please send all bugs
to [EMAIL PROTECTED]

Thanks  - the eScout development team




Undefined subroutine Apache::Foo::handler

2000-09-06 Thread Roee Rubin

Hello,

I am receiving the following error (viewed by the
error log) when I attempt to view the page


Undefined subroutine Apache::Foo::handler

I have also attempted to precompile the package (perl -c
Foo.pm) and receive the following error:

Can't locate Apache/File.pm in @INC .. or Can't
locate Apache/Constants.pm in @INC ...


Any help would be appreciated.

[EMAIL PROTECTED]




__
Do You Yahoo!?
Yahoo! Mail - Free email you can access from anywhere!
http://mail.yahoo.com/
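
(A hedged aside, not part of the original post: that error usually means the
package named on the PerlHandler line never defined a handler() subroutine,
and the perl -c failures suggest mod_perl's libraries are not on @INC outside
the server.  A minimal sketch of the shape Apache expects, using the
Apache::Foo name only because it appears in the error message:)

    package Apache::Foo;
    use strict;
    use Apache::Constants qw(OK);

    sub handler {
        my $r = shift;                       # the Apache request object
        $r->send_http_header('text/plain');
        $r->print("hello from Apache::Foo\n");
        return OK;
    }
    1;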



Passing STDIN to subprogram

2000-09-06 Thread erich oliphant

I am replacing a CGI shell script with a mod_perl script.  At one point in
the shell script a subprogram is called.  The HTML form that calls the script
does so via a POST, so the params are available via STDIN.  The subprogram
call (which I can't eliminate yet) expects to see the form params.  However,
when I call it from the perl script, STDIN is empty.  I have tried backticks
and the open call (which are supposed to inherit STDIN) to no avail.

Any suggestions?

Erich
_
Get Your Private, Free E-mail from MSN Hotmail at http://www.hotmail.com.

Share information about yourself, create your own public profile at 
http://profiles.msn.com.
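
(A hedged aside, not a reply from the thread: a forked child does not inherit
mod_perl's tied STDIN, so one workaround is to read the POST body via the
request object and pipe it to the child explicitly.  /path/to/subprogram is a
placeholder, and this assumes nothing else has already consumed the body.)

    use Apache ();

    my $r = Apache->request;

    # slurp the POST body that the browser sent
    my $content;
    $r->read($content, $r->header_in('Content-length'));

    # hand it to the legacy program on its STDIN
    open(CHILD, "| /path/to/subprogram") or die "can't fork: $!";
    print CHILD $content;
    close(CHILD);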




Re: upgrading mod_perl on production machine

2000-09-06 Thread Perrin Harkins

On Wed, 6 Sep 2000, Bill Moseley wrote:
 I hope I didn't miss anything in the Guide at install.html and in
 control.html, but I was looking for any suggestions on upgrading mod_perl
 and Perl on a running production machine to limit the amount of down time.

We use RPMs.  Some form of package, even if it's just a tarball, is a good
idea.  Build it on a different server and then just install it.  Having
multiple servers really comes in handy here because you can take some off
line, upgrade them while the others are live, and then switch.  Then your
site remains up the whole time.

- Perrin




Re: PUT handling (somewhat off-topic)

2000-09-06 Thread Mark-Jason Dominus


 It worked like a charm, the first time, 

Apparently it works like a charm for everyone but me, since none of
the instructions I've found on the net have admitted the possibility
that anything can go wrong.

Which is why I came here, to bother the experts.

 If it hadn't worked, I probably would've trussed Apache while I made the
 request to see what was going on.

I guess I'll try that, but I'm not expecting much.




Re: internal_redirect

2000-09-06 Thread Differentiated Software Solutions Pvt. Ltd

Hi,

We changed the code as you suggested. We still get the same message.

Lots of others have told us that we can only run it under mod_perl. Fine, we
realise this. When we ran it under mod_perl we got this message in the Apache
error log; hence we ran it under plain perl.

We feel that there is some basic thing we are missing. It seems as if, when
perl tries to link up to Apache.pm, it is not able to recognize the method
"request". We are using Apache 1.3.6, Perl 5.005 and mod_perl 1.21.
Is there a problem with these versions?
Should we have enabled anything while compiling mod_perl?

Thanks for any help.

Muthu Ganesh

ps. I'm sorry if we have offended anybody. It's not our intention to cook up
syntax!! We are making sincere attempts to understand why something is not
working. If somebody feels that these questions are below their level, then
please ignore them.

- Original Message -
From: Ken Williams [EMAIL PROTECTED]
To: Differentiated Software Solutions Pvt. Ltd [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Sent: Tuesday, September 05, 2000 8:01 PM
Subject: Re: internal_redirect


 [EMAIL PROTECTED] (Differentiated Software Solutions Pvt. Ltd) wrote:
 We corrected R to r. Problem still remains.
 We ran this program as a standalone perl program and even this bombs.
 Code as follows.
 
 #!/usr/bin/perl
 my $r;
 use Apache ();
 
 Apache->request($r);
 
 $r->internal_redirect('hello.html');
 
 Error message : Can't locate object method "request" via package "Apache"
at
 ../test1.pl line 5.


 As others have mentioned, you can't run this code standalone without
 using some tricks (though they're not very tricky).  But you've got a
 different problem.  According to your code, $r is never assigned to, so
 it should fail with a different error than you're seeing anyway.  You
 want something like this:

#!/usr/bin/perl
use Apache ();

my $r = Apache->request;

$r->internal_redirect('/path/to/hello.html');
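
(A hedged aside, not necessarily the trick Ken had in mind: one way to poke
at request-object code from the command line is Apache::FakeRequest, which
ships with mod_perl 1.x and simply returns whatever canned values you hand
its constructor.)

    use Apache::FakeRequest ();

    # method name => value it should return; no running server is needed
    my $r = Apache::FakeRequest->new(uri => '/hello.html');
    print $r->uri, "\n";    # prints "/hello.html"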


   ------
   Ken Williams                          Last Bastion of Euclidity
   [EMAIL PROTECTED]                     The Math Forum





Re: Poor man's connection pooling

2000-09-06 Thread Michael Peppler

Perrin Harkins writes:
  On Tue, 5 Sep 2000, Michael Peppler wrote:
   I've come across a technique that allows modperl processes to share a 
   pool of database handles. It's not something that I have seen
   documented, so I figured I'd throw it out here.
   
   The idea is to create a pool of connections during the main
   apache/modperl startup. Because these connections end up in the code
   segment for the child processes they are completely shareable. You
   populate a hash in a BEGIN block with x connections, and then provide
   some calls to grab the first available one (I use IPC::Semaphore to
   coordinate access to each connection).
  
  People have suggested this before on the mod_perl list and the objection
  raised was that this will fail for the same reason it fails to open a
  filehandle in the parent process and then use it from all the children.  
  Basically, it becomes unshared at some point and even if they don't do
  things simultaneously one process will leave the socket in a state that
  the other doesn't expect and cause problems.  You can cause pages to
  become unshared in perl just by reading a variable, so it's almost certain
  to happen sooner or later.

Yes, that's what I figured too. But in my tests, involving thousands
of database calls, I haven't seen any problems (yet).

  Can you try this some more and maybe throw some artificial loads against
  it to look for possible problems?  It would be cool if this worked, but
  I'm very skeptical until I see it handle higher concurrency without any
  problems.

I will *definitely* throw as much load as I can in a test/stress
environment before I make use of this in a production environment.

I'll let y'all know how this goes...

Michael
-- 
Michael Peppler -||-  Data Migrations Inc.
[EMAIL PROTECTED]-||-  http://www.mbay.net/~mpeppler
Int. Sybase User Group  -||-  http://www.isug.com
Sybase on Linux mailing list: [EMAIL PROTECTED]
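
(A hedged sketch of the idea described above, not Michael's actual code: open
a fixed number of handles at parent startup and guard each slot with a SysV
semaphore.  The DSN, login and pool size are placeholders, and it assumes the
DBD::Sybase handles really do survive the fork.)

    package My::DBPool;
    use strict;
    use DBI ();
    use IPC::Semaphore ();
    use IPC::SysV qw(IPC_CREAT IPC_NOWAIT SEM_UNDO);

    my (@handles, $sem);

    BEGIN {
        my $n = 4;                              # pool size, an example value
        @handles = map {
            DBI->connect('dbi:Sybase:server=BIGSERVER', 'user', 'password',
                         { RaiseError => 1 })
        } 1 .. $n;
        $sem = IPC::Semaphore->new(0xbeef, $n, 0666 | IPC_CREAT);
        $sem->setall((1) x $n);                 # every slot starts out free
    }

    # try each slot without blocking; return the first free handle and slot
    sub grab {
        while (1) {
            for my $i (0 .. $#handles) {
                return ($handles[$i], $i)
                    if $sem->op($i, -1, IPC_NOWAIT | SEM_UNDO);
            }
            select(undef, undef, undef, 0.05);  # crude back-off, then retry
        }
    }

    sub release {                               # give the slot back
        my $i = shift;
        $sem->op($i, 1, SEM_UNDO);
    }

    1;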



Re: Poor man's connection pooling

2000-09-06 Thread Perrin Harkins

On Wed, 6 Sep 2000, Stas Bekman wrote:
 Just a small correction: 
 
 You can cause pages to become unshared in perl just by writing a variable,
^^^
 so it's almost certain to happen sooner or later.
 
 Or for example calling pos() which modifies the variable internals:
 http://perl.apache.org/guide/performance.html#Are_My_Variables_Shared_

If you read a variable in a way that causes it to be converted between a
numerical value and a string and it hasn't happened before, that will
change the internal structure and unshare the memory on one or more
pages.  I'm no perlguts hacker, but I think this is correct.
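
(A hedged illustration of that point, not from the thread: merely using a
string in numeric context makes perl cache a numeric slot inside the scalar,
which dirties, and therefore unshares, the page it lives on.)

    my $port = "8080";       # stored as a string when the parent compiled it
    my $next = $port + 1;    # numeric use writes an IV into $port's internals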

- Perrin




Re: Poor man's connection pooling

2000-09-06 Thread Tim Bunce

On Tue, Sep 05, 2000 at 10:38:48AM -0700, Michael Peppler wrote:
 
 The idea is to create a pool of connections during the main
 apache/modperl startup. [...]
 
 This technique works with Sybase connections using either
 DBI/DBD::Sybase or Sybase::CTlib (I've not tested Sybase::DBlib, nor
 any other DBD modules).

For some drivers, like DBD::Oracle, connections generally don't work
right across forks, sadly. MySQL is okay. Not sure about others.
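
(A hedged aside, not from Tim's message: for drivers that don't survive the
fork, the usual fallback is per-child connections opened after the fork, for
example Apache::DBI's connect_on_init in startup.pl.  The Oracle DSN and
credentials below are placeholders.)

    use Apache::DBI ();

    # each httpd child opens (and then caches) its own handle at child init
    Apache::DBI->connect_on_init('dbi:Oracle:ORCL', 'scott', 'tiger',
                                 { RaiseError => 1, AutoCommit => 0 });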

Tim.



Re: Poor man's connection pooling

2000-09-06 Thread Stas Bekman

On Tue, 5 Sep 2000, Perrin Harkins wrote:

 On Tue, 5 Sep 2000, Michael Peppler wrote:
  I've come across a technique that allows modperl processes to share a 
  pool of database handles. It's not something that I have seen
  documented, so I figured I'd throw it out here.
  
  The idea is to create a pool of connections during the main
  apache/modperl startup. Because these connections end up in the code
  segment for the child processes they are completely shareable. You
  populate a hash in a BEGIN block with x connections, and then provide
  some calls to grab the first available one (I use IPC::Semaphore to
  coordinate access to each connection).
 
 People have suggested this before on the mod_perl list and the objection
 raised was that this will fail for the same reason it fails to open a
 filehandle in the parent process and then use it from all the children.  
 Basically, it becomes unshared at some point and even if they don't do
 things simultaneously one process will leave the socket in a state that
 the other doesn't expect and cause problems.  You can cause pages to
 become unshared in perl just by reading a variable, so it's almost certain
 to happen sooner or later.

Just a small correction: 

You can cause pages to become unshared in perl just by writing a variable,
   ^^^
so it's almost certain to happen sooner or later.

Or for example calling pos() which modifies the variable internals:
http://perl.apache.org/guide/performance.html#Are_My_Variables_Shared_

 Can you try this some more and maybe throw some artificial loads against
 it to look for possible problems?  It would be cool if this worked, but
 I'm very skeptical until I see it handle higher concurrency without any
 problems.
 
 - Perrin
 
 



_
Stas Bekman  JAm_pH --   Just Another mod_perl Hacker
http://stason.org/   mod_perl Guide  http://perl.apache.org/guide 
mailto:[EMAIL PROTECTED]   http://apachetoday.com http://jazzvalley.com
http://singlesheaven.com http://perlmonth.com   perl.org   apache.org





cvs commit: modperl-site/embperl Changes.pod.1.html

2000-09-06 Thread richter

richter 00/09/05 23:25:02

  Modified:    embperl  Changes.pod.1.html
  Log:
  Embperl Webpages - Changes
  
  Revision  ChangesPath
  1.172 +1 -1  modperl-site/embperl/Changes.pod.1.html
  
  Index: Changes.pod.1.html
  ===
  RCS file: /home/cvs/modperl-site/embperl/Changes.pod.1.html,v
  retrieving revision 1.171
  retrieving revision 1.172
  diff -u -r1.171 -r1.172
  --- Changes.pod.1.html  2000/08/25 05:31:38  1.171
  +++ Changes.pod.1.html  2000/09/06 06:25:02  1.172
  @@ -21,7 +21,7 @@
   
   [<a href="">HOME</a>]&nbsp;&nbsp; [<a href="Changes.pod.cont.html">CONTENT</a>]&nbsp;&nbsp; [<a href="Changes.pod.cont.html">PREV (Revision History - Content)</a>]&nbsp;&nbsp; [<a href="Changes.pod.2.html">NEXT (1.3b5 (BETA)  20. Aug 2000)</a>]&nbsp;&nbsp; <br><hr>
   <P>
  -Last Update: Fri Aug 25 07:32:05 2000 (MET)
  +Last Update: Wed Sep 6 08:25:46 2000 (MET)
   
   <P>
   NOTE: This version is only available via <A HREF="CVS.pod.1.html#INTRO">"CVS"</A>