Re: Modperl/Apache deficiencies... Memory usage.

2000-04-15 Thread Gunther Birznieks

I think I may be a bit dense on this list so forgive me if I try to clarify 
(at least for myself to make sure I have this right)...

I think what you are proposing is not that much different from the proxy 
front-end model. The mod_proxy is added overhead, but that solves your 
memory problem. You can have 50 apache processes on the front-end dealing 
with images and the like and then have only 2 or 5 or however many 
Apache/Perl processes on the backend.

The only inefficiency with this is that HTTP is the protocol being used for 
the front-end HTTPD daemon to communicate with Perl instead of a direct 
socket using a binary/compressed data protocol.

By the way, if you really prefer this out-of-process but still pooled Perl 
interpreters model, you could always consider purchasing Binary Evolution's 
Velocigen product for Netscape on UNIX. I believe they have a mode that 
allows the Perl engine to run out-of-process with a lightweight NSAPI 
wrapper talking to Perl.

It turns out that this is probably the best way to deal with a buggy 
product like Netscape anyway... NSAPI is such a flaky beast that it's no 
wonder a company would want to separate the application processes out 
(but now I am getting off topic).

It's likely that this is a faster solution than the mod_proxy solution 
mod_perl uses, because mod_proxy and HTTP are both relatively complex and 
designed to do more than provide back-end application server communications.

Here's the relevant Velocigen URL:

http://www.binaryevolution.com/velocigen/arch.vet

However, I would caution that mod_perl speeds things up SO much as it is 
that this architectural improvement over using front-end/back-end apache 
servers is probably not going to make that big a difference unless you are 
writing something that will be under really heavy stress. And, of course, 
you should do your own benchmarking to see if this is the case.

While you are at it, you might consider PerlEx from ActiveState, as that 
provides in-process, thread-pooled Perl engines that run in the IIS memory 
space.

But again, I would stress that speed isn't the only thing. Think about 
reliability. I think the mod_perl model tends to be more reliable (in the 
front/backend scenario) because the apache servers can be monitored and 
made to die off independently when they spin out of control... and they 
can't pollute each other's memory space.  Using some mod_rewrite rules, you 
can also very easily control which applications are partitioned from each 
other onto which back-end servers.

I don't know how easily you can specify what I would term 
application-affinities in the Velocigen or PerlEx model based on URL alone.

Anyway, good luck with your search for information...

Thanks,
 Gunther

At 10:46 PM 4/15/00 +, [EMAIL PROTECTED] wrote:
>Perrin-
>On Sat, Apr 15, 2000 at 11:33:15AM -0700, Perrin Harkins wrote:
> > > Each process of apache has
> > > it's registry which holds the compiled perl scripts in..., a copy of
> > > each for each process.  This has become an issue for one of the
> > > companies that I work for, and I noted from monitoring the list that
> > > some people have apache processes that are upwards of 25Megs, which is
> > > frankly ridiculous.
> >
> > I have processes that large, but more than 50% of that is shared through
> > copy-on-write.
> >
> > > I wrote a very small perl engine
> > > for phhttpd that worked within it's threaded paradigm that sucked up a
> > > neglibible amount of memory which used a very basic version of
> > > Apache's registry.
> >
> > Can you explain how this uses less memory than mod_perl doing the same
> > thing?  Was it just that you were using fewer perl interpreters?  If 
> so, you
> > need to improve your use of apache with a multi-server setup.  The only way
> > I could see phttpd really using less memory to do the same work is if you
> > somehow managed to get perl to share more of its internals in memory.  Did
> > you?
>
>Yep very handily I might add ;-).  Basically phhttpd is not process
>based, it's threaded based.  Which means that everything is running
>inside of the same address space.  Which means 100% sharing except for
>the present local stack of variables... which is very minimal.  In
>terms of the perl thing... when you look at your processes and see all
>that non-shared memory, most of that is stack variables.  Now most
>webservers are running on single processor machines, so they get no
>benefit from having 10s or even 100s of copies of these perl stack
>variables.  Its much more efficient to have a single process handle
>all the perl requests.  On a multiprocessor box that single process
>could have multiple threads in order to take advantage of the
>processors.  See..., mod_perl stores the stack state of every script
>it runs in the apache process... for every script... copies of it,
>many many copies of it.  This is not efficient.  What would be
>efficient is to have as many threads/proces

Re: Modperl/Apache deficiencies... Memory usage.

2000-04-15 Thread Stas Bekman

On Sat, 15 Apr 2000 [EMAIL PROTECTED] wrote:

> > > I wrote a very small perl engine
> > > for phhttpd that worked within it's threaded paradigm that sucked up a
> > > neglibible amount of memory which used a very basic version of
> > > Apache's registry.
> > 
> > Can you explain how this uses less memory than mod_perl doing the same
> > thing?  Was it just that you were using fewer perl interpreters?  If so, you
> > need to improve your use of apache with a multi-server setup.  The only way
> > I could see phttpd really using less memory to do the same work is if you
> > somehow managed to get perl to share more of its internals in memory.  Did
> > you?
> 
> Yep very handily I might add ;-).  Basically phhttpd is not process
> based, it's threaded based.  Which means that everything is running
> inside of the same address space.  Which means 100% sharing except for
> the present local stack of variables... which is very minimal.  In
> terms of the perl thing... when you look at your processes and see all
> that non-shared memory, most of that is stack variables.  Now most
> webservers are running on single processor machines, so they get no
> benefit from having 10s or even 100s of copies of these perl stack
> variables.  Its much more efficient to have a single process handle
> all the perl requests.  On a multiprocessor box that single process
> could have multiple threads in order to take advantage of the
> processors.  See..., mod_perl stores the stack state of every script
> it runs in the apache process... for every script... copies of it,
> many many copies of it.  This is not efficient.  What would be
> efficient is to have as many threads/processes as you have processors
> for the mod_perl engine.  In other words seperate the engine from the
> apache process so that there is never unneccesary stack variables
> being tracked.

I'm not sure you are right in claiming that the best performance will be
achieved when you have a single process/thread per processor. This
would be true *only* if your code were purely CPU bound.
Unfortunately there are various IO operations and communications with
other components like RDBMS engines, which in turn have their own IO as well. 
Given that your CPU is idle while an IO operation is in progress, you
could use it to process another request during that time. 

Hmm, that's the whole point of a multi-process OS. Unless I
misunderstood your suggestion, what you are proposing is a kind of DOS-like
OS where only one process occupies the CPU at any given time
(well, assuming that the rest of the essential OS processes are running
somewhere too in a multi-process environment).
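
To put rough numbers on this, here is a back-of-the-envelope sketch (all
the figures are made up purely for illustration):

  use strict;

  my $cpu_ms = 20;     # assumed CPU time per request
  my $io_ms  = 180;    # assumed time blocked on IO (DB, network, disk)
  my $total  = $cpu_ms + $io_ms;

  # with a single blocking worker the CPU is busy only a small fraction
  printf "CPU busy with one blocking worker: %.0f%%\n", 100 * $cpu_ms / $total;

  # workers needed to keep one CPU busy while the others block on IO
  printf "Workers needed per CPU: %d\n", int($total / $cpu_ms + 0.999);

With these made-up numbers a single worker keeps the CPU only 10% busy,
so you'd want something like ten workers per CPU, not one.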

__
Stas Bekman | JAm_pH--Just Another mod_perl Hacker
http://stason.org/  | mod_perl Guide  http://perl.apache.org/guide 
mailto:[EMAIL PROTECTED]  | http://perl.orghttp://stason.org/TULARC/
http://singlesheaven.com| http://perlmonth.com http://sourcegarden.org
--




Re: Modperl/Apache deficiencies... Memory usage.

2000-04-15 Thread shane

Perrin-
On Sat, Apr 15, 2000 at 11:33:15AM -0700, Perrin Harkins wrote:
> > Each process of apache has
> > it's registry which holds the compiled perl scripts in..., a copy of
> > each for each process.  This has become an issue for one of the
> > companies that I work for, and I noted from monitoring the list that
> > some people have apache processes that are upwards of 25Megs, which is
> > frankly ridiculous.
> 
> I have processes that large, but more than 50% of that is shared through
> copy-on-write.
> 
> > I wrote a very small perl engine
> > for phhttpd that worked within it's threaded paradigm that sucked up a
> > neglibible amount of memory which used a very basic version of
> > Apache's registry.
> 
> Can you explain how this uses less memory than mod_perl doing the same
> thing?  Was it just that you were using fewer perl interpreters?  If so, you
> need to improve your use of apache with a multi-server setup.  The only way
> I could see phttpd really using less memory to do the same work is if you
> somehow managed to get perl to share more of its internals in memory.  Did
> you?

Yep, very handily I might add ;-).  Basically phhttpd is not process
based, it's thread based.  Which means that everything is running
inside of the same address space.  Which means 100% sharing except for
the present local stack of variables... which is very minimal.  In
terms of the perl thing... when you look at your processes and see all
that non-shared memory, most of that is stack variables.  Now most
webservers are running on single processor machines, so they get no
benefit from having 10s or even 100s of copies of these perl stack
variables.  It's much more efficient to have a single process handle
all the perl requests.  On a multiprocessor box that single process
could have multiple threads in order to take advantage of the
processors.  See..., mod_perl stores the stack state of every script
it runs in the apache process... for every script... copies of it,
many many copies of it.  This is not efficient.  What would be
efficient is to have as many threads/processes as you have processors
for the mod_perl engine.  In other words, separate the engine from the
apache process so that there are never unnecessary stack variables
being tracked.

Hmm... can I explain this better?  Let me try.  Okay, for every apache
process there is an entire perl engine with all the stack variables
for every script you run recorded there.  What I'm proposing is a
system whereby there would be a separate process that would have only
a perl engine in it... you would make as many of these processes as
you have processors.  (Or multithread them... it doesn't really
matter.)  Now your apache processes would not have a bunch of junk
memory in them.  Your apache processes would be the size of a stock
apache process, like 4-6MB or so, and you would have one process of
25MB or so that would have all your registry in it.  For a
high capacity box this would be an incredible boon to increasing
capacity.  (I'm trying to explain clearly, but I'd be the first to
admit this isn't one of my strong points.)
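
In concrete terms, the engine side of such a split might look something
like this minimal sketch (the socket path and the one-line work-order
protocol are made up for the example; a real engine would pre-fork or
spawn a thread per processor and speak something richer):

  #!/usr/bin/perl -w
  use strict;
  use Socket qw(SOCK_STREAM);
  use IO::Socket::UNIX;

  my $path = '/tmp/perl-engine.sock';
  unlink $path;

  my $engine = IO::Socket::UNIX->new(
      Local  => $path,
      Type   => SOCK_STREAM,
      Listen => 5,
  ) or die "cannot listen on $path: $!";

  while (my $apache_side = $engine->accept) {
      my $order = <$apache_side>;            # e.g. "RUN /perl/foo.pl\n"
      if (defined $order) {
          chomp $order;
          # a real engine would look the script up in its registry and
          # run it; here we just acknowledge the work order
          print $apache_side "HANDLED: $order\n";
      }
      close $apache_side;
  }

The apache-side module would write one work order per request down this
socket and block on the reply, which is exactly the cheap front end /
fat engine split described above.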

As to how the multithreaded phhttpd can handle tons of load, well...
that's a separate issue and frankly a question much better handled by
Zach.  I understand it very well, but I don't feel that I could
adequately explain it.  It's based on real-time signal queue (sig_queue)
technology... for a "decent" reference on this you can take a look at
the O'Reilly book "POSIX.4: Programming for the Real World".  I
should say that this book doesn't go into enough depth... but it's the
only book that goes into any depth that I could find.

> 
> > What I'm
> > thinking is essentially we take the perl engine which has the apache
> > registry and all the perl symbols etc., and seperate it into it's own
> > process which would could be multithreaded (via pthreads) for multiple
> > processor boxes.  (above 2 this would be beneficial probably)  On the
> > front side the apache module API would just connect into this other
> > process via shared memory pages (shmget et. al), or Unix pipes or
> > something like that.
> 
> This is how FastCGI, and all the Java servlet runners (JServ, Resin, etc.)
> work.  The thing is, even if you run the perl interpreters in a
> multi-threaded process, it still needs one interpreter per perl thread and I
> don't know how much you'd be able to share between them.  It might not be
> any smaller at all.

But there is no need to have more than one perl thread per processor.
Right now we have a perl "thread" (er... engine is a better term) per
process.  Since most boxes start up 10 processes or so of Apache, we'd
be talking about a memory savings something like this:
6MB stock apache process
25MB (we'll say that's average) mod_perl apache process, 50% shared,
leaving 12.5MB non-shared
The way it works now: 12.5 * 10 = 125MB + 12.5 (the shared bit, one
instance) = 137.5MB total.
Suggested way:
6MB stock with about 3MB shared or so.  3MB *

Re: mod_perl virtual web hosting

2000-04-15 Thread Gunther Birznieks

While I understand that it might be an "advantage" to allow the customers 
their own mix of modules, it can also be a bit of a support headache, as 
different customers will be loading different DSOs, presumably even in 
different orders. There may be subtle bugs with module interaction that 
providing a standard set of one or two Apache binaries would solve for the ISP.

However, I do agree that you should not compile in mod_perl and other 
modules like it by default for the front-end servers. The reason is 
security. Core Apache is complicated enough without adding to the potential 
for buffer overflows.  The more lines of code you add, the greater the 
likelihood of bugs, and some of those bugs have a likelihood of being 
security bugs.

By only having a subset of users using a DSO, or making a back-end server 
use this stuff, you could limit the possibility that the main web server 
itself is subject to the overflow (although this is still a risky scenario, 
it is less risky than compiling everything in one monolithic apache) since 
the front-end could be designed to have very little in it except for an 
Intrusion Detection system (which I am under the impression that someone is 
writing?).

Anyway, I guess it all boils down to balance and the risks (either support 
or security) you are willing to take as an ISP. Adding mod_perl certainly 
is a risk in itself! But it is one that I think many of us would definitely 
not mind paying extra for, as a dedicated server solution is extremely 
expensive as it is.

Later,
Gunther

At 04:26 PM 4/12/00 +, Jesse Wolfe wrote:
>At 02:54 PM 4/12/00 -0700, Tom Brown wrote:
> >On Wed, 12 Apr 2000, Jesse Wolfe wrote:
> >
> >> I am working with www.superb.net to get their mod_perl up and working
> >> again. They have great infrastrucure, lots of great tools, and an amazing
> >> price.
> >> They had apache/mod_perl for awhile, and upgrades broke it.  I expect they
> >> will have it in a week or two, if we can use all these dynamic/shared
> >> modules as planned.
> >
> >strikes me (as an owner of a web hosting service) that DSO is the wrong
> >answer. What does DSO buy you? NOTHING except a complete waste of
> >memory...
>
>well, it would let each customer add their own combo of modules to their
>apache server without requiring either two installs (one with, one without
>mod_perl) or making everyone's server run mod_perl even if they aren't
>using it.
>
>But it is a good question... I found it kind of unusual to request that
>EVERY Apache module be dso, including mod_perl, mod_ssl, PHP3, and libdav.
>I'm not even sure that it's possible that it will run well. Any advise
>anyone has here would be most appreciated.
>
>I'm actually kind of surprised I got mod_perl DSO 'make test' to pass at all.
>
> >
> >I'm reading between the lines here, but it sounds like you are trying to
> >have _one_ parent apache daemon that services _everything_ on the machine
> >(likely _more_ than one website), which would imply that you are going to
> >have an _extremely_ low hit ratio on your mod_perl scripts.
>
>nahh, that's not where we were going with it. I am pretty sure it's just a
>"maximum flexibility" feature they want to have on hand to minimize tech
>support, etc.  Why does DSO waste so much memory? I thought DSO would mean
>all processes share the resident copy of the perl library?
>
> >
> >it strikes me that you _want_ a frontend proxy to feed your requests to
> >the smallest number of backend daemons which are most likely to already
> >have compiled your scripts. This saves memory and CPU, while simplifying
> >the configuration, and of course, for a dedicated backend daemon, DSO buys
> >nothing... even if that daemon uses access handlers, it still always needs
> >mod_perl
>
>remember we're talking an entire ISP, not just a website. I think it might
>be a mighty pain to have everyone running or sharing some backend mod_perl
>server. Logs and all that.
>
> >
> >That said, we bought modperl-space.com back when domains suddenly got
> >cheap, but haven't put together a mod_perl package because we really don't
> >know what folks want/are using it for.
>
>seems like most folks with enough ?? to be using mod_perl are either
>working corporate or have their own hosts and don't have to deal with ISP's
>on the mod_perl issue.  With all this talk about mod_perl programmers being
>needed, I'm surprised to find so few ISP's offering it.  Bottom line it
>seems you need to be able to keep a mod_perl guru on hand just to keep it
>going... the compilation, etc. of it is so complex. I actually need to
>leave my original mod_perl isp because they just don't have the staff to
>make the right mods to their system, even if I figure it out for them.
>Their mod_perl person got up and left and they can't seem to hire a
>replacement. Anyone in Boston area would like to work with a pretty decent
>ISP?
>
>Regards,
>
>Jesse
>
>




Re: modperl and MIME::Parser?

2000-04-15 Thread John S. Evans

So digging a little deeper (and through the magic of trial and error), the
offending module seems to be Mail::Field.

It has a bunch of code to dynamically load perl classes for various types of
fields (AddrList, Date, Content-Type, etc), and this code seems to do
something that makes modperl (or apache) very unhappy.

Basically, it seems to blow through the INC list, looking for a directory
that matches Mail::Field.  When it finds such a directory, it blows
recursively through it, using "require" on every file it finds.
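
The pattern seems to be roughly this (my own illustrative sketch, NOT
Mail::Field's actual code):

  use strict;
  use File::Spec;

  foreach my $inc (@INC) {
      my $dir = File::Spec->catdir($inc, 'Mail', 'Field');
      next unless -d $dir;
      opendir(DIR, $dir) or next;
      foreach my $file (grep { /\.pm$/ } readdir(DIR)) {
          # each require compiles more code into the running server, and a
          # require that blocks or fails during startup can hang it
          eval { require "Mail/Field/$file" };
      }
      closedir(DIR);
  }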

Somewhere, this seems to be causing apache or modperl some unhappiness.

Any other clues or pointers?  I'm half tempted at this point to rewrite
things as a CGI rather than a module, because MIME::Parser seems to work
from a CGI.

-jse


> From: "John S. Evans" <[EMAIL PROTECTED]>
> Date: Fri, 14 Apr 2000 16:02:20 -0700
> To: modperl <[EMAIL PROTECTED]>
> Cc: Stas Bekman <[EMAIL PROTECTED]>
> Subject: Re: modperl and MIME::Parser?
> 
> I ran struss, but I'm not sure how useful the put is.  I've enclosed it, if
> anyone has time to take a look...
> 
> It does look like there might have been some problem loading some of the
> modules (Mail/Field/addrlist.pm), but I can't find anyone who actually uses
> that module.  But typically when Apache can't load a module, it prints an
> error message for me.  In this case, the apache log gets NO entries it in,
> but the process is still running.
> 
> Unfortunately, I don't have apache compiled with symbols (there were
> stripped), but here is the stack trace while Apache is "hung":
> 
> #0  0xff216f8c in _read () from /usr/lib/libc.so.1
> #1  0xff208094 in _filbuf () from /usr/lib/libc.so.1
> #2  0x11a900 in Perl_sv_gets ()
> #3  0xe02a0 in Perl_filter_read ()
> #4  0xee004 in Perl_pmflag ()
> #5  0xe37a8 in Perl_yylex ()
> #6  0xefa90 in Perl_yyparse ()
> #7  0x135f08 in Perl_sv_compile_2op ()
> #8  0x136ce0 in Perl_pp_require ()
> #9  0x14d508 in Perl_runops_standard ()
> #10 0xd5b14 in perl_eval_sv ()
> #11 0xd5f04 in perl_require_pv ()
> #12 0x39be0 in perl_reload_inc ()
> #13 0x310f0 in perl_restart ()
> #14 0x317b4 in perl_startup ()
> #15 0x31524 in perl_module_init ()
> #16 0x77188 in ap_init_modules ()
> #17 0x8500c in ap_child_terminate ()
> #18 0x85d14 in main ()
> 
> It looks like it may be blocked in a read somewhere.  Very strange.
> 
> -jse
> 
> 
>> From: Stas Bekman <[EMAIL PROTECTED]>
>> Date: Fri, 14 Apr 2000 23:20:17 +0300 (IDT)
>> To: "John S. Evans" <[EMAIL PROTECTED]>
>> Cc: modperl <[EMAIL PROTECTED]>
>> Subject: Re: modperl and MIME::Parser?
>> 
>> On Fri, 14 Apr 2000, John S. Evans wrote:
>> 
>>> So I'm trying to work around my problems with Apache::Request by parsing the
>>> request myself.  This way I can work with only a single open file at a time.
>>> 
>>> But now I have a new problem.  In my content handler, if I add "use
>>> MIME::Parser;" to the top of my file, apache won't start - it seems to hang
>>> during the startup phase.  My module never gets initialized, and apache
>>> never prints "[Fri Apr 14 12:29:14 2000] [notice] Apache/1.3.9 (Unix)
>>> mod_perl/1.21 configured -- resuming normal operations".
>>> 
>>> I don't even call any functions in the module yet, just "use" it.  If I take
>>> out the "use" statement, apache (and my module) load just fine.
>>> 
>>> Can you think of anything that would cause the MIME::Parser module to
>>> disagree with modperl?  I'm just about to start reaming through the
>>> MIME::Parser source code looking for stuff that gets initialized at
>>> module-load time.
>> 
>> I've no idea about the cause, but why don't you start the server under
>> strace (or truss) and see where it hangs. If you don't figure out by
>> yourself send the trace to the list (the relevant snippet if you know
>> what's relevant).
>> 
>> See:
>> http://perl.apache.org/guide/debug.html#Determination_of_the_reason
>> http://perl.apache.org/guide/debug.html#Debug_Tracing
>> 
>> __
>> Stas Bekman | JAm_pH--Just Another mod_perl Hacker
>> http://stason.org/  | mod_perl Guide  http://perl.apache.org/guide
>> mailto:[EMAIL PROTECTED]  | http://perl.orghttp://stason.org/TULARC/
>> http://singlesheaven.com| http://perlmonth.com http://sourcegarden.org
>> --
>> 
>> 
> 
> 




Re: Segfault on DBI->Connect

2000-04-15 Thread Jochen Wiedmann



On Tue, 11 Apr 2000, Doug MacEachern wrote:

> On Tue, 4 Apr 2000 [EMAIL PROTECTED] wrote:
> 
> > I've been seeing the same segfault-on-connect problem with Apache 1.2.12
> > + mod_perl 1.22 + DBI 1.13 + Msql-Mysql-modules 1.2211.  The segfault is
> > due to a null first argument being passed to mysql_real_connect().
> > 
> > Running Apache with a -X argument yields the following backtrace when my
> > mod_perl module does a DBI->connect (str, username, passwd, { options }).
> > Note the null mysql argument 
> > |
> > V
> > #0  0x80ef5b7 in mysql_real_connect (mysql=0x0, 
> > host=0x8a99db8 "hostname.brown.edu", user=0x8a9b550 "username", 
> > passwd=0x8a9b568 "password", db=0x8a99e40 "databasename", port=3306, 
> > unix_socket=0x0, client_flag=0) at libmysql.c:1125
> > #1  0x402d01fd in mysql_dr_connect ()
> >from /usr/lib/perl5/site_perl/5.005/i386-linux/auto/DBD/mysql/mysql.so
> > #2  0x402d0540 in _MyLogin ()
> >from /usr/lib/perl5/site_perl/5.005/i386-linux/auto/DBD/mysql/mysql.so
> > 
> > The mysql_real_connect routine does a set_sigpipe(mysql), which triggers
> > the segfault.

If you are using the DBD-mysql sources as distributed by me, the
mysql_real_connect function will *never* be called with a NULL
argument. This cannot happen if mysql_init() is called before
mysql_real_connect(). (Unless you are using some patches that have
recently reached me from the mod_perl mailing list, but that I haven't
verified yet.) Calling mysql_real_connect with mysql==NULL will surely
cause a SEGFAULT.

So my question remains: Why the heck *are* you passing a NULL argument?
Is this because of some bug in the DBD::mysql driver or are you using
modified sources?

Btw, Doug, since the sigpipe thing has come up: what do you recommend for
the DBD::mysql driver? (Remember the "MySQL morning bug"?) Should we
enable or disable SIGPIPE?


Thanks,

Jochen





Re: modperl and MIME::Parser?

2000-04-15 Thread John S. Evans

I ran truss, but I'm not sure how useful the output is.  I've enclosed it, if
anyone has time to take a look...

It does look like there might have been some problem loading some of the
modules (Mail/Field/addrlist.pm), but I can't find anyone who actually uses
that module.  But typically when Apache can't load a module, it prints an
error message for me.  In this case, the apache log gets NO entries in it,
but the process is still running.

Unfortunately, I don't have apache compiled with symbols (they were
stripped), but here is the stack trace while Apache is "hung":

#0  0xff216f8c in _read () from /usr/lib/libc.so.1
#1  0xff208094 in _filbuf () from /usr/lib/libc.so.1
#2  0x11a900 in Perl_sv_gets ()
#3  0xe02a0 in Perl_filter_read ()
#4  0xee004 in Perl_pmflag ()
#5  0xe37a8 in Perl_yylex ()
#6  0xefa90 in Perl_yyparse ()
#7  0x135f08 in Perl_sv_compile_2op ()
#8  0x136ce0 in Perl_pp_require ()
#9  0x14d508 in Perl_runops_standard ()
#10 0xd5b14 in perl_eval_sv ()
#11 0xd5f04 in perl_require_pv ()
#12 0x39be0 in perl_reload_inc ()
#13 0x310f0 in perl_restart ()
#14 0x317b4 in perl_startup ()
#15 0x31524 in perl_module_init ()
#16 0x77188 in ap_init_modules ()
#17 0x8500c in ap_child_terminate ()
#18 0x85d14 in main ()

It looks like it may be blocked in a read somewhere.  Very strange.

-jse


> From: Stas Bekman <[EMAIL PROTECTED]>
> Date: Fri, 14 Apr 2000 23:20:17 +0300 (IDT)
> To: "John S. Evans" <[EMAIL PROTECTED]>
> Cc: modperl <[EMAIL PROTECTED]>
> Subject: Re: modperl and MIME::Parser?
> 
> On Fri, 14 Apr 2000, John S. Evans wrote:
> 
>> So I'm trying to work around my problems with Apache::Request by parsing the
>> request myself.  This way I can work with only a single open file at a time.
>> 
>> But now I have a new problem.  In my content handler, if I add "use
>> MIME::Parser;" to the top of my file, apache won't start - it seems to hang
>> during the startup phase.  My module never gets initialized, and apache
>> never prints "[Fri Apr 14 12:29:14 2000] [notice] Apache/1.3.9 (Unix)
>> mod_perl/1.21 configured -- resuming normal operations".
>> 
>> I don't even call any functions in the module yet, just "use" it.  If I take
>> out the "use" statement, apache (and my module) load just fine.
>> 
>> Can you think of anything that would cause the MIME::Parser module to
>> disagree with modperl?  I'm just about to start reaming through the
>> MIME::Parser source code looking for stuff that gets initialized at
>> module-load time.
> 
> I've no idea about the cause, but why don't you start the server under
> strace (or truss) and see where it hangs. If you don't figure out by
> yourself send the trace to the list (the relevant snippet if you know
> what's relevant).
> 
> See:
> http://perl.apache.org/guide/debug.html#Determination_of_the_reason
> http://perl.apache.org/guide/debug.html#Debug_Tracing
> 
> __
> Stas Bekman | JAm_pH--Just Another mod_perl Hacker
> http://stason.org/  | mod_perl Guide  http://perl.apache.org/guide
> mailto:[EMAIL PROTECTED]  | http://perl.orghttp://stason.org/TULARC/
> http://singlesheaven.com| http://perlmonth.com http://sourcegarden.org
> --
> 
> 


 truss.out


RE: Apache::ASP problem, example index.html not working

2000-04-15 Thread Andy Yiu

Hi, this is Andy again.

It's about this: after I installed the ASP patch, all
the other *.asp files are working, but index.html is not;
it claims that it couldn't find global.asa.

The .htaccess file I used is from your example folder,
which is:


# Note this file was used for Apache 1.3.0
# Please see the readme, for what exactly the config
variables do.

PerlSetVar Global  .
PerlSetVar GlobalPackage Apache::ASP::Demo
PerlSetVar StateDir  /tmp/asp_demo
PerlSetVar StatINC 0
PerlSetVar StatINCMatch 0
PerlSetVar Clean 0
PerlSetVar DynamicIncludes 1
PerlSetVar FileUploadMax 25000
PerlSetVar FileUploadTemp 1
PerlSetVar SessionQueryParse 0
PerlSetVar SessionQuery 1
PerlSetVar Debug -2
PerlSetVar StateCache 0

# .asp files for Session state enabled

SetHandler perl-script
PerlHandler Apache::ASP
PerlSetVar CookiePath  /
PerlSetVar SessionTimeout  .5
#   PerlSetVar StateSerializer Storable
#   PerlSetVar StateDB DB_File
#   PerlSetVar StatScripts 0


# .htm files for the ASP parsing, but not the $Session
object
# NoState turns off $Session & $Application

SetHandler perl-script
PerlHandler Apache::ASP
PerlSetVar NoState 1 
PerlSetVar BufferingOn 1
PerlSetVar NoCache 1
PerlSetVar DebugBufferLength 250



ForceType text/plain


# .ssi for full ssi support, with Apache::Filter

SetHandler perl-script
PerlHandler Apache::ASP Apache::SSI
PerlSetVar Global .
PerlSetVar Filter On
PerlSetVar NoCache 1



SetHandler perl-script
PerlHandler Apache::ASP
PerlSetVar CookiePath  /
PerlSetVar SessionTimeout  1
PerlSetVar SessionQueryParseMatch
^http://localhost

-

The folder where I put my ASP example files is
/data/home/asp/eg

And here is the error message I get.

---
Errors Output

> %EG is not defined, make sure you copied
./eg/global.asa correctly at (eval 12) line 5.
, /usr/lib/perl5/site_perl/5.005/Apache/ASP.pm line
1229

Debug Output

> RUN ASP (v0.18) for /data/home/asp/eg/index.html
> GlobalASA package Apache::ASP::Demo
> ASP object created - GlobalASA:
Apache::ASP::GlobalASA=HASH(0x83c5370); Request:
Apache::ASP::Request=HASH(0x83947d0); Response:
Apache::ASP::Response=HASH(0x83946d4); Server:
Apache::ASP::Server=HASH(0x83945a8); basename:
index.html; compile_includes: 1; dbg: 2;
debugs_output: ARRAY(0x82834c0); filename:
/data/home/asp/eg/index.html; global: /tmp;
global_package: Apache::ASP::Demo; id: NoCache;
includes_dir: ; init_packages: ARRAY(0x8302fe4);
no_cache: 1; no_state: 1; package: Apache::ASP::Demo;
pod_comments: 1; r: Apache=SCALAR(0x840ca70);
sig_warn: ; stat_inc: ; stat_inc_match: ;
stat_scripts: 1; unique_packages: ; use_strict: ; 
> parsing index.html
> runtime exec of dynamic include header.inc args ()
> parsing header.inc
> undefing sub Apache::ASP::Demo::_tmp_header_inc code
CODE(0x840ff20)
> compile include header.inc sub _tmp_header_inc
> runtime exec of dynamic include footer.inc args ()
> parsing footer.inc
> already failed to load Apache::Symbol
> undefing sub Apache::ASP::Demo::_tmp_footer_inc code
CODE(0x842b6a0)
> compile include footer.inc sub _tmp_footer_inc
> already failed to load Apache::Symbol
> undefing sub Apache::ASP::Demo::NoCache code
CODE(0x842b6b8)
> compiling into package Apache::ASP::Demo subid
Apache::ASP::Demo::NoCache
> executing NoCache
> %EG is not defined, make sure you copied
./eg/global.asa correctly at (eval 12) line 5.
, /usr/lib/perl5/site_perl/5.005/Apache/ASP.pm line
1229

ASP to Perl Program

  1: package Apache::ASP::Demo; ;; sub
Apache::ASP::Demo::NoCache {  ;;  return(1) unless
$_[0];  ;; no strict;;use vars qw($Application
$Session $Response $Server $Request);;
  2: # split the page in 2 for nice formatting and
english style sorting
  3: my(@col1, @col2);
  4: my @keys = sort keys %EG;
  5: @keys || die("\%EG is not defined, make sure you
copied ./eg/global.asa correctly");
  6: my $half = int(@keys/2);
  7: 
  8: for(my $i =0; $i <= $#keys; $i++) {
  9:if($i < $half) {
 10:push(@col1, $keys[$i]);
 11:} else {
 12:push(@col2, $keys[$i]);
 13:}
 14: } 
 15: $Response->Debug(\@col1, \@col2);
 16: $title = 'Example ASP Scripts';
 17: $Response->Write('
 18: 
 19: '); $Response->Include('header.inc', );
$Response->Write('
 20: 
 21: 
 22: '); while(@col1) { 
 23:my $col1 = shift @col1;
 24:my $col2 = shift @col2;
 25:$Response->Write('
 26:
 27:'); for([$col1, $EG{$col1}], '', [$col2,
$EG{$col2}]) { 
 28:unless(ref $_) { 
 29:print " ";
 30:next;
 31:} 
 32:next unless $_->[0]; # last col / last
row
 33: 
 34:# clean up the descriptions
 35:$_->[1] =~ s/\s*\.\s*$//s;
 36:$_->[1] .= '.';
 37: 
 38:   

how to disable/control mod_perl SSI arg encoding

2000-04-15 Thread S Page

Hey folks, my first post.

I'm writing an Apache SSI as a mod_perl module.  Apache 1.3.4 / mod_perl
1.18.
When Web designers pass it HTML character entities, by the time they
arrive at my module some become 8-bit characters.

For example,


If I `print STDERR` or `$r->print` out args, it contains
header_text=Gen \351 erat \364 r reviews and Oscar® awards.  The
accented characters are encoded as 0xe9 0xf4.

*   What Apache/mod_perl code is doing the entity decoding?
*   What character encoding is it using?   It looks like ISO-8859-1.
*   Is there a way to disable this or control it myself?
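
One workaround I can think of (a sketch only, assuming what reaches the
module really is already-decoded Latin-1 text) would be to re-encode the
high-bit characters back into entities before writing them out:

  use HTML::Entities ();

  sub reencode_arg {
      my $value = shift;
      # turn 0xe9, 0xf4, 0xae etc. back into &eacute; &ocirc; &reg;
      return HTML::Entities::encode_entities($value, "\x80-\xff");
  }

But I'd still like to know where the decoding happens in the first place.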

Thanks, and many many thanks for all the code and the commentary.
--
=S Page,   Macromedia Web engineer
  "Information is deeper than reality" 
  http://www.newscientist.com/ns/980314/features1.html




Re: Modperl/Apache deficiencies... Memory usage.

2000-04-15 Thread Leslie Mikesell

According to [EMAIL PROTECTED]:

> Does anyone know of any program which has been developed like this?
> Basically we'd be turning the "module of apache" portion of mod_perl
> into a front end to the "application server" portion of mod_perl that
> would do the actual processing.

This is basically what you get with the 'two-apache' mode.

> It seems quite logical that something
> like this would have been developed, but possibly not.  The seperation
> of the two components seems like it should be done, but there must be
> a reason why no one has done it yet... I'm afraid this reason would be
> the apache module API doesn't lend itself to this.

The reason it hasn't been done in a threaded model is that perl
isn't stable running threaded yet, and based on the history
of making programs thread-safe, I'd expect this to take at
least a few more years.  But, using a non-mod-perl front
end proxy with ProxyPass and RewriteRule directives to hand
off to a mod_perl backend will likely get you a 10-1 reduction
in backend processes and you already know the configuration
syntax for the second instance.

 Les Mikesell
   [EMAIL PROTECTED]



Re: Modperl/Apache deficiencies... Memory usage.

2000-04-15 Thread Tom Mornini

On Sat, 15 Apr 2000 [EMAIL PROTECTED] wrote:

> This has become an issue for one of the
> companies that I work for, and I noted from monitoring the list that
> some people have apache processes that are upwards of 25Megs, which is
> frankly ridiculous.

1) I've seen them bigger than 25 megs.

2) Do you know about the front-end proxy/back-end mod_perl configuration?

-- Tom Mornini
-- InfoMania Printing and Prepress




Re: mod_perl virtual web hosting

2000-04-15 Thread Mike Lambert

Yes, I am an Infoboard customer. However, we aren't really involved in just
small sites. Our site has over 100 cgi scripts, with two dozen modules being
used by them (yeah, we'll be moving to Apache::Registry in the next version
:). I've never had the need for SSI or anything else besides "simple"
mod_perl, since it does all that we want. But I can see now that since they
have a single apache build, you can't get custom stuff in and can't do
extra stuff like you want with mod_ssi. I dunno why you have to run a proxy
with them, tho. You're limited to what can be changed in a httpd.conf file.
Oh well.

Mike Lambert

- Original Message -
From: "Jesse Wolfe" <[EMAIL PROTECTED]>
To: "Mike Lambert" <[EMAIL PROTECTED]>; <[EMAIL PROTECTED]>
Sent: Thursday, April 13, 2000 2:57 PM
Subject: Re: mod_perl virtual web hosting


> At 10:21 PM 4/12/00 -0400, Mike Lambert wrote:
> >This is my first post on the list, hopefully it's helpful. ;)
> >
> >We've had great success with InfoBoard. We have four mod_perl accounts
set
> >up with them, and we are currently moving to a colocated server that they
> >are hosting for us. They have good experience with mod_perl, and can
easily
> >get you up and running with your own apache server. They have a toll
phone
> >support until 6 or 7 est, but are not open on weekends. When they are
there
> >however, they have a very good response time and user support. (And an
> >extra-cost option for paging them in emergencies, which we've used a few
> >times ;) They run an independant server for each client, so you won't
have
> >any interference with any other mod_perl clients.
> >
> >http://www.infoboard.com/
> >
> >Mike Lambert
>
> Mike,
>
> Very interesting. I am also an infoboard customer, but have not had such
> luck as you describe.  I had a pretty decent setup, then they changed
their
> configuration unannounced, broke my site, then set me up in such a way
that
> I *have* to run proxy... their standard mod_perl/apache setup doesn't
> handle SSI's but they don't have any staff to look at the problem, or so
> the owner tells me.
>
> I suppose if you have just a few scripts to run mod_perl their setup may
be
> adequate... but if you are working on a larger project such as I am you
may
> find yourself without a lot of support.
>
> good luck...
>
> Jesse
>
>
> >
> >- Original Message -
> >From: "Gagan Prakash" <[EMAIL PROTECTED]>
> >To: <[EMAIL PROTECTED]>
> >Sent: Wednesday, April 12, 2000 1:26 PM
> >Subject: mod_perl virtual web hosting
> >
> >
> >> Hello,
> >>
> >> I have been looking for mod_perl virtual web hosting companies who have
> >fast
> >> servers and good infrastructure but the two I have found so far have
> >either
> >> had problems with their mod_perl setups (they installed the module, did
> >not
> >> change apache configs or changed them incorrectly) or have been very
slow.
> >> These two are www.123hostme.com or www.olm.net.
> >>
> >> I would greatly appreciate if somebody could point me in a better
> >direction.
> >>
> >> Thanks
> >> Gagan
> >>
> >>
> >> ** Web App Development  Needs? ***
> >>   Contact OSATech today! http://www.OSATech.com
> >> ***
> >>
> >> - Original Message -
> >> From: "Jason Murphy" <[EMAIL PROTECTED]>
> >> To: "Doug MacEachern" <[EMAIL PROTECTED]>
> >> Cc: <[EMAIL PROTECTED]>
> >> Sent: Wednesday, April 12, 2000 1:09 PM
> >> Subject: Re: $r->args troubles...
> >>
> >>
> >> >
> >> > You would have guessed right. However, the problem was two fold in my
> >> case.
> >> >
> >> > First, I was not calling Apache::Request correctly. The proper method
to
> >> > call Apache was told to me by Doug Kyle (Giving credit where due!).
> >Below
> >> is
> >> > how it is done.
> >> >
> >> > <--- Begin Example
> >> >
> >> > my $r = Apache->request;
> >> > my $apr = Apache::Request->new($r);
> >> >
> >> > my %params = $apr->args;
> >> >
> >> > print $params{"Player"};
> >> >
> >> > < End Example
> >> >
> >> > The 'print $params{"Player"}' would be used to get and print
something
> >> like
> >> > the parameters from the URL of a GET like
> >> > "www.example.com/find_player.pl?Player=Mullen" (Not a real site, dont
> >> > click!).
> >> >
> >> > Second part of my problem was that I had an error in my
Apache::Registry
> >> > setup in Apache.conf or perl.conf (Can't remember where I put it).
The
> >> > script I was running was not being picked up by Apache::Registry and
> >thus
> >> > not working.
> >> >
> >> >  Thanks for everyone's help.
> >> >
> >> > PS. The only reason I say this on the mailing list is to get it in to
> >the
> >> > mailing list archives because I could not my solution there when I
> >looked.
> >> >
> >> >
> >> > From: "Doug MacEachern" <[EMAIL PROTECTED]>
> >> > To: "Jason Murphy" <[EMAIL PROTECTED]>
> >> > Cc: <[EMAIL PROTECTED]>
> >> > Sent: Tuesday, April 11, 2000 8:52 PM
> >> > Subject: Re: $r->args troubles...
> >> >
> >> > > On Fri, 7 Apr 200

Re: Modperl/Apache deficiencies... Memory usage.

2000-04-15 Thread Perrin Harkins

> Each process of apache has
> it's registry which holds the compiled perl scripts in..., a copy of
> each for each process.  This has become an issue for one of the
> companies that I work for, and I noted from monitoring the list that
> some people have apache processes that are upwards of 25Megs, which is
> frankly ridiculous.

I have processes that large, but more than 50% of that is shared through
copy-on-write.

> I wrote a very small perl engine
> for phhttpd that worked within it's threaded paradigm that sucked up a
> neglibible amount of memory which used a very basic version of
> Apache's registry.

Can you explain how this uses less memory than mod_perl doing the same
thing?  Was it just that you were using fewer perl interpreters?  If so, you
need to improve your use of apache with a multi-server setup.  The only way
I could see phttpd really using less memory to do the same work is if you
somehow managed to get perl to share more of its internals in memory.  Did
you?

> What I'm
> thinking is essentially we take the perl engine which has the apache
> registry and all the perl symbols etc., and seperate it into it's own
> process which would could be multithreaded (via pthreads) for multiple
> processor boxes.  (above 2 this would be beneficial probably)  On the
> front side the apache module API would just connect into this other
> process via shared memory pages (shmget et. al), or Unix pipes or
> something like that.

This is how FastCGI, and all the Java servlet runners (JServ, Resin, etc.)
work.  The thing is, even if you run the perl interpreters in a
multi-threaded process, it still needs one interpreter per perl thread and I
don't know how much you'd be able to share between them.  It might not be
any smaller at all.

My suggestion would be to look at the two-server approach for mod_perl, and
if that doesn't work for you look at FastCGI, and if that doesn't work for
you join the effort to get mod_perl working on Apache 2.0 with a
multi-threaded model.  Or just skip the preliminaries and go straight for
the hack value...

- Perrin




[summary] Re: front end proxy and virtual hosts

2000-04-15 Thread Stas Bekman


=head1 Front-end Back-end Proxying with Virtual Hosts

This section explains a configuration setup for proxying your back-end
mod_perl servers when you need to use Virtual Hosts.

The approach is to use a unique port number for each virtual host at the
back-end server, so that you can redirect from the front-end server to
localhost:1234, and to use name-based virtual servers on the front end,
though any technique on the front end will do. 

If you run the front-end and the back-end servers on the same machine
you can prevent any direct outside connections to the back-end server
if you bind tightly to address C<127.0.0.1> (I<localhost>), as you will
see in the following configuration example.

The front-end (light) server configuration:

  <VirtualHost *:80>
    ServerName www.example.com
    ServerAlias example.com
    RewriteEngine On
    RewriteOptions 'inherit'
    RewriteRule \.(gif|jpg|png|txt)$ - [last]
    RewriteRule ^/(.*)$ http://localhost:4077/$1 [proxy]
  </VirtualHost>

  <VirtualHost *:80>
    ServerName foo.example.com
    RewriteEngine On
    RewriteOptions 'inherit'
    RewriteRule \.(gif|jpg|png|txt)$ - [last]
    RewriteRule ^/(.*)$ http://localhost:4078/$1 [proxy]
  </VirtualHost>

The above front-end configuration handles two virtual hosts:
I<www.example.com> and I<foo.example.com>. The two setups are almost
identical.

The front-end server will handle files with the extensions I<.gif>,
I<.jpg>, I<.png> and I<.txt> internally; the rest will be proxified to
be handled by the back-end server.

The only difference between the two virtual hosts settings is that the
former rewrites requests to the port C<4077> at the back-end machine
and the latter to the port C<4078>.

The back-end (heavy) server configuration:

  Port 80

  PerlPostReadRequestHandler My::ProxyRemoteAddr

  Listen 4077
  <VirtualHost 127.0.0.1:4077>
    ServerName www.example.com
    Port 80
    DirectoryIndex index.shtml index.html
  </VirtualHost>

  Listen 4078
  <VirtualHost 127.0.0.1:4078>
    ServerName foo.example.com
    Port 80
    DirectoryIndex index.shtml index.html
  </VirtualHost>

The back-end server can tell which virtual host a request is made to
by checking the port number the request was proxified to, and it uses
the appropriate virtual host section to handle it.

We set S<"Port 80"> so that any redirects don't get sent directly to
the back-end port.

To get the I<real> remote IP addresses from the proxy, the
C<My::ProxyRemoteAddr>
handler is used, based on the C<mod_proxy_add_forward> Apache module.
Prior to mod_perl 1.22 this setting had to be set per virtual
host, since it wasn't inherited by the virtual hosts.
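
A condensed sketch of such a handler (assuming the proxy and the back-end
run on the same host; the real handler can be more elaborate):

  package My::ProxyRemoteAddr;
  use Apache::Constants qw(OK);

  sub handler {
      my $r = shift;
      # only trust the header when the request comes from our own proxy
      return OK unless $r->connection->remote_ip eq '127.0.0.1';
      if (my ($ip) = ($r->header_in('X-Forwarded-For') || '') =~ /([^,\s]+)$/) {
          $r->connection->remote_ip($ip);
      }
      return OK;
  }
  1;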

The following configuration is yet another useful example, showing the
other way around. It specifies what is to be proxified, and then the rest
is served by the front end:

  RewriteEngine on
  RewriteLogLevel   0
  RewriteRule   ^/(perl.*)$  http://127.0.0.1:8052/$1   [P,L]
  RewriteRule   ^proxy:.*   - [F]
  ProxyRequests on
  NoCache   *
  ProxyPassReverse  /  http://www.example.com/

So we don't have to specify a rule for the static objects to be
served by the front end, as we did in the previous example where we
handled files with the extensions I<.gif>, I<.jpg>, I<.png> and I<.txt>
internally.



__
Stas Bekman | JAm_pH--Just Another mod_perl Hacker
http://stason.org/  | mod_perl Guide  http://perl.apache.org/guide 
mailto:[EMAIL PROTECTED]  | http://perl.orghttp://stason.org/TULARC/
http://singlesheaven.com| http://perlmonth.com http://sourcegarden.org
--





Re: front end proxy and virtual hosts

2000-04-15 Thread Matt Carothers



On Mon, 10 Apr 2000, Eric Cholet wrote:

> The front-end light server, serving static requests and proxying
> dynamic requests to a back-end modperl server, is well documented,
> except in the case of virtual hosts. How do you do it?

On the front end:


DocumentRoot /vhosts/customer
ProxyPass /perl/ http://localhost/customer/perl/
ProxyPassReverse /perl/ http://localhost/customer/perl/


On the back end:

DocumentRoot /vhosts
BindAddress 127.0.0.1

<Location /customer/perl>
SetHandler perl-script
PerlHandler Apache::Registry # Or whatever
PerlSendHeader  On
Options +ExecCGI
</Location>

- Matt




Re: Modperl/Apache deficiencies... Memory usage.

2000-04-15 Thread Ken Williams

[EMAIL PROTECTED] (Robert Monical) wrote:
>I am not very knowledgeable but have been lurking on this list for a
>couple of months. The last week (or two) have seen a number of posts
>about having two Apaches. A light weight front end and a mod-perl
>enabled back end.  Since I do not fully understand what folks are
>talking about, these all go in an archive for me to read later. 

Start here:

 http://perl.apache.org/guide/strategy.html#Alternative_architectures_for_ru






RE: [RFC] Transitioning from Apache::Registry to Apache handlers

2000-04-15 Thread Stas Bekman

On Fri, 14 Apr 2000, Chris Nokleberg wrote:

> 
> > Someone has asked how to move from registry scripts to perl handlers, this
> > is my attempt to show in details the process. Comments are welcome.
> 
> In my mind, one of the biggest problems in transitioning from
> Apache::Registry is the added server configuration complexity. Would it be
> possible to add a section on the best way to simplify or eliminate the need
> to modify the conf file for each new handler?  <Perl> sections, etc.?

Of course. Does anyone have a sample <Perl> section handy? I'm still a
plain C<httpd.conf> fan, to be changed soon :)
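
For reference, a minimal <Perl> section might look like this (the module
name and URI below are made up):

  <Perl>
    push @PerlModule, 'My::Handler';
    $Location{'/my-handler'} = {
        SetHandler  => 'perl-script',
        PerlHandler => 'My::Handler',
    };
  </Perl>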


__
Stas Bekman | JAm_pH--Just Another mod_perl Hacker
http://stason.org/  | mod_perl Guide  http://perl.apache.org/guide 
mailto:[EMAIL PROTECTED]  | http://perl.orghttp://stason.org/TULARC/
http://singlesheaven.com| http://perlmonth.com http://sourcegarden.org
--




Re: Modperl/Apache deficiencies... Memory usage.

2000-04-15 Thread Robert Monical

I am not very knowledgeable but have been lurking on this list for a couple 
of months.
The last week (or two) have seen a number of posts about having two 
Apaches. A light weight front end and a mod-perl enabled back end.  Since I 
do not fully understand what folks are talking about, these all go in an 
archive for me to read later. In the application that I inherited, a front 
end feeds into a mod_perl engine for the database portion of the session. 
All static downloads go to another instance, in our case, on separate hardware.

What are your thoughts about that approach as opposed to your idea?


At 12:03 AM 4/15/00, [EMAIL PROTECTED] wrote:
>Modperlers...,
>
>I'd like to start a discussion about the deficiences in Apache/modperl
>and get you feedback with regard to this issue.  The problem as I see
>it is that the process model that Apache uses is very hard on modperl.
>It is very memory inneffecient basically.  Each process of apache has
>it's registry which holds the compiled perl scripts in..., a copy of
>each for each process.  This has become an issue for one of the
>companies that I work for, and I noted from monitoring the list that
>some people have apache processes that are upwards of 25Megs, which is
>frankly ridiculous.
>
>This is not meant to be a flame, and I'd really like to get down to
>the nitty gritty on how we can solve this problem.  Zach Brown wrote
>phhttpd which is a threaded server which can handle a lot more load
>than apache, but the problem is it doesn't have the features that
>Apache has, and it's not going to catch up any time soon so I think
>it's not going to be the cure-all.  I wrote a very small perl engine
>for phhttpd that worked within it's threaded paradigm that sucked up a
>neglibible amount of memory which used a very basic version of
>Apache's registry.  Once again though it didn't have the feature set
>that Apache/modperl has.  Now to address the issue: I think we have a lot of
>code in Modperl that is basically awesome.  Not to mention the fact
>that Apache itself has a lot of modules and other things which are
>quite usefull.  However I think if it were possible to divorce the
>actually perl engine from the Apache process we could solve this memory
>usage problem.
>
>Basically heres what I'm thinking might be possible, but if it's not
>just let me know.  (Well, I know it's possible, but I mean how much
>work would it take to institute, and has someone else worked on this,
>or taken a look at how much work we'd be talking about)  What I'm
>thinking is essentially we take the perl engine which has the apache
>registry and all the perl symbols etc., and seperate it into it's own
>process which would could be multithreaded (via pthreads) for multiple
>processor boxes.  (above 2 this would be beneficial probably)  On the
>front side the apache module API would just connect into this other
>process via shared memory pages (shmget et. al), or Unix pipes or
>something like that.  The mod_perl process would have a work queue
>that the Apache processes could add work to via our front end API.
>The work threads inside of that mod_perl process would take work
>"orders" out of the work queue and process them and send the result
>back to the waiting apache process.  (Maybe just something as simple
>as a blocking read on a pipe coming out of the mod_perl process...
>this would keep down context switching issues and other nasty bits)
>
>One of my concerns is that maybe the apache module API is simply too
>complex to pull something like this off.  I don't know, but it seems
>like it should be able to handle something like this.
>
>Does anyone know of any program which has been developed like this?
>Basically we'd be turning the "module of apache" portion of mod_perl
>into a front end to the "application server" portion of mod_perl that
>would do the actual processing.  It seems quite logical that something
>like this would have been developed, but possibly not.  The seperation
>of the two components seems like it should be done, but there must be
>a reason why no one has done it yet... I'm afraid this reason would be
>the apache module API doesn't lend itself to this.
>
>Well, thanks to everyone in advance for their thoughts/comments...
>Shane Nay.


Have a great day!

--Robert Monical
--Director of CRM Development
[EMAIL PROTECTED]


"The Truth is Out There"




ANNOUNCE: Apache::AuthCookie 2.007

2000-04-15 Thread Ken Williams

The URL

   
http://forum.swarthmore.edu/~ken/modules/archive/Apache-AuthCookie-2.007.tar.gz

has entered CPAN as

  file: $CPAN/authors/id/KWILLIAMS/Apache-AuthCookie-2.007.tar.gz
  size: 16364 bytes
   md5: 1a2a45007123e8583467668297ebd767


Version: 2.007  Date: 2000/04/15 15:27:02
   If the browser sends a cookie but it's not one related to our
   authentication, we formerly sent a blank cookie to the authentication
   methods.  Now we act as if no cookie was sent.
   [[EMAIL PROTECTED] (Alan Sparks)]
   
   Fixed a server error that occurred when a certain user was required,
   but a different valid user was logged in.
   [[EMAIL PROTECTED] (Eduardo Fujii)]
   
   Added a couple more debug statements that can help figure out what's
   happening when your auth isn't working.
   
   Improved some of the docs.
   
   Added some tricks to Makefile.PL to make my life easier.
   
   Changed the action of the example login forms from LOGIN to /LOGIN.
   [[EMAIL PROTECTED] (Michael)]



  ------
  Ken Williams Last Bastion of Euclidity
  [EMAIL PROTECTED]The Math Forum





[correction] Benchmarking Apache::Registry and Perl Content Handler

2000-04-15 Thread Stas Bekman

Well, following the guardian Gunther's suggestions I've tried to rerun these
tests as well, preloading the script to make the benchmark fair. This has
improved the results significantly, shortening the gaps. 

But read both sets of tests and try to explain the phenomenon that you
will see. You will find it at the end, under the [ReaderMETA] tag. Thanks!




=head1 Benchmarking Apache::Registry and Perl Content Handler

=head2 Light (Empty) Code

First let's see the overhead that C<Apache::Registry> adds. In order to do
that we will use an almost empty script that only sends a basic
header and one word as content.

The I<registry.pl> script running under C<Apache::Registry>:

  benchmarks/registry.pl
  --
  use strict;
  print "Content-type: text/plain\r\n\r\n";
  print "Hello";

The Perl Content handler:

  Benchmark/Handler.pm
  
  package Benchmark::Handler;
  use Apache::Constants qw(:common);
  
  sub handler {
    my $r = shift;
    $r->send_http_header('text/html');
    $r->print("Hello");
    return OK;
  }
  1;

with settings:

  PerlModule Benchmark::Handler
  <Location /benchmark_handler>
    SetHandler perl-script
    PerlHandler Benchmark::Handler
  </Location>

so we get C<Benchmark::Handler> preloaded.

We will use C<Apache::RegistryLoader> to preload the script as
well, so the benchmark will be fair and only the processing time will
be measured. In the startup file we add:

  use Apache::RegistryLoader ();
  Apache::RegistryLoader->new->handler(
  "/perl/benchmarks/registry.pl",
   "/home/httpd/perl/benchmarks/registry.pl");

And if we check the perl-status page (
http://localhost/perl-status?rgysubs ), we see the listing of the
already compiled scripts:

  Apache::ROOT::perl::benchmarks::registry_2epl

So now we can proceed with the benchmark:

  % ab -n 1000 -c 10 http://localhost/perl/benchmarks/registry.pl
  
  Time taken for tests:   16.148 seconds
  Requests per second:61.93
  Connnection Times (ms)
min   avg   max
  Connect:0 2   202
  Processing:81   15860
  Total: 81   160   262

  % ab -n 1000 -c 10 http://localhost/benchmark_handler
  
  Time taken for tests:   5.097 seconds
  Requests per second:196.19
  Connnection Times (ms)
min   avg   max
  Connect:0 0 3
  Processing:4050   237
  Total: 4050   240

So we can see that the average added overhead is about:

  160 - 50 = 110 milli-seconds

per script.

=head2 Heavy Code

Of course this overhead is insignificant when the code itself is
significantly heavier and slower. Let's leave the above code examples
unmodified but add some CPU intensive processing operation (it could
also be an IO operation or a database query.)

  my $x = 100;
  my $y = log ($x ** 100)  for (0..1);

  % ab -n 1000 -c 10 http://localhost/perl/benchmarks/registry.pl
  
  Time taken for tests:   82.614 seconds
  Requests per second:12.10
  Connnection Times (ms)
min   avg   max
  Connect:0 3   670
  Processing:   187   819  1211
  Total:187   822  1881

  % ab -n 1000 -c 10  http://localhost/benchmark_handler
  
  Time taken for tests:   15.000 seconds
  Requests per second:66.67
  Connnection Times (ms)
min   avg   max
  Connect:0 2   112
  Processing:22   147   770
  Total: 22   149   882

The processing time delta has grown to 673 milli-seconds (822-149).

[ReaderMETA]: Can anyone explain this phenomenon? It's not clear
to me why adding the same CPU intensive code to the two handlers under
test enlarges the delta of the average processing time between the
two handlers. I'd expect to see the same delta (of 110 msec) in this
case, but that's not what's happening. Any ideas?

Notice that the hardware used in this test is not important since
what's important is the delta (because we are interested in the
comparison and not the absolute values).

The SW that was used: Apache/1.3.10-dev, (Unix) mod_perl/1.21_01-dev,
Perl5.005_03 for i386-linux.

The relevant server configuration:

  MinSpareServers 10
  MaxSpareServers 10
  StartServers 10
  MaxClients 20
  MaxRequestsPerChild 1



__
Stas Bekman | JAm_pH--Just Another mod_perl Hacker
http://stason.org/  | mod_perl Guide  http://perl.apache.org/guide 
mailto:[EMAIL PROTECTED]  | http://perl.org   http://stason.org/TULARC/
http://singlesheaven.com| http://perlmonth.com http://sourcegarden.org
--




Re: Modperl/Apache deficiencies... Memory usage.

2000-04-15 Thread Ken Williams

[EMAIL PROTECTED] wrote:
>Modperlers...,
>
>I'd like to start a discussion about the deficiences in Apache/modperl
>and get you feedback with regard to this issue.  The problem as I see
>it is that the process model that Apache uses is very hard on modperl.
>It is very memory inneffecient basically.  Each process of apache has
>it's registry which holds the compiled perl scripts in..., a copy of
>each for each process.  This has become an issue for one of the
>companies that I work for, and I noted from monitoring the list that
>some people have apache processes that are upwards of 25Megs, which is
>frankly ridiculous.


1) Are you preloading the scripts with RegistryLoader?  That puts them in the
parent process, so the memory will be shared by the children.  Each child will
still [seem to] be 25 megs, but the children will have a lot of overlap.

2) Are you using the two-server model, backend & frontend?  It's a must if you
want to make efficient use of your memory (a minimal config sketch follows
after this list).

3) If you're already doing these things and you still aren't satisfied, perhaps
mod_perl isn't for you and you want to look at FastCGI or a similar project. 
I've never had occasion to use it.  You'll no longer have access to the Apache
API, but it sounds like you're not using that anyway.  
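
To illustrate the front-end half of the two-server model (the port number
and URL prefix below are only an example, not taken from this thread), the
lightweight front-end Apache can hand dynamic URLs to a mod_perl back-end
listening on, say, port 8080 with mod_proxy:

  # front-end httpd.conf: serve static files locally,
  # pass the dynamic /perl/ URLs to the mod_perl back-end
  ProxyPass        /perl/ http://127.0.0.1:8080/perl/
  ProxyPassReverse /perl/ http://127.0.0.1:8080/perl/

The front-end children stay small because they never load Perl; only the
few back-end children carry the interpreter.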


  ------
  Ken Williams Last Bastion of Euclidity
  [EMAIL PROTECTED]   The Math Forum





[correction] Benchmarking CGI.pm and Apache::Request

2000-04-15 Thread Stas Bekman

Well, Gunther has pointed out that the benchmark was unfair, since
CGI.pm's methods had not been precompiled. I've also preloaded both
scripts this time to improve the numbers, and unloaded the system a bit
while running the tests. Here is the corrected version:

=head1 Benchmarking CGI.pm and Apache::Request

Let's write two registry scripts that use C<CGI.pm> and
C<Apache::Request> to process the form's input and print it out.

  benchmarks/cgi_pm.pl
  
  use strict;
  use CGI;
  my $q = new CGI;
  print $q->header('text/plain');
  print join "\n", map {"$_ => ".$q->param($_) } $q->param;

  benchmarks/apache_request.pl
  
  use strict;
  use Apache::Request ();
  my $r = Apache->request;
  my $q = Apache::Request->new($r);
  $r->send_http_header('text/plain');
  print join "\n", map {"$_ => ".$q->param($_) } $q->param;

We preload both modules that we are about to benchmark in the
I<startup.pl> file:

  use Apache::Request ();
  use CGI  qw(-compile :all);

We will preload both scripts as well:

  use Apache::RegistryLoader ();
  Apache::RegistryLoader->new->handler(
  "/perl/benchmarks/cgi_pm.pl",
   "/home/httpd/perl/benchmarks/cgi_pm.pl");
  Apache::RegistryLoader->new->handler(
  "/perl/benchmarks/apache_request.pl",
   "/home/httpd/perl/benchmarks/apache_request.pl");

Now let's benchmark the two:

  % ab -n 1000 -c 10 \
    'http://localhost/perl/benchmarks/cgi_pm.pl?a=b&c=+k+d+d+f&d=asf&as=+1+2+3+4'

  Time taken for tests:   23.950 seconds
  Requests per second:    41.75
  Connnection Times (ms)
               min   avg   max
  Connect:       0     0    45
  Processing:  204   238   274
  Total:       204   238   319

  % ab -n 1000 -c 10 \
    'http://localhost/perl/benchmarks/apache_request.pl?a=b&c=+k+d+d+f&d=asf&as=+1+2+3+4'

  Time taken for tests:   18.406 seconds
  Requests per second:    54.33
  Connnection Times (ms)
               min   avg   max
  Connect:       0     0    32
  Processing:  156   183   202
  Total:       156   183   234

Apparently the latter script, which uses C<Apache::Request>, is about 23%
faster. As the input grows larger, the speedup percentage grows as well.

Again, this benchmark shows the real timing of the input processing
only; when the script itself is much heavier, the overhead of using
C<CGI.pm> can be insignificant.

__
Stas Bekman | JAm_pH--Just Another mod_perl Hacker
http://stason.org/  | mod_perl Guide  http://perl.apache.org/guide 
mailto:[EMAIL PROTECTED]  | http://perl.org   http://stason.org/TULARC/
http://singlesheaven.com| http://perlmonth.com http://sourcegarden.org
--




Modperl/Apache deficiencies... Memory usage.

2000-04-15 Thread shane

Modperlers...,

I'd like to start a discussion about the deficiencies in Apache/mod_perl
and get your feedback with regard to this issue.  The problem as I see
it is that the process model that Apache uses is very hard on mod_perl:
it is basically very memory inefficient.  Each Apache process has its
own registry, which holds the compiled Perl scripts: a copy of each for
each process.  This has become an issue for one of the companies that I
work for, and I noted from monitoring the list that some people have
Apache processes that are upwards of 25 megs, which is frankly ridiculous.

This is not meant to be a flame, and I'd really like to get down to
the nitty-gritty of how we can solve this problem.  Zach Brown wrote
phhttpd, which is a threaded server that can handle a lot more load
than Apache, but the problem is it doesn't have the features that
Apache has, and it's not going to catch up any time soon, so I think
it's not going to be the cure-all.  I wrote a very small Perl engine
for phhttpd that worked within its threaded paradigm and sucked up a
negligible amount of memory, using a very basic version of
Apache's registry.  Once again though, it didn't have the feature set
that Apache/mod_perl has.  Now to address the issue: I think we have a lot of
code in mod_perl that is basically awesome.  Not to mention the fact
that Apache itself has a lot of modules and other things which are
quite useful.  However, I think if it were possible to divorce the
actual Perl engine from the Apache process we could solve this memory
usage problem.

Basically here's what I'm thinking might be possible, but if it's not,
just let me know.  (Well, I know it's possible, but I mean how much
work would it take to implement, and has someone else worked on this,
or taken a look at how much work we'd be talking about?)  What I'm
thinking is essentially that we take the Perl engine, which has the Apache
registry and all the Perl symbols etc., and separate it into its own
process, which could be multithreaded (via pthreads) for multiprocessor
boxes.  (Above 2 processors this would probably be beneficial.)  On the
front side the Apache module API would just connect into this other
process via shared memory pages (shmget et al.), Unix pipes, or
something like that.  The mod_perl process would have a work queue
that the Apache processes could add work to via our front-end API.
The worker threads inside of that mod_perl process would take work
"orders" out of the work queue, process them, and send the result
back to the waiting Apache process.  (Maybe just something as simple
as a blocking read on a pipe coming out of the mod_perl process...
this would keep down context switching issues and other nasty bits.)
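
Purely as a sketch of the shape being described here (the socket path and
the one-line protocol are invented for illustration, this is not code from
any existing project), the "application server" side might look something
like this:

  use strict;
  use Socket;
  use IO::Socket::UNIX;

  # toy back-end: accept work over a Unix-domain socket, run a
  # (pretend) registry handler, and send the result back
  my $path = '/tmp/perl-backend.sock';
  unlink $path;
  my $listener = IO::Socket::UNIX->new(
      Type   => SOCK_STREAM,
      Local  => $path,
      Listen => 5,
  ) or die "can't listen on $path: $!";

  while (my $conn = $listener->accept) {
      chomp(my $uri = <$conn>);          # front end writes one URI per line
      print $conn handle_request($uri);  # ... and reads the response back
      close $conn;
  }

  sub handle_request {
      my $uri = shift;
      # stand-in for "look up the compiled script in the registry and run it"
      return "Content-type: text/plain\r\n\r\nhandled $uri\n";
  }

The Apache-side module would then be little more than "write the request,
block on the read", which is the cheap front end described above.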

One of my concerns is that maybe the apache module API is simply too
complex to pull something like this off.  I don't know, but it seems
like it should be able to handle something like this.

Does anyone know of any program which has been developed like this?
Basically we'd be turning the "module of apache" portion of mod_perl
into a front end to the "application server" portion of mod_perl that
would do the actual processing.  It seems quite logical that something
like this would have been developed, but possibly not.  The separation
of the two components seems like it should be done, but there must be
a reason why no one has done it yet... I'm afraid that reason may be that
the Apache module API doesn't lend itself to this.

Well, thanks to everyone in advance for their thoughts/comments...
Shane Nay.



[Slightly OT] IPC::Open3 broken in mod_perl/perl 5.6.0?

2000-04-15 Thread Michael J Schout

Sorry if this is slightly off topic.

I seem to have run into problems using IPC::Open3 under mod_perl 1.22 and perl
5.6.0.  This problem only seems to have cropped up after I upgraded from perl
5.005 to perl 5.6.0.  What happens is I have an exception handler that opens
gpg and uses gpg to encrypt some data and email it to me in the case of an
exception.  The code looks like this:

my $input  = new IO::Handle;
my $output = new IO::Handle;
my $error  = new IO::Handle;

my $cmd = "gpg --homedir /etc/httpd/perl/gkgnsi/gpg -r $to -ea";
my $pid = IPC::Open3::open3($input, $output, $error, $cmd);

This worked fine when I was running perl 5.005, but now it dies here, and the
apache error_log shows:

[error] Can't locate object method "OPEN" via package "Apache" at
/usr/lib/perl5/5.6.0/IPC/Open3.pm line 132.

Strange...

Has anyone seen anything like this?  Anyone have any ideas?
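
One thing that may help narrow it down (this is only a diagnostic sketch,
not a fix): under mod_perl the standard handles are tied to the Apache
class, and open3()'s juggling of STDIN/STDOUT is a common way to hit a
"method ... via package Apache" error. Something like this inside a
handler shows which handles are tied:

  # diagnostic only: report which standard handles are tied, and to what class
  warn "STDIN  is tied to ", ref(tied *STDIN),  "\n" if tied *STDIN;
  warn "STDOUT is tied to ", ref(tied *STDOUT), "\n" if tied *STDOUT;
  warn "STDERR is tied to ", ref(tied *STDERR), "\n" if tied *STDERR;

If the tied handles turn out to be involved, that at least separates a
mod_perl interaction from a plain perl 5.6.0 regression.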

Thanks.
Mike




Re: Problem with Apache::SIG

2000-04-15 Thread Stas Bekman

On Wed, 12 Apr 2000 [EMAIL PROTECTED] wrote:

> Hi All,
> 
> Recently I installed Apache-1.3.12 with mod_perl-1.22. Standard
> installation. Everything seemed to work great.
> 
> I'm using the directive
> PerlFixupHandler Apache::SIG
> 
> because you have some 'alive' scripts that need to be killed if
> the user closes his browser.
> 
> Well, everything seems to work fine, but in Apache error_log
> we get the message:
> 
> [Mon Apr 10 22:27:01 2000] [error]  at
> /usr/lib/perl5/site_perl/5.005/i386-linux
> /Apache/SIG.pm line 31.
> 
> Line 31 is Apache::exit($s);
> 
> What is wrong ?

Try to install the die call tracer in the startup file to reveal where the
error comes from:

  use Carp qw(verbose);
  $SIG{__DIE__} = \&Carp::confess;

Use it only to trace this problem, and remove this setting when you are done.
You might want to read Matt Sergeant's explanation from a few weeks ago of
the problems you might get into when using __DIE__. The new version
of the Guide will include it.
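
If you want the tracer confined to just the suspect code rather than
installed globally, 'local' keeps it scoped to the enclosing block
(just a sketch of the idiom):

  use Carp ();
  {
      local $SIG{__DIE__} = \&Carp::confess;
      # ... only the code you suspect of dying goes here ...
  }

That avoids leaving a global __DIE__ handler installed once you are done.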

__
Stas Bekman | JAm_pH--Just Another mod_perl Hacker
http://stason.org/  | mod_perl Guide  http://perl.apache.org/guide 
mailto:[EMAIL PROTECTED]  | http://perl.org   http://stason.org/TULARC/
http://singlesheaven.com| http://perlmonth.com http://sourcegarden.org
--




Please ignore my previous question

2000-04-15 Thread Gagan Prakash


** Web App Development  Needs? ***
  Contact OSATech today! http://www.OSATech.com
***

- Original Message -
From: "Gagan Prakash" <[EMAIL PROTECTED]>
To: "modperl (E-mail)" <[EMAIL PROTECTED]>
Sent: Saturday, April 15, 2000 4:49 AM
Subject: Httpd.conf and mod_perl as DSO


> Hi,
>
> I've just gotten a server setup with iserver and have installed mod_perl as
> per instructions however I am at a loss to figure out where my mod_perl
> scripts are supposed to reside. As per iserver directions I have added
> LoadModule perl_module modules/mod_perl.so
> to my httpd.conf. Apache is coming up fine and the Header line is telling me
> that mod_perl was successfully loaded.
>
> However, I am unable to figure out how to specify the directory where my
> files should be picked up and processed as mod_perl ones. Would greatly
> appreciate help.
>
> Thanks
> Gagan
> ** Web App Development  Needs? ***
>   Contact OSATech today! http://www.OSATech.com
> ***
>
> - Original Message -
> From: "Robert Jenks" <[EMAIL PROTECTED]>
> To: "'Ken Y. Clark'" <[EMAIL PROTECTED]>; "modperl (E-mail)"
> <[EMAIL PROTECTED]>
> Sent: Saturday, April 15, 2000 1:36 AM
> Subject: RE: Compiler errors...
>
>
> > Thanks Ken!  It worked like a charm!
> >
> > -Robert
> >
>




Httpd.conf and mod_perl as DSO

2000-04-15 Thread Gagan Prakash

Hi,

I've just gotten a server setup with iserver and have installed mod_perl as
per the instructions; however, I am at a loss to figure out where my mod_perl
scripts are supposed to reside. As per iserver directions I have added
LoadModule perl_module modules/mod_perl.so
to my httpd.conf. Apache is coming up fine and the Header line is telling me
that mod_perl was successfully loaded.

However, I am unable to figure out how to specify the directory where my
files should be picked up and processed as mod_perl ones. Would greatly
appreciate help.
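
For illustration only (the /perl alias and the filesystem path below are
made up, not taken from this mail), the usual way to hand a URL space to
Apache::Registry looks something like:

  Alias /perl/ /home/httpd/perl/
  <Location /perl>
      SetHandler perl-script
      PerlHandler Apache::Registry
      Options +ExecCGI
      PerlSendHeader On
  </Location>

Scripts under /home/httpd/perl/ are then compiled and cached by mod_perl
instead of being run as plain CGI.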

Thanks
Gagan
** Web App Development  Needs? ***
  Contact OSATech today! http://www.OSATech.com
***

- Original Message -
From: "Robert Jenks" <[EMAIL PROTECTED]>
To: "'Ken Y. Clark'" <[EMAIL PROTECTED]>; "modperl (E-mail)"
<[EMAIL PROTECTED]>
Sent: Saturday, April 15, 2000 1:36 AM
Subject: RE: Compiler errors...


> Thanks Ken!  It worked like a charm!
>
> -Robert
>