>
> The same thing happens to me running mod_perl with Apache
> and PostgreSQL.  I've kept the resources down by lowering
> MinSpareServers, MaxSpareServers, and ...uh, I forget the
> setting, but it's for number of clients accepted before a
> child server dies (MaxClients, maybe?).  But, I also
> don't run a major server at this particular site.
>
> One thing you may consider is running two Apaches side
> by side: one to serve mostly static documents, which you
> configure "normally", and a second to handle your
> database requests, where you set MinSpareServers, etc. to
> values where they don't create too many postgres processes.
> This is a bit of extra work, but probably worth it,
> especially if you serve lots of static documents and
> only a few database requests.

    Actually, I'm making some progress with a completely different
    approach.

    I've written a small Apache module which can take over a
    complete virtual host. This module connects permanently to a
    content server and asks it to translate requests. The content
    server is a multi-process TclX application that looks up in a
    PostgreSQL database what to do with the URI.
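
    The lookup step might be sketched like this. This is only an
    illustration, not the actual TclX code: SQLite stands in for
    PostgreSQL, and the `pages` table layout is invented.

```python
import sqlite3

# Stand-in for the PostgreSQL database; the table layout is an assumption.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE pages (uri TEXT PRIMARY KEY, handler TEXT, body TEXT)")
db.execute("INSERT INTO pages VALUES ('/index.html', 'static', '<html>...</html>')")
db.execute("INSERT INTO pages VALUES ('/now', 'tcl-script', 'puts [clock seconds]')")

def translate(uri):
    """What the content server does for each URI the Apache module forwards:
    ask the database what to do with it."""
    row = db.execute("SELECT handler, body FROM pages WHERE uri = ?",
                     (uri,)).fetchone()
    if row is None:
        return ("404", None)
    return row

print(translate("/index.html"))   # ('static', '<html>...</html>')
```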

    Some interesting details:

    o   The content server has a fixed number of worker processes,
        resulting in a fixed number of database connections. No
        need to lower Apache's performance.
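
        The fixed-pool idea can be sketched as follows. The names
        and the queue protocol are mine, and threads stand in for
        the TclX worker processes; the point is only that the pool
        size, not the number of Apache children, bounds the number
        of database connections.

```python
import queue
import threading

NUM_WORKERS = 4            # fixed number of workers -> fixed number of DB connections
requests = queue.Queue()   # Apache module pushes URIs here
results = {}

def worker(wid):
    # Each worker would hold exactly one permanent PostgreSQL
    # connection; a string stands in for it here.
    conn = f"db-connection-{wid}"
    while True:
        uri, done = requests.get()
        if uri is None:            # shutdown sentinel
            break
        results[uri] = f"{uri} served via {conn}"
        done.set()

threads = [threading.Thread(target=worker, args=(i,)) for i in range(NUM_WORKERS)]
for t in threads:
    t.start()

# However many Apache children exist, at most NUM_WORKERS requests
# touch the database at once; the rest simply wait in the queue.
done = threading.Event()
requests.put(("/index.html", done))
done.wait()
print(results["/index.html"])

for _ in threads:
    requests.put((None, None))
for t in threads:
    t.join()
```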

    o   Everything can be stored in the database: HTML, HTML with
        Tcl scripting, images, other binaries and user data. No
        other scripting languages so far, but I'm at the very
        beginning.

    o   Images need not be static. They can be drawn by a wish
        shell if there is an Xvfb running in the background. What
        I have so far works up to animated GIFs drawn in canvas
        widgets. These images take arguments in their href!

    o   Caching is very finely tunable. When implementing a Web
        application based on CGI, you usually tell browsers and
        proxies not to cache the output by expiring it
        immediately. The result is that your server is asked
        again and again for the same pages.

        The content server can cache the output of a script, and
        triggers can easily be set up so that changes in the user
        data invalidate the server cache. Thus, browsers will
        still ask just as often for the content, but the script
        producing it will only run when the page/image changes.
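
        That caching scheme might be sketched like this (all names
        are mine, and an ordinary function call stands in for what
        a database trigger would do): the script runs only on a
        cache miss, and a data change clears the cached entry.

```python
cache = {}
run_count = 0

def expensive_script(uri):
    """Stand-in for the Tcl script that renders a page from user data."""
    global run_count
    run_count += 1
    return f"rendered {uri} (run #{run_count})"

def serve(uri):
    # Browsers keep asking, but the script only runs when not cached.
    if uri not in cache:
        cache[uri] = expensive_script(uri)
    return cache[uri]

def on_user_data_change(uri):
    # What a trigger on the user data would do: drop the cache entry.
    cache.pop(uri, None)

serve("/page")
serve("/page")                # second request is a cache hit
on_user_data_change("/page")  # data changed, entry invalidated
serve("/page")                # script runs again
print(run_count)              # 2
```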

    o   One page can be a collection of smaller includes. The
        includes themselves are not accessible from the outside,
        but each individual include is cacheable. The content
        server cache keeps a cross-reference, so the cache entry
        of one page/include is invalidated whenever another one
        it includes is.
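
        The cross-reference can be sketched as a reverse map from
        each include to the cached pages that used it, so that
        invalidating the include cascades upward. The data
        structure is my guess at the idea, not the actual
        implementation.

```python
cache = {}          # name -> rendered output
included_by = {}    # include name -> set of pages whose cached output used it

def render(name, includes=()):
    """Render an entry from its already-cached includes and record the crossref."""
    cache[name] = f"<{name}>" + "".join(cache[i] for i in includes) + f"</{name}>"
    for inc in includes:
        included_by.setdefault(inc, set()).add(name)

def invalidate(name):
    cache.pop(name, None)
    # Cascade: any cached entry whose output contained this one is stale too.
    for parent in included_by.get(name, set()):
        invalidate(parent)

render("header")
render("page", includes=["header"])
invalidate("header")          # the page that included it goes stale as well
print("page" in cache)        # False
```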

    Interested?


Jan

--

#======================================================================#
# It's easier to get forgiveness for being wrong than for being right. #
# Let's break this rule - forgive me.                                  #
#========================================= [EMAIL PROTECTED] (Jan Wieck) #

