Re: mod_perl from cvs on windows

2000-01-09 Thread Matt Sergeant

On Sun, 09 Jan 2000, Ask Bjoern Hansen wrote:
 Hi,
 
 It would be very cool if we could get a new release out soon, but I am not
 sure the latest mod_perl from cvs has been tested on windows? Could
 anyone give an "ok" report? We should know about both "standard win32" perl
 and ActiveState Perl. (Jochen Wiedmann provided patches for making it
 compile on ActiveState Perl.)

compile but not run - I believe.

-- 
Matt/

Details: FastNet Software Ltd - XML, Perl, Databases.
Tagline: High Performance Web Solutions
Web Sites: http://come.to/fastnet http://sergeant.org
Available for Consultancy, Contracts and Training.



Search engines with mod_perl

2000-01-09 Thread Bill Moseley

I'm looking for a simple search engine server to use with mod_perl.

Does anyone have experience or recommendations on what works well (or
doesn't work well) when developing with mod_perl?  Phrase searching, word
truncation, word stemming, and thesaurus lookup are desired features.

This would be to manage over 10,000 web 'references' - a typical web search
engine.

Thanks,

Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Search engines with mod_perl

2000-01-09 Thread Matt Sergeant

On Sun, 09 Jan 2000, Bill Moseley wrote:
 I'm looking for a simple search engine server to use with mod_perl.
 
 Does anyone have experience or recommendations on what works well (or
 doesn't work well) when developing with mod_perl?  Phrase searching, word
 truncation, word stemming, and thesaurus lookup are desired features.
 
 This would be to manage over 10,000 web 'references' - a typical web search
 engine.

Any particular reason it has to work with mod_perl? If not - "ht:dig" works
very well in my experience.

-- 
Matt/

Details: FastNet Software Ltd - XML, Perl, Databases.
Tagline: High Performance Web Solutions
Web Sites: http://come.to/fastnet http://sergeant.org
Available for Consultancy, Contracts and Training.



Re: Search engines with mod_perl

2000-01-09 Thread Bill Moseley

At 04:21 PM 1/9/00 +, Matt Sergeant wrote:
On Sun, 09 Jan 2000, Bill Moseley wrote:
 I'm looking for a simple search engine server to use with mod_perl.
 
 Does anyone have experience or recommendations on what works well (or
 doesn't work well) when developing with mod_perl?  Phrase searching, word
 truncation, word stemming, and thesaurus lookup are desired features.
 
 This would be to manage over 10,000 web 'references' - a typical web search
 engine.

Any particular reason it has to work with mod_perl?

Something that doesn't require a fork for each request.

 If not - "ht:dig" works very well in my experience.

I didn't think ht://Dig did phrase searches.  I also wasn't clear if you
could define fields to limit a search with ht://Dig.

I'm looking to update the site: http://lii.org

Thanks,


Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Search engines with mod_perl

2000-01-09 Thread siberian

You could try UDMSearch. I run it as a registry script wrapped in
Apache::SSI and it works pretty well. Not sure if it meets all your
requirements or not, but it's worth a gander.

http://mysearch.udm.net/

John-

On Sun, 9 Jan 2000, Bill Moseley wrote:

 At 04:21 PM 1/9/00 +, Matt Sergeant wrote:
 On Sun, 09 Jan 2000, Bill Moseley wrote:
  I'm looking for a simple search engine server to use with mod_perl.
  
  Does anyone have experience or recommendations on what works well (or
  doesn't work well) when developing with mod_perl?  Phrase searching, word
  truncation, word stemming, and thesaurus lookup are desired features.
  
  This would be to manage over 10,000 web 'references' - a typical web search
  engine.
 
 Any particular reason it has to work with mod_perl?
 
 Something that doesn't require a fork for each request.
 
  If not - "ht:dig" works very well in my experience.
 
 I didn't think ht://Dig did phrase searches.  I also wasn't clear if you
 could define fields to limit a search with ht:://Dig.
 
 I'm looking to update the site: http://lii.org
 
 Thanks,
 
 
 Bill Moseley
 mailto:[EMAIL PROTECTED]
 



Perl modules in apache configuration

2000-01-09 Thread tarkhil

Hello!

I'm trying to configure httpd.conf using Perl sections (mod_macro is
not enough for me), but the result is weird. 

The weirdest thing is that the Perl sections randomly don't execute! I
have no experience (yet) with Perl configuration modules, so I don't
understand where to start tracking this down.

The mod_perl developers' guide didn't help, and neither did perldoc mod_perl :-(

-- 
Alexander B. Povolotsky[ICQ 18277558]
[2:5020/145][[EMAIL PROTECTED]]



RE: URL Redirection

2000-01-09 Thread Gerald Richter




  
 As I was having problem to set up Embperl as CGI, I found out a way to
 workaround that: to call HTML::Embperl::Execute from my perl script and
 pass the html document as parameter. This works OK.

That is, like Tom and Vivek pointed out, cgiwrapper eats the PATH_INFO
and PATH_TRANSLATED.

 What I want to achieve is to have some session info (basically I want to
 create my own session persistent data for various statistical purposes)
 be part of the URL. (I mean the browser address should show something like
 http://my.domain.com/myplscript.pl?session=1&id=21). I am using a generic
 perl script to manage all pages and using embperl to link each together.
 My problem is that the "Location: myscript.pl?session=1&id=21" is not
 showing fully in the address line, but is executing properly. This will
 give me a problem if the user reloads or moves backwards from another
 page. I want the address line to show the full url including the
 parameters when I redirect, so that reloads and backwards will work
 properly.

Sorry, but I don't understand what you mean.

Gerald


-------------------------------------------------------------
Gerald Richter    ecos electronic communication services gmbh
Internetconnect * Webserver/-design/-datenbanken * Consulting
Post:       Tulpenstrasse 5         D-55276 Dienheim b. Mainz
E-Mail:     [EMAIL PROTECTED]        Voice:    +49 6133 925151
WWW:        http://www.ecos.de      Fax:      +49 6133 925152
-------------------------------------------------------------



Re: Perl modules in apache configuration

2000-01-09 Thread Eric

On Sun, Jan 09, 2000 at 08:47:04PM +0300, [EMAIL PROTECTED] wrote:
 Hello!
 
 I'm trying to configure httpd.conf using Perl sections (mod_macro is
 not enough for me), but the result is weird. 
 
 The most weird thing is that Perl sections randomly doesn't execute! I
 have no experience (yet) with Perl configuration modules, so I don't
 understand where to start tracking.
 
 mod_perl developers' guide didn't help, as perldoc mod_perl :-(

Do you have a specific example of your config, and what doesn't work,
that you could post maybe? It's hard to help without specifics.
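
For instance, a stripped-down section like the one below (generic example,
obviously not your config) is the kind of thing that's easy to reason about,
and dumping what mod_perl actually parsed often shows why a section silently
did nothing:

<Perl>
    # generic example values -- substitute your own
    push @PerlModule, 'Apache::Status';
    $MaxClients = 40;
    $VirtualHost{'10.0.0.1'} = {
        ServerName   => 'www.example.com',
        DocumentRoot => '/www/example',
    };

    # uncomment to dump the configuration this section generated
    # (Apache::PerlSections ships with mod_perl)
    # use Apache::PerlSections ();
    # print STDERR Apache::PerlSections->dump();
</Perl>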

-- 
Eric Cholet



Re: Perl modules in apache configuration

2000-01-09 Thread tarkhil

 "Eric" == Eric  writes:

Eric On Sun, Jan 09, 2000 at 08:47:04PM +0300, [EMAIL PROTECTED] wrote:
 I'm trying to configure httpd.conf using Perl sections (mod_macro is
 not enough for me), but the result is weird. 

Eric Do you have a specific example of your config, and what doesn't work,
Eric that you could post maybe? It's hard to help without specifics.
Okay, here it is. Note that the fragment marked #!!! is critical for some
of the bugs: when those lines are commented out, the first Perl block
executes with an error; if they are uncommented, it does NOT execute. The
second Perl block never executes at all.

=== cut ===
##
## httpd.conf -- Apache HTTP server configuration file
##
# $Id: httpd.conf,v 1.2 2000/01/09 10:49:48 root Exp root $
#
# $Log: httpd.conf,v $
# Revision 1.2  2000/01/09 10:49:48  root
# It is working now; I'll start reconfiguring it with Perl,
# including Perl configuration for virtual hosts
#
#
ServerType standalone

ServerRoot "/usr/local"

#LockFile /var/run/httpd.lock

PidFile /var/run/httpd.pid

ScoreBoardFile /var/run/httpd.scoreboard

#ResourceConfig conf/srm.conf
#AccessConfig conf/access.conf

Timeout 300

KeepAlive On
MaxKeepAliveRequests 100
KeepAliveTimeout 15

MinSpareServers 5
MaxSpareServers 10
StartServers 5
MaxClients 40
MaxRequestsPerChild 0

#
# Listen: Allows you to bind Apache to specific IP addresses and/or
# ports, in addition to the default. See also the VirtualHost
# directive.
#
#Listen 3000
#Listen 12.34.56.78:80

#
# BindAddress: You can support virtual hosts with this option. This directive
# is used to tell the server which IP address to listen to. It can either
# contain "*", an IP address, or a fully qualified Internet domain name.
# See also the VirtualHost and Listen directives.
#
#BindAddress *

#
# Dynamic Shared Object (DSO) Support
#
# To be able to use the functionality of a module which was built as a DSO you
# have to place corresponding `LoadModule' lines at this location so the
# directives contained in it are actually available _before_ they are used.
# Please read the file README.DSO in the Apache 1.3 distribution for more
# details about the DSO mechanism and run `httpd -l' for the list of already
# built-in (statically linked and thus always available) modules in your httpd
# binary.
#
# Note: The order is which modules are loaded is important.  Don't change
# the order below without expert advice.
#
# Example:
# LoadModule foo_module libexec/mod_foo.so
LoadModule vhost_alias_module libexec/apache/mod_vhost_alias.so
LoadModule env_module libexec/apache/mod_env.so
LoadModule config_log_module  libexec/apache/mod_log_config.so
LoadModule mime_magic_module  libexec/apache/mod_mime_magic.so
LoadModule mime_modulelibexec/apache/mod_mime.so
LoadModule negotiation_module libexec/apache/mod_negotiation.so
LoadModule status_module  libexec/apache/mod_status.so
LoadModule info_modulelibexec/apache/mod_info.so
LoadModule includes_modulelibexec/apache/mod_include.so
LoadModule autoindex_module   libexec/apache/mod_autoindex.so
LoadModule dir_module libexec/apache/mod_dir.so
LoadModule cgi_module libexec/apache/mod_cgi.so
LoadModule asis_modulelibexec/apache/mod_asis.so
LoadModule imap_modulelibexec/apache/mod_imap.so
LoadModule action_module  libexec/apache/mod_actions.so
LoadModule speling_module libexec/apache/mod_speling.so
LoadModule userdir_module libexec/apache/mod_userdir.so
LoadModule alias_module   libexec/apache/mod_alias.so
LoadModule rewrite_module libexec/apache/mod_rewrite.so
LoadModule access_module  libexec/apache/mod_access.so
LoadModule auth_modulelibexec/apache/mod_auth.so
LoadModule anon_auth_module   libexec/apache/mod_auth_anon.so
LoadModule db_auth_module libexec/apache/mod_auth_db.so
LoadModule digest_module  libexec/apache/mod_digest.so
#LoadModule proxy_module   libexec/apache/libproxy.so
LoadModule cern_meta_module   libexec/apache/mod_cern_meta.so
LoadModule expires_module libexec/apache/mod_expires.so
LoadModule headers_module libexec/apache/mod_headers.so
LoadModule usertrack_module   libexec/apache/mod_usertrack.so
LoadModule unique_id_module   libexec/apache/mod_unique_id.so
LoadModule setenvif_modulelibexec/apache/mod_setenvif.so
LoadModule perl_modulelibexec/apache/libperl.so
LoadModule php3_modulelibexec/apache/libphp3.so
LoadModule dav_module libexec/apache/libdav.so

#  Reconstruction of the complete module list from all available modules
#  (static and shared ones) to achieve correct module execution order.
#  [WHENEVER YOU CHANGE THE LOADMODULE SECTION ABOVE UPDATE THIS, TOO]
ClearModuleList
AddModule mod_charset.c
AddModule mod_vhost_alias.c
AddModule mod_env.c
AddModule mod_log_config.c
AddModule mod_mime_magic.c
AddModule mod_mime.c
AddModule mod_negotiation.c
AddModule mod_status.c
AddModule mod_info.c
AddModule mod_include.c
AddModule mod_autoindex.c
AddModule mod_dir.c
AddModule 

Trouble installing mod_perl

2000-01-09 Thread gnielson

I am encountering some errors when trying to get an existing Apache
server to support mod_perl. 

I had no problem running perl Makefile.PL. I then ran
./config.status --activate-module=src/modules/perl/libperl.a without a
problem. But then, when I ran make, I got a slew of errors, listed below.

I am trying to preserve previous additions to my apache setup, which
includes php. What am I doing wrong? I noted in the mailing list archive
that there are other ways to upgrade an existing installation of apache,
but I followed the basic INSTALL file instructions.

Should I -- can I -- re-run the entire installation, including running the
Makefile for perl again?

I am running Server version Apache/1.2.4 and perl, version 5.004_01. Is
the fact that I have not yet upgraded to 5.004_04 giving me these
problems?

Any help appreciated. I have tried several times without success.

Here is an excerpt of some of the errors:

  -L/usr/lib/mysql -lmysqlclient
modules/perl/libperl.a(mod_perl.o): In function `perl_shutdown':
mod_perl.o(.text+0xf8): undefined reference to `perl_destruct_level'
mod_perl.o(.text+0x102): undefined reference to `perl_destruct_level'
mod_perl.o(.text+0x10c): undefined reference to `perl_destruct_level'
mod_perl.o(.text+0x13b): undefined reference to `Perl_av_undef'
mod_perl.o(.text+0x149): undefined reference to `Perl_sv_free'
mod_perl.o(.text+0x161): undefined reference to `Perl_av_undef'
mod_perl.o(.text+0x16f): undefined reference to `Perl_sv_free'
mod_perl.o(.text+0x187): undefined reference to `perl_destruct'
mod_perl.o(.text+0x195): undefined reference to `perl_free'
modules/perl/libperl.a(mod_perl.o): In function `perl_restart':
mod_perl.o(.text+0x206): undefined reference to `perl_get_sv'
mod_perl.o(.text+0x21a): undefined reference to `Perl_gv_stashpv'
mod_perl.o(.text+0x227): undefined reference to `Perl_push_scope'
mod_perl.o(.text+0x22c): undefined reference to `warnhook'
mod_perl.o(.text+0x231): undefined reference to `Perl_save_sptr'
mod_perl.o(.text+0x240): undefined reference to `perl_eval_pv'
mod_perl.o(.text+0x24a): undefined reference to `warnhook'
mod_perl.o(.text+0x255): undefined reference to `Perl_sv_undef'
mod_perl.o(.text+0x25e): undefined reference to `Perl_sv_setsv'
. etc etc etc 
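
One thing I noticed, though I'm only guessing that it's related: the missing
symbols are all Perl internals, which makes me wonder whether httpd is being
linked with the flags an embedded perl needs. Those flags can be printed with:

  perl -MExtUtils::Embed -e ldopts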



Re: URL Redirection

2000-01-09 Thread Vijay



Hello,

I got a workaround for the problem I am having:

- Set a cookie header using a meta command, as Set-Cookie is not working
  properly on my box thru mod_perl (somewhere it is creating a problem and
  I am not sure where). Using the meta command I set up a cookie for the
  session id.
- Create a file on disk with all parameters selected by the user, with the
  session-id as the file id.
- Get the session-id using the environment variable $ENV{'HTTP_COOKIE'}.
- Load the file with the session id and populate the user-selected
  parameters, as well as add new selections.
- Pass these parameters on to the web pages (Embperl encoded) using the
  HTML::Embperl::Execute command.

At present, all of this is done thru three scripts.
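
Roughly, the glue between the steps looks like this (a trimmed-down sketch;
the paths, the flat-file format and the page name are just illustrative, not
my real code):

#!/usr/bin/perl -w
use strict;
use HTML::Embperl ();

# pull the session id back out of the cookie that the meta tag set earlier
my ($sid) = ($ENV{HTTP_COOKIE} || '') =~ /session=(\w+)/;

# load whatever the user has selected so far (one flat file per session id)
my %fdat;
if ($sid && open(SESS, "</tmp/sessions/$sid")) {
    while (<SESS>) {
        chomp;
        my ($k, $v) = split /=/, $_, 2;
        $fdat{$k} = $v;
    }
    close SESS;
}

# hand the merged parameters to the Embperl-encoded page
HTML::Embperl::Execute({
    inputfile => '/www/pages/report.epl',
    fdat      => \%fdat,
});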

Although the browser address is not showing the full URL, it shows correct
results. I am trying to refine this further and create one generic Perl
script to handle all pages on my site.

Thanks for all your help.

Vijay 

  - Original Message -
  From: Gerald Richter
  To: Vijay; mod_perl Maillinglist
  Sent: Sunday, January 09, 2000 2:01 PM
  Subject: RE: URL Redirection
  
  

  As I was having problem to set up Embperl as CGI, I found out a way to
  workaround that: to call HTML::Embperl::Execute from my perl script and
  pass the html document as parameter. This works OK.

 That is, like Tom and Vivek pointed out, cgiwrapper eats the PATH_INFO
 and PATH_TRANSLATED.

  What I want to achieve is to have some session info (basically I want to
  create my own session persistent data for various statistical purposes)
  be part of the URL. (I mean the browser address should show something like
  http://my.domain.com/myplscript.pl?session=1&id=21). I am using a generic
  perl script to manage all pages and using embperl to link each together.
  My problem is that the "Location: myscript.pl?session=1&id=21" is not
  showing fully in the address line, but is executing properly. This will
  give me a problem if the user reloads or moves backwards from another
  page. I want the address line to show the full url including the
  parameters when I redirect, so that reloads and backwards will work
  properly.

 Sorry, but I don't understand what you mean.

 Gerald

 -------------------------------------------------------------
 Gerald Richter    ecos electronic communication services gmbh
 Internetconnect * Webserver/-design/-datenbanken * Consulting
 Post:       Tulpenstrasse 5         D-55276 Dienheim b. Mainz
 E-Mail:     [EMAIL PROTECTED]        Voice:    +49 6133 925151
 WWW:        http://www.ecos.de      Fax:      +49 6133 925152
 -------------------------------------------------------------
  


Cryptic errors -simple Apache::Registry script ??? (newbie)

2000-01-09 Thread John Walker

I've got a script (hello.pl from the Eagle book).
It runs successfully once and then generates 500 errors.

So I dig around and find this in the errors:

[Sun Jan  9 15:26:38 2000] [error] Can't upgrade that kind of scalar at
/usr/lib/perl5/site_perl/5.005/i386-linux/Apache/Registry.pm line 32.

HMM. Well, I am not skilled enough to want to hack Registry.pm nor am I
familiar with the concept of upgrading a scalar. I have read Stas' guide
and he talks about "use diagnostics;" Now when things go bad, I get a
longer explanation:

[Sun Jan  9 15:40:07 2000] [error] Uncaught exception from user code:
Can't upgrade that kind of scalar at
/usr/lib/perl5/site_perl/5.005/i386-linux/Apache/Registry.pm line 32.
Apache::Registry::handler('Apache=SCALAR(0x81a782c)') called at
/dev/null line 0
eval {...} called at /dev/null line 0

HMM. dev null? line zero? Am I losing my mind?

I would appreciate it if someone could point me in the right direction.
I'm running RedHat/Apache HTTPSD with mod_perl; does anyone know if
that's a problem?

[root@melanie /root]# telnet melanie.jsw4.net 80
Trying 216.207.143.5...
Connected to melanie.jsw4.net.
Escape character is '^]'.
GET /index.html HTTP/1.1
Host: roc.jsw4.net

HTTP/1.1 200 OK
Date: Sun, 09 Jan 2000 20:57:18 GMT
Server: Red Hat Secure/3.0 (Unix) mod_perl/1.19
Last-Modified: Sun, 05 Dec 1999 16:28:37 GMT
ETag: "1803-797-384a92b5"
Accept-Ranges: bytes
Content-Length: 1943
Content-Type: text/html

HTML ...

The snips from httpsd.conf (This is in a virtual host section, could
that be a problem?)

<IfModule mod_perl.c>
  Alias /perl/ /home/roc/perl/
  PerlTaintCheck On
  <Location /perl>
    SetHandler perl-script
    PerlHandler Apache::Registry
    PerlSendHeader On
    Options +ExecCGI
  </Location>
</IfModule>

And finally the script: hello.pl (As I mentioned, this is copied fairly
faithfully from the eagle book.)

#!/usr/bin/perl -w
use CGI qw(:standard);
use diagnostics;
use strict;
# use vars qw($name);
my $realname = param('realname') || 'Anonymous';
print   header(),
        start_html(-title => 'Hello', -bgcolor => 'blue'),
        h1("Hello $realname"),
        p(
          "To change your name, enter it into the text field below and press",
          em("change name.")
        ),
        start_form(),
        "Name: ", textfield(-name => 'realname', -value => 'Anonymous'),
        submit(-value => 'Change name'),
        end_form(),
        hr(),
        end_html();

Any help would be appreciated. Thanks,
John



Memory leak/server crashes

2000-01-09 Thread James Furness

I'm looking for some help getting apache to run reliably. Apache 1.3.9 with
mod_perl 1.21 and Perl 5.005_03 is running on a dual P500 with 1 Gb of RAM
running Redhat 6.1. We run about 5 sites off the box, most of which are
fairly high traffic and use a lot of CGI; MySQL 3.22.25 is used with
Apache::DBI.

The major problem seems to be a memory leak of some sort, identical to that
described in the "memory leak in mod_perl" thread on this list from October
1997 and the "httpd, mod_perl and memory consumption (long)" thread from
July 1997.

The server runs normally for several hours, then suddenly a httpd process
starts growing exponentially, the swapfile usage grows massively and the
server starts to become sluggish (I assume due to disk thrashing caused by
the heavy swap usage). Usually when this started to happen I would log in
and use apachectl stop to shutdown the server, then type 'killall httpd'
several times till the processes finally died off, and then use apachectl
start to restart apache. If I was not around or did not catch this, the
server would eventually become unresponsive and lock up, requiring a manual
reboot by the datacentre staff. Messages such as "Out of memory" and
"Callback called exit" would appear in the error log as the server spiralled
down and MySQL would start to have trouble running.

To combat this, I created a script to monitor load and swapfile usage, and
restart apache as described above if load was above 7 and swapfile usage
above 150Mb. This script has kept the server online and we now have an
uptime of something like 22 days (previously no more than 1 day), but the
script is getting triggered several times a day and no more "Out of memory"
messages are appearing, but the situation is not ideal.

I have tried adding:

sub UNIVERSAL::AUTOLOAD {
my $class = shift;
Carp::cluck "$class can't \$UNIVERSAL::AUTOLOAD!\n";
}


As recommended by the developers guide, which flooded the error log with the
text below being printed roughly once a second in the error log:

-
Apache=SCALAR(0x830937c) can't $UNIVERSAL::AUTOLOAD!
Apache=SCALAR(0x8309364) can't $UNIVERSAL::AUTOLOAD!
DBI::DBI_tie=HASH(0x82dd16c) can't $UNIVERSAL::AUTOLOAD!
IO::Handle=IO(0x820aabc) can't $UNIVERSAL::AUTOLOAD!
DBI::DBI_tie=HASH(0x82dd16c) can't $UNIVERSAL::AUTOLOAD!
IO::Handle=IO(0x820aabc) can't $UNIVERSAL::AUTOLOAD!
DBI::DBI_tie=HASH(0x82dd16c) can't $UNIVERSAL::AUTOLOAD!
IO::Handle=IO(0x820aabc) can't $UNIVERSAL::AUTOLOAD!
DBI::DBI_tie=HASH(0x82dd16c) can't $UNIVERSAL::AUTOLOAD!
IO::Handle=IO(0x820aabc) can't $UNIVERSAL::AUTOLOAD!
--

I've pretty much exhausted the ways I can think of to trace this problem:
I've tried to eliminate memory leaks in code by removing some scripts from
mod_perl and running them under mod_cgi, and I've tried tweaking
MaxRequestsPerChild, both without any success.

One thing that was mentioned in a previous thread was that using 'exit'
could confuse perl, and exit() is used fairly heavily in the scripts since
most are converted to mod_perl from standard CGIs, but I'd prefer not to
have to remove these since the structure of the scripts is reliant on some
form of exit statement. Is there some alternative to exit()?
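
One idea I've been toying with, purely a guess from reading the guide, is to
route the scripts' exits through a wrapper, on the assumption that
Apache::exit() stops the script without confusing perl the way a real exit
might:

use Apache ();

# bail_out() is just a name I made up for illustration; the idea is to call
# Apache::exit() -- which ends the script but leaves the httpd child alive --
# instead of CORE::exit(), which really would terminate the process.
sub bail_out {
    my $msg = shift;
    print "Content-type: text/plain\n\n$msg\n";
    Apache::exit();
}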

I've also had a look at some of the patches to Apache.pm and Apache.xs
suggested in the previous threads, and these seem to have been incorporated
into mod_perl 1.21.

Are there any other solutions I could try to this problem? Does anyone know
what might be causing this?

The second problem I have is when loading pages, usually CGI, but I think
this has happened on some static pages, what IE5 describes as "Server not
found or DNS error" is experienced. Originally I thought this was the server
hitting MaxClients (150) since it usually occurs at the same time as massive
surges of hits, and /server-status usually shows 150 httpd processes have
been spawned, however I increased MaxClients to 200 recently and the error
has continued to happen, even though /server-status doesn't show any more
than about 170 processes spawned. I have not ruled out DNS server troubles
or backbone problems (We've had a few routing troubles recently that slowed
things down, but not actually cut off traffic or anything like that), but I
am at a loss as to what else could be causing this, so I thought I'd ask
whilst I'm on the subject of server problems :)

Thanks in advance,
--
James Furness [EMAIL PROTECTED]
ICQ #:  4663650



setting query in PerlTransHandler

2000-01-09 Thread Ajay Shah

This may be repeated because I sent the first message via Geocrawler and I
don't know how long they are going to take to review the message. Sorry if
it comes in twice.

I am writing a simple PerlTransHandler that is going to change
the request into another one with a query string.
The following is what I am looking for

/articles/10/index.html  =>  /articles/index.html?id=10

This is what I tried.

sub handler {
    my $r = shift;
    my $uri = $r->uri;

    my ($id) = ($uri =~ m|^/articles/(.*?)/|);
    my $newuri = $r->document_root . "/articles/index.html";
    my $uriobj = $r->parsed_uri;
    $uriobj->query("id=$id");
    $r->uri($newuri);

    return OK;
}

1;

All .html documents are being parsed by Embperl and that
works fine.

In my document $fdat{id} doesn't return anything, while I think it should
be returning 10.

Printing out Apache->request->parsed_uri->query in the Embperl document
doesn't print anything either.

In the book there is only one sentence about using the query() method, and
even that doesn't explain how to set a key=value pair so it can be
retrieved later.

Any help is appreciated.

Ajay
__
Get Your Private, Free Email at http://www.hotmail.com



Re: Memory leak/server crashes

2000-01-09 Thread Sean Chittenden

Try using Apache::SizeLimit as a way of controlling your
processes.  Sounds like a recursive page that performs infinite internal
requests.

-- 
Sean Chittenden  [EMAIL PROTECTED]
fingerprint = 6988 8952 0030 D640 3138  C82F 0E9A DEF1 8F45 0466

My mother once said to me, "Elwood," (she always called me Elwood)
"Elwood, in this world you must be oh so smart or oh so pleasant."
For years I tried smart.  I recommend pleasant.
-- Elwood P. Dowde, "Harvey"

On Sun, 9 Jan 2000, James Furness wrote:

 Date: Sun, 9 Jan 2000 19:58:00 -
 From: James Furness [EMAIL PROTECTED]
 Reply-To: James Furness [EMAIL PROTECTED]
 To: [EMAIL PROTECTED]
 Subject: Memory leak/server crashes
 
 I'm looking for some help getting apache to run reliably. Apache 1.3.9 with
 mod_perl 1.21 and Perl 5.005_03 is running on a dual P500 with 1 Gb of RAM
 running Redhat 6.1. We run about 5 sites off the box, most of which are
 fairly high traffic, and use a lot of CGI and
 MySQL 3.22.25 is used with Apache::DBI.
 
 The major problem seems to be a memory leak of some sort, identical to that
 described in the "memory leak in mod_perl" thread on this list from October
 1997 and the "httpd, mod_perl and memory consumption (long)" thread from
 July 1997.
 
 The server runs normally for several hours, then suddenly a httpd process
 starts growing exponentially, the swapfile usage grows massively and the
 server starts to become sluggish (I assume due to disk thrashing caused by
 the heavy swap usage). Usually when this started to happen I would log in
 and use apachectl stop to shutdown the server, then type 'killall httpd'
 several times till the processes finally died off, and then use apachectl
 start to restart apache. If I was not around or did not catch this, the
 server would eventually become unresponsive and lock up, requiring a manual
 reboot by the datacentre staff. Messages such as "Out of memory" and
 "Callback called exit" would appear in the error log as the server spiralled
 down and MySQL would start to have trouble running.
 
 To combat this, I created a script to monitor load and swapfile usage, and
 restart apache as described above if load was above 7 and swapfile usage
 above 150Mb. This script has kept the server online and we now have an
 uptime of something like 22 days (previously no more than 1 day), but the
 script is getting triggered several times a day and no more "Out of memory"
 messages are appearing, but the situation is not ideal.
 
 I have tried adding:
 
 sub UNIVERSAL::AUTOLOAD {
 my $class = shift;
 Carp::cluck "$class can't \$UNIVERSAL::AUTOLOAD!\n";
 }
 
 
 As recommended by the developers guide, which flooded the error log with the
 text below being printed roughly once a second in the error log:
 
 -
 Apache=SCALAR(0x830937c) can't $UNIVERSAL::AUTOLOAD!
 Apache=SCALAR(0x8309364) can't $UNIVERSAL::AUTOLOAD!
 DBI::DBI_tie=HASH(0x82dd16c) can't $UNIVERSAL::AUTOLOAD!
 IO::Handle=IO(0x820aabc) can't $UNIVERSAL::AUTOLOAD!
 DBI::DBI_tie=HASH(0x82dd16c) can't $UNIVERSAL::AUTOLOAD!
 IO::Handle=IO(0x820aabc) can't $UNIVERSAL::AUTOLOAD!
 DBI::DBI_tie=HASH(0x82dd16c) can't $UNIVERSAL::AUTOLOAD!
 IO::Handle=IO(0x820aabc) can't $UNIVERSAL::AUTOLOAD!
 DBI::DBI_tie=HASH(0x82dd16c) can't $UNIVERSAL::AUTOLOAD!
 IO::Handle=IO(0x820aabc) can't $UNIVERSAL::AUTOLOAD!
 --
 
 I've pretty much exhausted any ways I can think of to trace this problem,
 such as i've tried to eliminate memory leaks in code by removing some
 scripts from mod_perl and running them under mod_cgi and i've tried tweaking
 MaxRequestsPerChild both without any success.
 
 One thing that was mentioned in a previous thread was that using 'exit'
 could confuse perl, and exit() is used fairly heavily in the scripts since
 most are converted to mod_perl from standard CGIs, but i'd prefer not to
 have to remove these since the structure of the scripts is reliant on some
 form of exit statement. Is there some alternative to exit()?
 
 I've also had a look at some of the patches to Apache.pm and Apache.xs
 suggested in the previous threads, and these seem to have been incorporated
 into mod_perl 1.21.
 
 Are there any other solutions I could try to this problem? Does anyone know
 what might be causing this?
 
 The second problem I have is when loading pages, usually CGI, but I think
 this has happened on some static pages, what IE5 describes as "Server not
 found or DNS error" is experienced. Originally I thought this was the server
 hitting MaxClients (150) since it usually occurs at the same time as massive
 surges of hits, and /server-status usually shows 150 httpd processes have
 been spawned, however I increased MaxClients to 200 recently and the error
 has continued to happen, even though /server-status doesn't show any more
 than about 170 processes spawned. I have not ruled out DNS server troubles
 or backbone problems (We've had a few routing troubles 

setting query in TransHandler

2000-01-09 Thread Ajay Shah

This message was sent from Geocrawler.com by "Ajay Shah" [EMAIL PROTECTED]
Be sure to reply to that address.

Hello list,

This is my first post so bear with me.

I am trying to write a simple PerlTransHandler that is going to modify the
URI. I want to map the following:

/articles/10/index.html   =>  /articles/index.html?id=10

So this is my attempt:

...

my ($id) = ($uri =~ m|^/articles/(.*?)/|);
my $newuri = $r->document_root . "/articles/index.html";
my $uriobj = $r->parsed_uri;
$uriobj->query("id=$id");
$r->uri($newuri);

return OK;

...


All .html files are parsed by Embperl and that works fine.
What doesn't work is that $fdat{id} doesn't give me the id
that was set.

Printing out Apache->request->parsed_uri->query
inside the index.html file doesn't give back anything.

I also tried to set the query string directly,
 $r->uri("$newuri?id=$id");
but that didn't work either.

Any help or suggestions are welcomed. The book only
has one sentence on using the query() method. Thanks.

Ajay

Geocrawler.com - The Knowledge Archive



Re: Memory leak/server crashes

2000-01-09 Thread James Furness

 Try using Apache::SizeLimit as a way of controlling your
 processes.  Sounds like a recursive page that performs infinite internal
 requests.

Ok, sounds like a good solution, but it still seems to me I should be
eliminating the problem at the source. Any ideas as to how I could narrow
down the location of whatever's causing the recursion?
--
James Furness [EMAIL PROTECTED]
ICQ #:  4663650



Re: Memory leak/server crashes

2000-01-09 Thread Chip Turner

"James Furness" [EMAIL PROTECTED] writes:

 I'm looking for some help getting apache to run reliably. Apache 1.3.9 with
 mod_perl 1.21 and Perl 5.005_03 is running on a dual P500 with 1 Gb of RAM
 running Redhat 6.1. We run about 5 sites off the box, most of which are
 fairly high traffic, and use a lot of CGI and
 MySQL 3.22.25 is used with Apache::DBI.
 
 The major problem seems to be a memory leak of some sort, identical to that
 described in the "memory leak in mod_perl" thread on this list from October
 1997 and the "httpd, mod_perl and memory consumption (long)" thread from
 July 1997.

[snip]

I too have had this problem and haven't found a suitable solution.  In
my case, though, I think the leaks are primarily due to old perl
scripts being run under Registry and not bugs in mod_perl or perl.

The first thing to do is to try to discover if the problem is a
mod_perl problem or a bad script problem.  If your server can handle
it, you could try a binary search to find which (if any) scripts make
the problem worse.  Basically pick half your registry scripts and use
mod_cgi.  If leaks persist, you know that you have some problem
scripts in the ones you didn't make mod_cgi.  If leaks stop, then you
know the problem scripts are in the ones you made mod_cgi.  Repeat as
necessary until you have narrowed it down to a single script.  This is
tedious though and may not be practical.

Depending on how old the scripts are, I would check for non-closed
filehandles, excessive global variables, not using strict, etc.
perl-status is your friend (hopefully you have it enabled!) so you can
see the namespaces of each httpd and see if you have any candidate
variables, file handles, functions, etc that could be clogging memory.

As a last resort, you could try Apache::SizeLimit to cap the size of
each httpd daemon.  This works reasonably well for us.  Something to
the effect:

use Apache::SizeLimit;

$Apache::SizeLimit::MAX_PROCESS_SIZE = 16384; 
$Apache::SizeLimit::CHECK_EVERY_N_REQUESTS = 3;

should help cap your processes at 16meg each.  Tweak as necessary.
Read the perldoc for Apache::SizeLimit for all the info you need.

Now, let's assume the problem is in fact in mod_perl or apache or perl
itself.  In this case I'm not sure what the best way to proceed is.  I
think mod_perl and perl have shown themselves to be pretty good about
not leaking memory, as has apache.  IMO it's much, much more likely a
problem concerning Registry and impolite scripts that are misbehaving
and leaving parts of themselves around.

Have you tried correlating the memory surges with any page accesses?
That may help narrow down the culprit.

Good luck!

Chip

-- 
Chip Turner   [EMAIL PROTECTED]
  Programmer, ZFx, Inc.  www.zfx.com
  PGP key available at wwwkeys.us.pgp.net



Re: Memory leak/server crashes

2000-01-09 Thread Sean Chittenden

Yeah...  two things I'd do:

1)  Open two telnet sessions to the box.  One for top that is
monitoring processes for your web user (www typically) and is sorting by
memory usage w/ a 1 second refresh.  I'd change the size of the window and
make it pretty short so that the refreshes happen quicker, but that
depends on your connection speed.  The second telnet window is a window
that tails your access log (tail -f).  It sounds boring, but by watching
the two, you should have an idea as to when the problem happens.
2)  Open up and hack Apache::SizeLimit and have it do a stack dump
(Carp::confess) of what's going on... there may be some clue there.

Solution #1 will probably be your best bet...  Good luck (cool
site too!).  --SC
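
Something like the following; the log path is from my own box, and the
memory sort is the interactive 'M' key in procps top, so adjust to taste:

# window 1: one-second refresh, then press M to sort by memory
top -d 1

# window 2: follow requests as they arrive
tail -f /usr/local/apache/logs/access_log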

-- 
Sean Chittenden  [EMAIL PROTECTED]
fingerprint = 6988 8952 0030 D640 3138  C82F 0E9A DEF1 8F45 0466

The faster I go, the behinder I get.
-- Lewis Carroll

On Sun, 9 Jan 2000, James Furness wrote:

 Date: Sun, 9 Jan 2000 21:47:03 -
 From: James Furness [EMAIL PROTECTED]
 Reply-To: James Furness [EMAIL PROTECTED]
 To: Sean Chittenden [EMAIL PROTECTED]
 Cc: [EMAIL PROTECTED]
 Subject: Re: Memory leak/server crashes
 
  Try using Apache::SizeLimit as a way of controlling your
  processes.  Sounds like a recursive page that performs infinite internal
  requests.
 
 Ok, sounds like a good solution, but it still seems to me I should be
 eliminating the problem at the source. Any ideas as to how I could narrow
 down the location of whatever's causing the recursion?
 --
 James Furness [EMAIL PROTECTED]
 ICQ #:  4663650



Dedicated dynamic content servers...

2000-01-09 Thread Sean Chittenden

Is it possible to 'short circuit' some of the request handlers in
apache?
I'm building a dedicated dynamic content server that gets
some really bizarre input through the URI, and it doesn't map to a file;
instead a Translation Handler deals with the request and sets some
variables for an authorization handler that I have running later on.
Because one of the core apache handlers that runs afterwards checks
$r->filename before my Auth handler runs... I get 404ed.  What I could do
is roll all of my functionality into a single Trans handler, but that's no
good and not something I'm interested in doing (bad programming, esp. when
you have LOTS of developers working on the same project).  In a nutshell,
how do I tell Apache that I don't want it to touch down on disk?  Undef
the filename, or set the filename to /dev/zero?

Thoughts?  Bizarre, tweaky, strange, or hacked ideas that have been
floating in the back of peoples' heads are okay and valid suggestions
(either to me personally or to the list).  I can make the ugliest of perl
look legit, so I'm game for just about anything.  --SC

-- 
Sean Chittenden  [EMAIL PROTECTED]




Re: Memory leak/server crashes

2000-01-09 Thread Stas Bekman

On Sun, 9 Jan 2000, Sean Chittenden wrote:

   Yeah...  two things I'd do:
 
   1)  Open two telnet sessions to the box.  One for top that is
 monitoring processes for your web user (www typically) and is sorting by
 memory usage w/ a 1 second refresh.  I'd change the size of the window and
 make it pretty short so that the refreshes happen quicker, but that
 depends on your connection speed.  The second telnet window is a window
 that tails your access log (tail -f).  It sounds boring, but by watching
 the two, you should have an idea as to when the problem happens.

Why reinvent the wheel? I wrote Apache::VMonitor (grab it from CPAN), which
does all this and more (all but tail -f). I use it all the time; it saves me
a lot of time, since I don't have to telnet!
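
Mounting it is the usual pair of lines for a Perl content handler (pick
whatever Location you like):

<Location /vmonitor>
    SetHandler perl-script
    PerlHandler Apache::VMonitor
</Location>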

   2)  Open up and hack Apache::SizeLimit and have it do a stack dump
 (Carp::croak) of what's going on... there may be some clue there.

Apache::GTopLimit is a more advanced one :) (you are on Linux, right?) but
Apache::SizeLimit is just fine.

3) try running in single-process mode under 'strace' (probably not a good
idea for a production server), but you can still strace all the processes
into a log file

4) Apache::Leak ?

___
Stas Bekman     mailto:[EMAIL PROTECTED]     http://www.stason.org/stas
Perl,CGI,Apache,Linux,Web,Java,PC    http://www.stason.org/stas/TULARC
perl.apache.org   modperl.sourcegarden.org   perlmonth.com   perl.org
single o-> + single o->+ = singles heaven  http://www.singlesheaven.com



Re: setting query in PerlTransHandler

2000-01-09 Thread Randal L. Schwartz

 "Ajay" == Ajay Shah [EMAIL PROTECTED] writes:

Ajay /articles/10/index.html  =>  /articles/index.html?id=10

Ajay This is what I tried.

Ajay sub handler {
Ajay my $r = shift;
Ajay my $uri = $r->uri;

Ajay my ($id) = ($uri =~ m|^/articles/(.*?)/|);
Ajay my $newuri = $r->document_root . "/articles/index.html";
Ajay my $uriobj = $r->parsed_uri;
Ajay $uriobj->query("id=$id");
Ajay $r->uri($newuri);

Ajay return OK;
Ajay }

I may be wrong, but I bet you have to do this instead:


  $r-uri("/articles/index.html");
  $r-args("id=$id");

By the time the apache-request object has been created, args are
handled in a separate slot.
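
Untested, but folding that into the whole translation handler I'd expect
something roughly like this (the package name is made up, and returning
DECLINED at the end is my choice so that Apache's normal translation still
maps the rewritten URI to a file):

package My::ArticleTrans;
use strict;
use Apache::Constants qw(DECLINED);

sub handler {
    my $r = shift;
    my ($id) = $r->uri =~ m|^/articles/(\d+)/|;
    return DECLINED unless defined $id;    # leave other URIs alone

    $r->uri('/articles/index.html');       # rewrite the URI, not a filename
    $r->args("id=$id");                    # the query string has its own slot

    # DECLINED lets the default translation map the new URI to a file
    return DECLINED;
}
1;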

-- 
Randal L. Schwartz - Stonehenge Consulting Services, Inc. - +1 503 777 0095
[EMAIL PROTECTED] URL:http://www.stonehenge.com/merlyn/
Perl/Unix/security consulting, Technical writing, Comedy, etc. etc.
See PerlTraining.Stonehenge.com for onsite and open-enrollment Perl training!



Re: Dedicated dynamic content servers...

2000-01-09 Thread Randal L. Schwartz

 "Sean" == Sean Chittenden [EMAIL PROTECTED] writes:

Sean Because one of the core apache handlers that follows checks $r->filename
Sean before my Auth handler runs... I get 404ed.  What I can do is roll all of
Sean my functionality into a single Trans handler, but that's no good and not
Sean something I'm interested in doing (bad programming, esp when you have LOTS
Sean of developers working on the same project).  In a nut shell, how do I tell
Sean Apache that I don't want it to touch down on disk?  Undef the filename or
Sean set the file name to /dev/zero?

You should be able to return OK from your own pushed AccessHandler,
which should prevent the core modules from doing an .htaccess lookup
based on $r->filename.  Something like:

$r->push_handlers( PerlAccessHandler => sub { return OK } );
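
Pushed from your existing Trans handler it might look roughly like this (the
package and handler names are invented, and the notes() key is just an
example of stashing data for the later phases):

package My::Trans;
use strict;
use Apache::Constants qw(OK);

sub handler {
    my $r = shift;

    # stash whatever the later phases need, e.g. in the notes table
    $r->notes(session => 'whatever-you-parsed-from-the-uri');

    # keep the default modules from 404ing on a filename that isn't there
    $r->push_handlers(PerlAccessHandler => sub { return OK });

    # hand the request to your own content handler rather than a file
    $r->handler('perl-script');
    $r->push_handlers(PerlHandler => \&My::Content::handler);

    return OK;
}
1;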

-- 
Randal L. Schwartz - Stonehenge Consulting Services, Inc. - +1 503 777 0095
[EMAIL PROTECTED] URL:http://www.stonehenge.com/merlyn/
Perl/Unix/security consulting, Technical writing, Comedy, etc. etc.
See PerlTraining.Stonehenge.com for onsite and open-enrollment Perl training!



Re: Weird message from make test

2000-01-09 Thread G.W. Haywood

Hi there,

On Sat, 8 Jan 2000, Nancy Lin wrote:

 If it's not the test script that's bad, then it would have to be
 CGI.pm, no?

No.

73
Ged.



Re: [JOB] ground-floor opportunity with ownership stake

2000-01-09 Thread Fabrice Scemama

Methinks certain `job' offers should be filtered...
Having several mod_perl engineers work for free for five months,
most businesses should be quite prosperous, indeed ;-)
Not my business anyway.

David Harris wrote:
 
 We have an outstanding ground-floor opportunity for the right person. Must have
 strong technical skills in mod_perl application development and some or all of
 the following: database development, c language, linux/unix system
 administration. Must also have the desire and ability to invest substantial
 "sweat equity" in exchange for a good ownership stake in a fast-growing
 web-hosting business.
 
 If you are the right person, you will become a partner in our business, and
 will end up owning a share which, according to our projections, will be worth
 $1.5 million within three years. The price of this ownership stake is working
 full time without pay for the first five months. You need to be prepared to
 make that investment. After the first five months a competitive compensation
 package will kick in.
 
 Technical skills we are seeking are:
   - mod_perl development
   - database development
   - c language patching of open source programs
   - in depth troubleshooting of complicated systems
   - linux/unix system administration
   - good grasp on unix fundamentals
   - good grasp on unix security precautions
 
 Responsibilities would include:
   - continuing mod_perl development on web control panel
   - helping create accounting database and integrating it
 into existing instant account setup system
   - administration of existing servers
   - limited, simple end user customer support
 
 If you wish to explore this opportunity in confidence, contact David Harris at
 [EMAIL PROTECTED]
 
  - David Harris
Principal Engineer, DRH Internet Services

-- 
"Quand on n'a besoin que de peu  de chose, un rien suffit, et quand un
rien suffit on n'a pas besoin de grand-chose." -- Pierre Dac



Dynamically-generated navigation bar

2000-01-09 Thread Don Schwarz

I'm trying to quickly create a web site for a project that I'm going to be
working on for the next few months.  So far, mod_perl has been a wonderful
help.  However, I now need to implement a nice site-wide navigation bar.
The site will be changing quickly and I don't want to spend much time
maintaining it, so I'd like the navigation bar to be generated
automatically.

For sites like this in the past, I've rolled my own mod_perl module to
look for certain requests and massage the HTML slightly to append a
navigation bar to the edge that's dynamically generated on each request.
However, this site will be hit a lot harder than those other sites were
and I'd like something a little more elegant.
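
For flavour, that old approach boiled down to something like the sketch
below; the names are made up and it is heavily trimmed down from the real
thing:

package My::NavBarAppend;
use strict;
use Apache::Constants qw(OK DECLINED NOT_FOUND);
use Apache::File ();

sub handler {
    my $r = shift;
    return DECLINED unless $r->content_type eq 'text/html';

    my $fh = Apache::File->new($r->filename) or return NOT_FOUND;
    my $html = do { local $/; <$fh> };

    # splice a dynamically built bar in just before </body>
    my $navbar = navbar_for($r->uri);
    $html =~ s{</body>}{$navbar</body>}i;

    $r->send_http_header('text/html');
    $r->print($html);
    return OK;
}

# trivial placeholder -- the real version walked the site hierarchy
sub navbar_for {
    my $uri = shift;
    return qq{<hr><p><a href="/">Home</a> &middot; you are at $uri</p>};
}

1;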

I know about the Apache::NavBar module that is mentioned in Doug and
Lincoln's book; however, I am looking for more of a hierarchy-based list
that expands as people navigate it.  In addition, I would like to have a
site map produced automatically.

I would actually love to write something like this myself, but with the
short timescale that I'm on, I would really prefer not to reinvent the
wheel right now.  So, does anyone know of something out there that can do
this? (not necessarily even mod_perl-based, I'd settle for something under
mod_jserv at this point).  If not, how much interest is there in me
publishing something like this to CPAN if I get it written?  If I have the
time, I'd even like to write a few scripts to use the GIMP to create
graphical navbars for the site (ala GIMP's web site).

Thanks in advance,
Don



Re: Caching with $r->no_cache(1)

2000-01-09 Thread Randy Harmon

On Sun, Jan 09, 2000 at 08:45:11PM +, G.W. Haywood wrote:
 On Fri, 7 Jan 2000, Randy Harmon wrote:
 
  Does anybody have experience detecting such a condition, perhaps through one
  of the client headers?  I haven't had a chance to dump them - many hats.
 
 No idea - ditto.
 
  In any case, I could use some Javascript to package up the machine's
  current time and send it back to the server, for instance at the
  same point where I check whether a cookie was accepted.  That'd
  indicate their "Javscript-OK"-ness too.  I think I'm willing to
  assume that someone clueful enough to turn off Javascript is clueful
  enough to have the correct time.
 
 You might want to look at the excellent ntpd documentation which talks
 about things like network delays.  I think your Javascript idea is as

Fortunately, network lag only works in our favor when it comes to this
technique.  So it expires a few hundred milliseconds in the past instead of
"now"... no biggie to me.

 good a solution as you're going to get until the Web Comes Of Age.
 Don't know what you're going to do when I visit with Lynx though...

I'm going to hope that Lynx is smarter than Netscape on this point, and
assume that you're clueful enough to have the correct time. 

 Well, at least my clock _will_ be right, I run a level 3 timeserver.

Heh... I'm with you.  Oh, good, my assumption was right. :)

Randy