Strange mod_perl2/Apache2 behavior -- I think

2003-08-20 Thread Bill Rini
Here's a little snippet from a sample script:

package MyApache::MyPackage;

use strict;
use warnings;
use Apache::RequestRec ();
use Apache::RequestIO ();
use Apache::DBI;
use APR::Table;
use Apache::Const qw(OK REDIRECT DECLINED);
use Apache::RequestUtil ();
sub handler {

    my $r = shift;
    my $current_uri = $r->uri;
    my $content_type = $r->content_type;
    my @fields = split(/\//, $current_uri);

    # Let's see if the file exists first
    return Apache::DECLINED unless !-e $r->filename;

    # other stuff happens past this point but this is enough to document the problem.
}

Here are the relevant settings from my config file:

 <Location /b>
   PerlSendHeader On
   SetHandler perl-script
   PerlHandler MyApache::MyPackage
 </Location>
OK, so this is set up as a PerlResponseHandler on a virtual server. When the 
handler is invoked, it loads the page (a PHP page) but doesn't run it through 
PHP before sending it to the browser; I just get back the raw PHP code. 
Actually, this is somewhat misleading: Opera displays the code, while IE 
asks me whether I want to download it or open it. When I remove the handler 
config from the conf files, Apache handles it just as it should (it parses 
the PHP correctly), but with the handler in there, when I go to 
http://localhost/b/b.php I get:

<?php

print "Hello World";

?>

instead of the actual "Hello World" output.  Here are the results 
from a telnet session:

GET /b/b.php HTTP/1.1
Host: localhost
HTTP/1.1 200 OK
Date: Wed, 20 Aug 2003 19:54:02 GMT
Server: Apache/2.0.47 (Unix) mod_perl/1.99_09 Perl/v5.8.0 mod_ssl/2.0.47 
OpenSSL/0.9.7a DAV/2 PHP/4.3.2
Last-Modified: Wed, 20 Aug 2003 10:45:31 GMT
ETag: 21c229-3f-ef2ea8c0
Accept-Ranges: bytes
Content-Length: 63
Content-Type: application/x-httpd-php

<?php

print "Hello World";

?>
Connection closed by foreign host.
When I telnet into the machine and call the file directly, it seems 
that the proper content type is being set, but in the browser (as 
mentioned above) IE reports the content-type as blank (there's a null value 
in the popup box for content-type when it asks me to download or open the 
file). No errors appear in the log files.

Now, I know the handler is executing, because it actually processes the code 
below that line, which ends in a redirect.  But if I try to exit the handler 
because it's not the correct type of request, that seems to mess up the proper 
handling of PHP and HTML files.  Image files seem to appear correctly, but that 
may be the browser compensating or reading the encoding type.

Of course, I'm open to easier suggestions for what I'm trying to do:

I want the document root directory to be handled by the PerlResponseHandler 
(i.e. <Location />). For specific types of requests I would like the above 
handler to look up some information in the database and return the info 
dynamically. The only time I don't want that to happen is if the file 
already exists OR the request points to an existing directory that has a 
default document in it (i.e. index.html or index.php). Everything else I 
want the handler to do something special with.

So, for instance, if I have the following files:

/index.html
/somefile.html
/somedir/index.html
/somedir/somefile2.html
Any request to:

http://localhost/
http://localhost/index.html
http://localhost/somefile.html
http://localhost/somedir/
http://localhost/somedir/index.html
http://localhost/somedir/somefile2.html
Should all just be handled normally. But if I get a request for:

http://localhost/abc/def/
or
http://localhost/somedir/abcdef
I want the above perl handler to parse the uri and do
something different (directories /abc/def and /abcdef/ would not
actually exist).
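A minimal sketch of that dispatch, written as a PerlTransHandler (the phase suggested elsewhere in the thread). The package name and the mp2 1.99-era API calls here are assumptions for illustration, not tested code:

```perl
# Hypothetical sketch (mp2 1.99-era API): decline when the file or
# directory exists so Apache/PHP serve it normally; otherwise take over.
package MyApache::Dispatch;

use strict;
use warnings;
use Apache::RequestRec ();
use Apache::RequestUtil ();
use Apache::Const -compile => qw(OK DECLINED);

sub handler {
    my $r = shift;

    # A TransHandler runs before URI-to-filename translation,
    # so map the URI to a filename ourselves.
    my $file = $r->document_root . $r->uri;

    # Existing files and directories: let the default chain handle them.
    return Apache::DECLINED if -e $file;

    # Everything else: install our own response handler.
    $r->handler('perl-script');
    $r->set_handlers(PerlResponseHandler => \&respond);
    return Apache::OK;
}

sub respond {
    my $r = shift;
    $r->content_type('text/plain');
    print "dynamic content for ", $r->uri;
    return Apache::OK;
}

1;
```

Because existing paths are declined in the translation phase, PHP and static files never reach the Perl response handler, which avoids the content-type problem above.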
TIA for any help on this.

Bill




DISCLAIMER:  The views expressed by the author of this email may or may not 
be his/her views, the views of any company s/he represents, or the views of 
any persons living, dead, or undead. This email message, including any 
attachments, is for the sole use of the intended recipient(s).  If you have 
received this e-mail in error do not open it.  Any errors in facts, 
spelling, or tact are transmission errors caused by your email client.

http://www.windowsix.com




--
Reporting bugs: http://perl.apache.org/bugs/
Mail list info: http://perl.apache.org/maillist/modperl.html


Re: Strange mod_perl2/Apache2 behavior -- I think

2003-08-20 Thread Bill Rini
Hi Stas,

Thanks for the help.  Changing it to the PerlTransHandler did the trick.  
The weird part about it (to me) is that I was basically extending some code 
I had written about 2 years ago for a client and just changing the 
functionality around some.  The meat of the program was pretty much the 
same, except my data structures are a little different, and instead of 
delegating errors to my own custom code I just wanted them to default to 
what was configured in the server settings.  Obviously that was on mp1 and 
Apache 1.x, but it worked perfectly there, and the only mods I had made 
to the mp2 version were the headers_out stuff and some minor compatibility 
tweaks.  Oh well.  I guess I'll be a little more careful reading the 
mod_perl docs this time around :-)

Thanks,

Bill










From: Stas Bekman [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
CC: [EMAIL PROTECTED]
Subject: Re: Strange mod_perl2/Apache2 behavior -- I think
Date: Wed, 20 Aug 2003 14:21:54 -0700
[...]
 <Location /b>
   PerlSendHeader On
   SetHandler perl-script
   PerlHandler MyApache::MyPackage
 </Location>
[...]
I want the document root directory to be handled by the 
PerlResponseHandler (i.e. <Location />). For specific types of requests I 
would like the above handler to look up some information in the database 
and return the info dynamically. The only time I don't want that to happen 
is if the file already exists OR it's pointing to a directory that exists 
that has a default document type in it (ie index.html or index.php). 
Everything else I want the handler to do something special with.
You probably want to write a PerlTransHandler, not a response handler, if 
you are after dispatching. There are plenty of examples in the books and on 
perl.apache.org.

If you meant something else, please give us some more meat, your example 
lacks the logic that you may have the problem with.

__
Stas Bekman            JAm_pH -- Just Another mod_perl Hacker
http://stason.org/ mod_perl Guide --- http://perl.apache.org
mailto:[EMAIL PROTECTED] http://use.perl.org http://apacheweek.com
http://modperlbook.org http://apache.org   http://ticketmaster.com







Re: [mp2 Patch] BUG with mod_deflate and $|=1 (20014:Error string not specified)

2003-08-14 Thread Bill Marrs

Try this patch:
[...]
Feel free to submit this bug report and the fix to httpd-dev. Please let 
me know if you do that, so I won't duplicate it. But I'd prefer that you 
do it so you can make sure that it gets fixed in the next release, since 
you need it working.
I've just verified that your patch fixes my problem.

I've never submitted a bug report and fix to httpd-dev, but I'm willing to 
do it.

How do I do it?

-bill




Apache::DBI in /perl-status?

2003-08-14 Thread Bill McGonigle
I must be missing something obvious, but I can't get the friendly 
Apache::DBI status item to show up in /perl-status like I read about.

I can see the module loaded at:
 /perl-status/?Apache::DBI
and other modules, e.g. HTML::Mason, have installed their status pages, 
so I'm assuming Apache::Status is OK.

I have in my perl.conf, at the top now:

-
PerlModule Apache::Status
PerlModule Apache::DBI
PerlRequire conf/startup.pl
-
then later:

-
<Location /perl-status>
  SetHandler  perl-script
  PerlHandler Apache::Status
</Location>
-
and in my startup.pl:

-
BEGIN {
use Apache;
}
use Apache::Status;
use Apache::DBI;
-
This is on:   Embedded Perl version v5.8.1 for Apache/1.3.28 (Unix) 
mod_perl/1.28

From what I've read on the list archives and the Apache::DBI man page, 
this setup is about right.

I've run Apache::DBI in debug mode, and it's certainly doing its job.

Does anybody see what I did wrong?
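For what it's worth, the Apache::DBI documentation stresses that Apache::Status must be loaded before Apache::DBI for the menu item to register. The config above has that order, but it's worth checking that nothing else (a handler module, or another PerlModule/PerlRequire that pulls in Apache::DBI indirectly) loads it earlier. A sketch of the documented ordering (the startup path is illustrative):

```apache
# Apache::Status must come first, before Apache::DBI and before any
# module that loads Apache::DBI indirectly (e.g. a Mason handler module).
PerlModule  Apache::Status
PerlModule  Apache::DBI
PerlRequire conf/startup.pl
```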

Thanks,
-Bill


Re: [mp2 Patch] BUG with mod_deflate and $|=1 (20014:Error string not specified)

2003-08-14 Thread Bill Marrs

Please report to the list the bug id so we can document this issue for 
those who have the same problem with older httpds. Thanks.
OK, I've posted it.

http://nagoya.apache.org/bugzilla/show_bug.cgi?id=22259

Thanks for the fix!

-bill



Re: [Fwd: Call for Participation for ApacheCon US 2003]

2003-07-22 Thread Bill Weinman
At 04:14 AM 7/22/2003, Stas Bekman wrote:
If you would like to be a speaker at the ApacheCon US 2003
event, please go to the ApacheCon Web site, log in, and choose
the 'Submit a CFP' option from the list there:
I went through the process of creating an account only to find that the 
'Submit a CFP' option isn't there.

Until they get that fixed, you can select 'Contact Us' from the bottom 
menu; then under the cfp@ address there's a link to the online 
submission form.

--Bill



---
  Bill Weinman   http://bw.org/
  Music  http://music.bw.org/
  Whois Client   http://whois.bw.org/
  Music Database http://www.webmusicdb.com/
  --+



Re: Apache 1.3.27 configure error with mod_perl 1.28, perl 5.8.0, gcc 3.3 on Solaris 2.8

2003-07-22 Thread Bill Weinman
At 03:50 PM 7/22/2003, Chris Fabri wrote:
helpers/dummy.c   -lsocket -lnsl -lpthread -W1,-E 
-L/usr/local/lib/gcc-lib/sparc-
..^^^

I think your problem is with the stray comma in the command line there ... 
I think if you check all your configurations and make files and get that 
fixed, it will work (at least get beyond that error).

--Bill






Re: Apache 1.3.27 configure error with mod_perl 1.28, perl 5.8.0, gcc 3.3 on Solaris 2.8

2003-07-22 Thread Bill Weinman
At 04:45 PM 7/22/2003, Chris Fabri wrote:
I'm not even getting as far as the make when I get the error. If I 
build mod_perl separately, remove all references to these flags from 
the makefiles, and then run apache's config, I still get this error during 
configuration.
Where do you find those flags in the makefiles? I don't see them anywhere 
in my copy.

--Bill






Re: [mp2 Patch] BUG with mod_deflate and $|=1 (20014:Error string not specified)

2003-07-21 Thread Bill Marrs

I can measure it myself if you can provide me with URLs to your resources
and identify them in terms of which one is mod_CGI and which is mod_perl.
This is the mod_cgi one that works fine, no errors:
http://shevek.kenyonhill.com/cgi/test.pl
This is the mod_perl one (same script) that generates the 20014:Error in 
the error_log.  Also, the page doesn't display correctly (it seems to erase 
itself):

http://shevek.kenyonhill.com/perl/test.pl

This is the contents of test.pl:

---
#!/usr/bin/perl
$|=1;
print "Content-Type: text/html\n\n";
print "hello world<P>";
# This line causes the error
print "";
---
Let me know if you need anything more.



Re: [mp2 Patch] BUG with mod_deflate and $|=1 (20014:Error string not specified)

2003-07-21 Thread Bill Marrs

We can see that mod_cgi bufferizes the output and sends it with
Content-Length HTTP header (to mod_deflate). Indeed mod_perl generates
chunked response. Finally we have the same result. I don't see any problem
at this moment.
Well, the problem is that I get this error in my error_log:

[Mon Jul 21 14:18:55 2003] [error] 4297: ModPerl::RegistryBB: 20014:Error 
string not specified yet at /var/www/perl/test.pl line 6.

Also, more important, the script seems to be terminating and/or any output 
following the 'print ""' is not sent to the client, as far as I can tell.

You would probably wish to append your script with
additional output after the empty string? Something like:
#!/usr/bin/perl
$|=1;
print "Content-Type: text/html\n\n";
print "hello world<P>";
# This line causes the error (?)
print "";
print "hello again<P>";
---
When I do this, the mod_perl variant of the script fails to print "hello 
again<P>".

mod_cgi prints everything just fine and gets no errors.

I changed my test script to print a bunch of `date`'s
http://shevek.kenyonhill.com/cgi/test.pl
http://shevek.kenyonhill.com/perl/test.pl
It may cause a problem for chunked output if mod_deflate does not care to
keep internal buffer and check its size when flushing...
I may not be understanding the output you sent or what you're saying, but I 
still don't follow why this would be a mod_deflate bug if mod_cgi with the 
same script has no problem.





Re: [mp2 Patch] BUG with mod_deflate and $|=1 (20014:Error string not specified)

2003-07-15 Thread Bill Marrs
At 04:24 AM 7/15/2003, Stas Bekman wrote:
Philippe M. Chiasson wrote:
On Thu, 2003-07-03 at 01:24, Bill Marrs wrote:

This fixed the bug for me.
Great! Will commit it in the near future. (Can't seem to access the cvs
server right now, crappy internet cafe)
-1, this is a wrong solution. print ""; should flush just like it did in 
mod_perl 1.0 if $| == 1. Consider this script:

print "Content-type: text/plain\n\n";
local $| = 0;
print "start\n\n";
local $| = 1;
print "";
sleep 5;
local $| = 0;
print "end\n";
print "" must immediately flush the buffered data, since $| has changed 
from 0 to 1.
This may be naive, but might it not flush the output buffer at the 4th line 
(local $| = 1;)?  ...or does the flush only happen when print is 
called?  Having to call print "" seems cumbersome to do a flush, but maybe 
that's just the way Perl works?
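On the question of whether the assignment itself flushes: perlvar documents that setting $| to nonzero "forces a flush right away and after every write" on the currently selected handle, so in plain Perl (outside mod_perl's tied handle) the flush should not require an empty print. A tiny illustration:

```perl
# Plain-Perl behavior per perlvar: the assignment itself flushes.
$| = 0;
print "buffered\n";    # may linger in the stdio buffer
$| = 1;                # forces a flush right away...
print "unbuffered\n";  # ...and after every subsequent write
```

Whether mod_perl's tied STDOUT mirrors that flush-on-assignment behavior is exactly what this thread is probing.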

One thing that could help is if someone could take the time to write a
test for this bug.
Unfortunately I don't seem to be able to reproduce the problem, so I can't 
debug the problem. It could be a bug on the mod_deflate's behalf as well. 
Philippe, were you able to reproduce the problem with Bill's setup? I was 
writing a test, but couldn't get it to fail... maybe because I was using 
2.0.47. Bill, do you have the problem with the latest apache version?
Did you see my note in the original post about this working fine under 
mod_cgi, but causing the 20014:Error only under mod_perl?  This seemed to 
point the finger squarely at mod_perl.

I could upgrade to 2.0.47, but it seems unlikely that it would fix 
this.  Are you sure you're running a mod_perl without Philippe's fix (in 
Apache__RequestIO.h)?  I assumed he eventually checked it in.

Also Bill, why do you have this setup:

<Location /perl>
  AddOutputFilterByType DEFLATE text/*
  SetOutputFilter DEFLATE
</Location>
why add it twice? You need only the latter inside <Location>, or 
alternatively only the former outside <Location> if you want it to be set 
globally:
http://httpd.apache.org/docs-2.0/mod/core.html#addoutputfilterbytype
Ah, I misunderstood the mod_deflate docs.  I think at the time, it didn't 
seem to work with just one of them in-place, so I added the 
other.  *SHRUG*  I can't say I'm a pro at Apache config files, I just 
tinker until it works.  I assume this is irrelevant to the bug, though.

-bill





Re: [mp2 Patch] BUG with mod_deflate and $|=1 (20014:Error string not specified)

2003-07-15 Thread Bill Marrs

I could upgrade to 2.0.47, but it seems unlikely that it would fix 
this.  Are you sure you're running a mod_perl without Philippe's fix (in 
Apache__RequestIO.h)?  I assumed he eventually checked it in.
No, Philippe hasn't committed it, nor have I used it. If you can test 
with 2.0.47 that will help. Otherwise I'll later try with .46 as well.
I just upgraded to Apache 2.0.47 and the latest CVS of mp2 and I'm 
reproducing it, same as I originally reported.

I tried a few variations to see if I could find other factors, but didn't 
have much luck.  The server I'm testing on is live (runs a small site), so 
I'm somewhat limited in what I can do.  But, I tried removing mod_rewrite, 
and mod_ssl from my server config and I still got the error in my tests.

I also tried varying the mod_deflate config (as you pointed out it was 
redundant/wrong)...

Also Bill, why do you have this setup:

<Location /perl>
  AddOutputFilterByType DEFLATE text/*
  SetOutputFilter DEFLATE
</Location>
why add it twice? You need only the latter inside <Location>, or 
alternatively only the former outside <Location> if you want it to be 
set globally:
http://httpd.apache.org/docs-2.0/mod/core.html#addoutputfilterbytype
Ah, I misunderstood the mod_deflate docs.  I think at the time, it didn't 
seem to work with just one of them in-place, so I added the 
other.  *SHRUG*  I can't say I'm a pro at Apache config files, I just 
tinker until it works.  I assume this is irrelevant to the bug, though.
I'm not sure if your config doesn't insert the filter twice. Need to check 
whether SetOutputFilter overrides AddOutputFilterByType as well.
In my config, I have a deflate log:

DeflateFilterNote Input instream
DeflateFilterNote Output outstream
DeflateFilterNote Ratio ratio
LogFormat '"%r" %{outstream}n/%{instream}n (%{ratio}n%%) "%{User-agent}i"' deflate
CustomLog /var/log/httpd/deflate_log deflate

I was using this to verify whether deflate was active.  What I found 
was that using "AddOutputFilterByType DEFLATE text/*" doesn't seem to 
activate deflate, no matter where I put it (in a <Location> or at the 
top level of the config); my deflate log shows that no compression is 
occurring.  "SetOutputFilter DEFLATE" does activate deflate globally (for 
all locations), no matter where I put it.  I'm afraid I don't find the 
mod_deflate docs very clear on placement.

When you were trying to reproduce, I don't know if you used my (redundant) 
config, but if you only used "AddOutputFilterByType DEFLATE text/*" 
and not "SetOutputFilter DEFLATE", that might explain why you were not able 
to reproduce this; I don't think that activates deflate (at least that's 
what my deflate_log shows for me).

Otherwise, if you're still not reproducing this, I would assume there's 
some difference in the way we are building apache, or in our apache config 
files, or in perl?

Here's my Apache Configure line:

./configure --enable-modules=all --enable-mods-shared=all --enable-deflate 
--with-mpm=prefork --enable-rewrite --enable-ssl

Here are the Modules I load:

LoadModule cgi_module modules/mod_cgi.so

LoadModule access_module modules/mod_access.so
LoadModule log_config_module modules/mod_log_config.so
LoadModule env_module modules/mod_env.so
LoadModule expires_module modules/mod_expires.so
LoadModule headers_module modules/mod_headers.so
LoadModule setenvif_module modules/mod_setenvif.so
LoadModule mime_module modules/mod_mime.so
LoadModule autoindex_module modules/mod_autoindex.so
LoadModule asis_module modules/mod_asis.so
LoadModule info_module modules/mod_info.so
LoadModule vhost_alias_module modules/mod_vhost_alias.so
LoadModule dir_module modules/mod_dir.so
LoadModule alias_module modules/mod_alias.so
LoadModule rewrite_module modules/mod_rewrite.so
LoadModule perl_module modules/mod_perl.so
LoadModule ssl_module modules/mod_ssl.so
mod_cgi, mod_ssl and mod_rewrite seem to be innocent (when I removed them, 
it still failed), but maybe you're using a module that I'm not, and this is 
causing your config to work better?   I think I'm loading fewer modules 
than is typical; I removed as many as I could a while ago to reduce memory 
usage.

*SHRUG*

One thing that may be relevant is tracking down where the 20014:Error is 
coming from.  Web searching seems to associate it with Berkeley DB code, 
which seems odd to me.  Is Apache using Berkeley DB code somewhere (via a 
filter?)?

This isn't a very important issue for me.  I have a decent workaround 
(print " " instead of ""), plus I'm not really able to use mod_deflate that 
much anyway because it puts too much load on my server (mod_gzip with 
Apache 1.3 worked better for me).

Thanks

-bill








Re: Memoize.pm and mod_perl

2003-07-14 Thread Bill Marrs
I don't know anything about Memoize, but perhaps db-level caching would work 
for you?

If you use MySQL, version 4.0.1 and beyond has query caching capabilities 
built into it.

http://www.mysql.com/documentation/mysql/bychapter/manual_Reference.html#Query_Cache
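To actually turn the query cache on, MySQL 4.0.1+ uses a couple of server variables; the sizes below are illustrative, not recommendations:

```ini
# my.cnf fragment (illustrative sizes): enable the MySQL 4.0.1+ query cache
[mysqld]
query_cache_type = 1     # 1 = cache all cacheable SELECTs
query_cache_size = 16M   # 0 disables the cache entirely
```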

-=bill



[mp2] BUG with mod_deflate and $|=1 (20014:Error string not specified)

2003-07-02 Thread Bill Marrs
When I use Apache 2.0.46 and mod_deflate with mod_perl-1.99_09 (or the latest 
mod_perl-2.0 CVS), perl buffering is off ($|=1), and my perl script prints 
an empty string (e.g. 'print "";'), I get the following error:

[Wed Jul 02 10:10:00 2003] [error] 19513: ModPerl::RegistryBB: 20014:Error 
string not specified yet at /var/www/perl/test.pl line 6.

If I switch to running the script under mod_cgi or if I remove the $|=1; 
line, I do not get an error.

Here is my script:

#!/usr/bin/perl
$|=1;
print "Content-Type: text/html\n\n";
print "hello world<P>";
# This line causes the error
print "";
httpd.conf snippet:

Alias /perl/ /var/www/perl/
<Location /perl>
  AddOutputFilterByType DEFLATE text/*
  SetOutputFilter DEFLATE
  AllowOverride  None
  SetHandler     perl-script
  PerlHandler    ModPerl::RegistryBB
  PerlSendHeader On
  Options        +ExecCGI
</Location>
I've worked around this problem by changing my print "" to print " ".  It's 
not a major issue for me, I'm just letting you know.  Let me know if you 
need any more info.

-bill



Re: [mp2 Patch] BUG with mod_deflate and $|=1 (20014:Error string not specified)

2003-07-02 Thread Bill Marrs
This fixed the bug for me.

At 10:48 AM 7/2/2003, you wrote:
 #define mpxs_output_flush(r, rcfg) \
 /* if ($|) */ \
-if (IoFLUSH(PL_defoutgv)) { \
+if (bytes > 0 && IoFLUSH(PL_defoutgv)) { \
 MP_FAILURE_CROAK(modperl_wbucket_flush(rcfg->wbucket, TRUE)); \
 }



Best compression for mod_perl application?

2003-07-01 Thread Bill Marrs
I used to use mod_gzip under Apache 1.3, and it worked great, saving me 
over 50% of my bandwidth for my mod_perl generated pages.

But it appears that mod_gzip doesn't work with Apache 2.  Apache 2 has a 
built-in mod_deflate, but I've had some trouble with it (it seemed to cause 
a load spike on my server, plus errors if I print "").

I recall there used to be alternatives to mod_gzip out there, but I'm not 
sure if they apply to Apache 2.

Do any of you use compression on your mod_perl pages?

Do you recommend any compression schemes for the Apache 2/mp2 environment?

TIA,

-=bill



Re: Best compression for mod_perl application?

2003-07-01 Thread Bill Marrs

It would be of real interest to me to know as many details of Bill's
experience with mod_deflate as he can provide.
Since I posted my first message, I've been snooping around the 'net to find 
more info on mod_gzip and mod_deflate.  Here's what I came up with:

The general recommendation seems to be migration from mod_gzip to 
mod_deflate when you switch to Apache 2.0.  mod_gzip seems to have lost 
most of its support going forward while mod_deflate is part of the Apache 
source code and has active development.

There is an Apache 2.0-compatible version of mod_gzip, here:
http://www.gknw.de/development/apache/httpd-2.0/unix/modules/
When I tried it, it didn't work for me.  It caused my site to spit out 
blank pages and garbage.  I had used my old Apache 1.3 mod_gzip config with 
it.  I read that the Apache 2.0 version of mod_gzip branched a long time 
ago and thus doesn't have some of the modern mod_gzip 1.3.x features.  I 
didn't get config errors, though, just blanks and garbage.  So, I decided 
to back away slowly from mod_gzip on Apache 2.0.  There is more discussion 
of it here:
http://freshmeat.net/projects/mod_gzip/?topic_id=90

There's a good mod_gzip info page here, though little is said about a 2.0 
version:
http://www.schroepl.net/projekte/mod_gzip/index.htm

The mod_gzip mailing list has some good info.  Here's a 26 Jun 2003 post by 
someone who seems to know well what's going on (I think the author of the 
above page):
Subject: [Mod_gzip] gzip vs deflate on Apache
http://lists.over.net/pipermail/mod_gzip/2003-June/007130.html

So, I decided to try harder to move ahead with mod_deflate.  I'm using a 
built-from-scratch Apache/2.0.46 with mod_perl 1.99_09.  Work is being done 
on mod_deflate; some new directives have been added (I hear), one of which 
is DeflateCompressionLevel.  Along with this addition in 2.0.44 came a 
better default for the compression level.  It's now 6, the same as 
gzip and zlib use by default.  Apparently, it had been 1 before that, 
which is fast but doesn't compress very well.  There's some discussion of 
this here:
http://www.webcompression.org/deflate-compress.html

My own personal experience with mod_deflate (in Apache/2.0.46) is that it 
tends to spike my server's load.  My server (gametz.com) is a dual 800MHz 
box with 1.5GB RAM, running Linux, doing about 70K pages/day.  Last night, 
I happened to be watching it while the load jumped up a few points during 
my site's prime time, so I pulled mod_deflate out of the config file and 
that fixed it.

So, today, I'm trying a lower DeflateCompressionLevel.  I'm using 4 now 
(instead of the default 6).  This seems better, though the load is still a 
little higher than it should be, and I'm not quite at prime time 
yet.  Still, I am getting decent compression.  I'm going to keep an eye on 
it; I suspect I'll be at 3 later this evening.

I never had any trouble with load when I used mod_gzip and Apache 1.3.

The other odd problem I got was that if anywhere in my perl code I printed 
an empty string (e.g. print "" or $foo=""; print $foo), I'd get this error:

error: 20014:Error string not specified yet at /my/perl/code.pl line 123

This error was both blurted to the error_log and to the web page (screwing 
up the page and truncating further output).

I changed my code to print " " instead of "" (HTML ignores extra 
white-space, so no biggie), and the errors all went away.  So, I see this 
as an annoyance more than a serious bug.

I really should try to tell the author of mod_deflate about these problems.

Here's the config I'm using for mod_deflate:

#
## Deflate
#
LoadModule deflate_module modules/mod_deflate.so
AddOutputFilterByType DEFLATE text/*
SetOutputFilter DEFLATE
# Make sure proxies don't deliver the wrong content
Header append Vary User-Agent env=!dont-vary
DeflateBufferSize 8096
# DeflateCompressionLevel 6
DeflateCompressionLevel 4
DeflateMemLevel 9
DeflateWindowSize 15
DeflateFilterNote Input instream
DeflateFilterNote Output outstream
DeflateFilterNote Ratio ratio
LogFormat '"%r" %{outstream}n/%{instream}n (%{ratio}n%%) "%{User-agent}i"' deflate
CustomLog /var/log/httpd/deflate_log deflate

All of which I cribbed from the Apache 2.0 manual:
http://httpd.apache.org/docs-2.0/mod/mod_deflate.html
I sort of got forced into upgrading to Apache 2.0/mp2/etc. by RedHat.  They 
announced they would pull support for old releases (all that used Apache 
1.3) by the end of the year.  Apparently, this may be an intentional 
(evil?) business move by them to motivate more customers to move to their 
Enterprise OS (which is very expensive, but has more stable software like 
good old Apache 1.3 and mp1).

I did try to go back at one point, building Apache 1.3 from source, but it 
had some other problem (maybe because I used Perl 5.8.0?).  But then I 
waffled and decided there's also a lot of value in staying current.  So, 
I'm back in 2.0 land, and I'm surviving so far.

-=bill










Re: Best compression for mod_perl application?

2003-07-01 Thread Bill Marrs

1. Are you using any Cascading Style Sheets and/or JavaScript libraries
linked to your main web pages?
I'm not, but... I think mod_deflate's hook runs after all that is 
processed, so it's not especially relevant.

2. If yes, how do you turn compression off for those files in the case of
a Netscape 4-originated request?
http://httpd.apache.org/docs-2.0/mod/mod_deflate.html
says to use this sort of thing:
# Netscape 4.x has some problems...
BrowserMatch ^Mozilla/4 gzip-only-text/html
# Netscape 4.06-4.08 have some more problems
BrowserMatch ^Mozilla/4\.0[678] no-gzip
# MSIE masquerades as Netscape, but it is fine
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
# Don't compress images
SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary
# Make sure proxies don't deliver the wrong content
Header append Vary User-Agent env=!dont-vary





submit input truncation

2003-06-25 Thread Bill Marrs
I recently moved to Apache 2 + mod_perl2 and things have been working 
fairly well.

But, I'm getting an intermittent problem with POSTs where the input data is 
being truncated.  This causes havoc, especially in my forum system.

I use CGI.pm, mod_perl 2, Apache 2, and the standard RH9 Perl (which is 
threaded) - I think that's all the relevant stuff.

Has anyone else seen this?  Is there some fix for it?

I'm not even sure where the problem is yet.

Thanks in Advance.

-bill



Re: submit input truncation

2003-06-25 Thread Bill Marrs
At 04:48 PM 6/25/2003, Haroon Rafique wrote:
I think you should take a look at the list archives for this thread:
http://mathforum.org/epigone/modperl/bindondrei
Thank You!

This is very likely the problem I'm having.

Secondly, you should also post the version numbers for all of the
above-mentioned packages. As mentioned in one of the emails this was fixed
in mod_perl 1.99_09.
I'm using what comes with RH9:

httpd-2.0.40-21.3
mod_perl-1.99_07-5
perl-5.8.0-88
plus, this which is the latest from cpan:
CGI.pm-2.97
Clearly, the first thing I should try is mod_perl 1.99_09.

Now, I'm wondering if someone has made a RH9 friendly mod_perl-1.99_09 rpm...

As always, I'm trying desperately to avoid rebuilding 
apache/perl/mod_perl/etc. on my systems.

-bill



mod_perl-1.99_09 for Redhat 9

2003-06-25 Thread Bill Marrs
I'm looking for a Redhat 9 compatible mod_perl-1.99_09 rpm.

If anyone has one or knows where I can get one, let me know.

Thanks,

-bill

p.s.  I did find a Rawhide (bleeding edge Red Hat release, I think) 
mod_perl-1.99_09, but it doesn't seem to be compatible (I got an error from 
Apache).



2nd perl install?

2003-06-25 Thread Bill Marrs
If I can't find the mod_perl rpm I need, it's looking like I might need to 
build it from source.  I believe this would mean also building Perl and 
Apache from source.

Now, I have a redhat-installed Perl already.  From past discussions, the 
best idea seemed to be to create a 2nd perl installation for 
Apache/mod_perl to use, perhaps located in /usr/local.

The thing I'm fuzzy on is how to tell my mod_perl programs to use this 
mod_perl.  Is it enough to have the LoadModule line point to my 
build-from-scratch mod_perl.so?

Also, if I want to use this 2nd perl for scripts, is it as simple as 
changing the top line to #!/usr/local/bin/perl?  Do I have to worry about 
library paths or environment variables?

I assume I'll need to keep separate lib/perl5 trees as well and I'll need 
to install the various cpan modules I use twice.
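A sketch of how the pieces connect, assuming the second perl lands in /usr/local (paths illustrative): mod_perl embeds the interpreter it was built against, so the LoadModule line pointing at the newly built mod_perl.so is what selects the second perl for handlers.

```apache
# httpd.conf: this mod_perl.so was built against /usr/local/bin/perl,
# so handlers run under that perl and see its lib/perl5 tree via @INC.
LoadModule perl_module libexec/mod_perl.so
```

Standalone scripts need only the #!/usr/local/bin/perl shebang; each perl has its own @INC compiled in, so each finds its own module tree without extra environment variables, which is also why the CPAN modules must be installed once per perl.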

Any tips would be appreciated.

-bill



Re: MaxRequestsPerChild; which request am I?

2003-04-04 Thread Bill Moseley
On Fri, 4 Apr 2003, Brian Reichert wrote:

   In messing with Apache 1.x, is there a way, via mod-perl, of a
   request knowing how many requests have been served by the current
   child?
  
  
  $request++;
  
  That's what I do in some handler, and then I log it along with the PID.
 
 Eh?  I'm confused.  What is '$request' in that example?  If you
 mean it's the request object, then that doesn't do what I expect.

No, it's a simple counter.  It's just a variable in some module that
counts requests.
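In plain Perl the idea is just a package-level variable: under mod_perl it persists for the life of the child process, so it counts the requests that child has served. A minimal sketch with invented names (a real handler would log the count with the PID and return an Apache status):

```perl
package My::Counter;
use strict;

my $requests = 0;    # package lexical: survives between requests in one child

sub handler {
    $requests++;
    # under mod_perl you would log: "pid $$ served request #$requests"
    return $requests;
}

package main;

# Simulate several requests hitting the same "child":
My::Counter::handler() for 1 .. 3;
print My::Counter::handler(), "\n";   # 4th call in this process prints 4
```

Each Apache child has its own copy of the counter, which is exactly why it answers the "which request am I?" question per child.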






-- 
Bill Moseley [EMAIL PROTECTED]



Re: MaxRequestsPerChild; which request am I?

2003-04-03 Thread Bill Moseley
On Fri, 4 Apr 2003, Brian Reichert wrote:

 Dunno if someone has a good answer, or a suggestion of a better
 forum for this:
 
 Apache has a configuration directive: MaxRequestsPerChild
 
   http://httpd.apache.org/docs/mod/core.html#maxrequestsperchild
 
 In messing with Apache 1.x, is there a way, via mod-perl, of a
 request knowing how many requests have been served by the current
 child?


$request++;

That's what I do in some handler, and then I log it along with the PID.




-- 
Bill Moseley [EMAIL PROTECTED]



Re: Basic Auth logout

2003-03-07 Thread Bill Moseley
On Fri, 7 Mar 2003, Francesc Guasch wrote:

 this has been asked before, and I've found in the archives
 there is no way I could have a logout page for the Basic
 Auth in apache.
 
 Is there nothing I can do ? This is required only for the
 development team, so we need to let mozilla or IE  forget
 about the username and password.

It all depends on the browser and version.  I have been able to logout
some versions of IE by having a link to another protected resource of the
same auth name but different username and password (in the link).

You are just better off maintaining a session on the server.

-- 
Bill Moseley [EMAIL PROTECTED]



Re: Authorization question

2003-02-27 Thread Bill Moseley
On Thu, 27 Feb 2003, Perrin Harkins wrote:

 Jean-Michel Hiver wrote:
  Yes, but you're then making the authorization layer inseparable from
  your application layer, and hence you lose the benefit of using
  separate handlers.
 
 It's pretty hard to truly separate these things.  Nobody wants to use 
 basic auth, which means there is a need for forms and handlers.  Then 
 you have to keep that information in either cookies or URLs, and there 
is usually a need to talk to an external database with a 
 site-specific schema.  The result is that plug and play auth schemes 
 only work (unmodified) for the simplest sites.

Anyone using PubCookie?

http://www.washington.edu/pubcookie/

-- 
Bill Moseley [EMAIL PROTECTED]



Is Sys::Signal still needed?

2003-02-01 Thread Bill Moseley
Searching the archives I don't see much discussion of Sys::Signal.  Is it
still needed to restore sig handlers?

Thanks,


-- 
Bill Moseley [EMAIL PROTECTED]




Re: web link broken when access cgi-bin

2002-12-22 Thread Bill Moseley
On Sunday 22 December 2002 03:49, Ged Haywood wrote:
 Hi there,
 
 On Sat, 21 Dec 2002, eric lin wrote:
 
  The image file:///home/enduser/mytest.jpg cannot be displayed, because 
  it contains errors
 
 I think I understand your question but I am not sure of it.
 
 It seems that you have sent a request to Apache, received a response,

And sent messages about using Windows to a Linux list, and CGI questions to 
the mod_perl list, and seems to ignore the many requests to read some basic 
CGI tutorials.  I'd guess troll if he wasn't so clueless. ;)



Re: web link broken when access cgi-bin

2002-12-22 Thread Bill Moseley
On Sun, 22 Dec 2002, Richard Clarke wrote:

  And sent messages about using Windows to a Linux list, and CGI questions to
  the mod_perl list, and seems to ignore the many requests to read some basic
  CGI tutorials.  I'd guess troll if he wasn't so clueless. ;)
 
 Since when did mod_perl becomes Linux only?

oops, I meant to write:

And sent messages about using Windows to a Linux list


-- 
Bill Moseley [EMAIL PROTECTED]




Re: Fw: OT - Santa uses PERL

2002-12-20 Thread Bill Moseley
At 11:17 AM 12/20/02 +0200, Issac Goldstand wrote: 
http://www.perl.com/pub/a/2002/12/18/hohoho.html


That sounds a lot like Perrin's story.  Didn't he save Christmas one year?



-- 
Bill Moseley
mailto:[EMAIL PROTECTED] 

Can't get nested files to work in Perl section

2002-12-19 Thread Bill Moseley
mod_perl 1.27 / httpd 1.3.27

In the perl httpd.conf below, test.cgi is returned as the default type,
text/plain, whereas test2.cgi is run as a CGI script.

Do I have this setup incorrectly?

In a standard httpd.conf file it's allowable to have Files sections nested
within Directory sections, of course.

 <Perl>
 #!perl
 $User = 'nobody';
 $Group = 'users';
 $ServerRoot = '/home/moseley/test';
 $TypesConfig = '/dev/null';
 $Listen = '*:8000';

 $VirtualHost{'*:8000'} = {
    ServerName   => 'foo',
    DocumentRoot => '/home/moseley/test',
    ErrorLog     => 'logs/error_log.8000',
    TransferLog  => 'logs/error_log.8000',

    Files => {
        'test2.cgi' => {
            Options    => '+ExecCGI',
            SetHandler => 'cgi-script',
        },
    },

    Directory => {
        '/home/moseley/test' => {
            Allow => 'from all',
            Files => {
                'test.cgi' => {
                    Options    => '+ExecCGI',
                    SetHandler => 'cgi-script',
                },
            },
        },
    },
 };
 </Perl>

 __END__


-- 
Bill Moseley
mailto:[EMAIL PROTECTED]



[OT] Ideas for limiting form submissions

2002-12-18 Thread Bill Moseley
I've got a mod_perl feedback form that sends mail to a specific address.
Spammers have their bots hitting the form now.  The tricks I know of are:

- generate a random image of numbers and make the user type in the numbers
on the form.  Painful for the user and spammers probably have OCR!

- require an email and send a confirmation email (like a list
subscription) and whitelist some email addresses.  But we want to allow
anonymous submissions.

- limit submissions by IP number to one every X minutes.  AOL users may
get blocked.

- md5 the submission and block duplicates (should do this anyway).  BTW --
what would you recommend for caching the md5 strings?  Cache::Cache or
DBM?  I suppose a Cache::Cache file cache would be the easiest.

Any other ideas on the easy to implement side?
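The duplicate-blocking idea above is simple to sketch. The cache here is an in-memory hash standing in for Cache::FileCache or a DBM file, and the function name is invented for the example:

```perl
use strict;
use Digest::MD5 qw(md5_hex);

# %seen stands in for a persistent cache (Cache::FileCache, DBM, etc.);
# a real version would also expire old entries.
my %seen;

sub is_duplicate {
    my ($body) = @_;
    my $key = md5_hex($body);      # digest the whole submission
    return 1 if $seen{$key};       # seen before: block it
    $seen{$key} = time;            # remember when we first saw it
    return 0;
}

print is_duplicate("buy stuff now") ? "dup\n" : "new\n";   # new
print is_duplicate("buy stuff now") ? "dup\n" : "new\n";   # dup
```

With a file-backed cache the same check works across Apache children, which a per-process hash would not.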



-- 
Bill Moseley [EMAIL PROTECTED]





Re: [OT] Ideas for limiting form submissions

2002-12-18 Thread Bill Moseley
At 02:51 PM 12/18/02 -0500, Daniel Koch wrote:
Check out Gimpy, which I believe is what Yahoo uses:

http://www.captcha.net/captchas/gimpy/

I'm thinking of something along those lines.  The problem is this is on
Solaris 2.6 without root, and I'll bet it would take some time to get The Gimp
and GTK and whatever libs installed.

So, I'm thinking about creating a directory of say 20 images of words.  On
the initial request the form creates a random key, and makes that a symlink
to one of the images selected at random.  That will be the img src link.

Then md5 the symlink with a secret word to create a hidden field.

The submitter will have to type in the word displayed in the image.

On submit md5 all the symlinks with the secret word until a match is found
-- match the submitted word text with the real image name, then unlink the
symlink and accept the request.

Cron can remove old symlinks.

If the spammers put in the work to figure out the word by check-summing the
images, I can use ImageMagick to modify the images -- that could be a nice
mod_perl handler.

See any glaring holes? 
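The hidden-field part of the scheme is easy to check in isolation: the field is the md5 of the random symlink name plus a server-side secret, and on submit the server recomputes it and compares the typed word against the real image name. The symlink name, image name, and secret below are all made up for illustration:

```perl
use strict;
use Digest::MD5 qw(md5_hex);

my $secret = 'server-side-secret';                # never sent to the client
my %symlink_to_image = ( 'k3j2h9' => 'zebra.png' );  # random key -> real image

# Value placed in the form's hidden field when the page is generated:
my $hidden = md5_hex( 'k3j2h9' . $secret );

sub verify {
    my ($hidden_field, $typed_word) = @_;
    for my $link (keys %symlink_to_image) {
        # Find which symlink produced this hidden field...
        next unless md5_hex( $link . $secret ) eq $hidden_field;
        # ...then match the typed word against the image's base name.
        (my $word = $symlink_to_image{$link}) =~ s/\.png$//;
        return lc($typed_word) eq $word;
    }
    return 0;   # no symlink matched: forged or expired form
}

print verify($hidden, 'zebra') ? "ok\n" : "fail\n";
```

One hole worth noting: without unlinking on success (as the post proposes) the same hidden field could be replayed.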

-- 
Bill Moseley
mailto:[EMAIL PROTECTED]



You can also use ps ...

2002-12-14 Thread Bill Drury

In linux:  ps -axl | grep http

... will show you process sizes.  Doing it without the grep will show you 
the column headers in the first line.




RE: Cookie-free authentication

2002-12-13 Thread Bill Moseley
On Sat, 14 Dec 2002, Ron Savage wrote:

 Under Apache V 1/Perl 5.6.0 I could not get the Apache::AuthCookieURL 
 option working which munged URLs without requiring cookies.

I thought the problem was that Apache::AuthCookie was redirecting to your
login script on logout instead of displaying your logout page.


-- 
Bill Moseley [EMAIL PROTECTED]




[mp2]: Problem running scripts with Apache::compat and PerlRun.

2002-12-06 Thread Bill Drury

-------8<---------- Start Bug Report ----------8<----------
1. Problem Description:

I've been getting a consistent error attempting to run scripts with 
PerlRun and the Apache::compat layer.  It may just be a configuration 
error, but I'd appreciate anything you could tell me.

The error:

[Fri Dec 06 01:47:27 2002] [error] [client 127.0.0.1] Use of uninitialized
value in concatenation (.) or string at
/usr/lib/perl5/site_perl/5.8.0/i386-linux-thread-multi/Apache2/Apache/compat.pm
line 257.


The httpd.conf:

LoadModule perl_module modules/mod_perl.so
PerlModule Apache2
PerlModule Apache::compat
PerlModule Apache::Status

<Location /perl-status>
SetHandler perl-script
PerlHandler Apache::Status
Order Allow,Deny
Allow from All
</Location>

... later in a VirtualHost:

Alias /perl-bin/ /home/category/cgi/
<Location /perl-bin/>
SetHandler perl-script
PerlHandler Apache::PerlRun
Options +ExecCGI
PerlSendHeader on
</Location>

... now I can get the perl-status page, so I know it's at least working in 
a minimal way.  But whenever I try to run one of the .cgi scripts in 
/home/category/cgi, I get that error above.

I'm running the CVS version, version 1.99_07 gave me the same error, but 
on line 250 of compat.pm rather than line 257.  Am I doing something 
wrong?



2. Used Components and their Configuration:

*** using lib/Apache/BuildConfig.pm
*** Makefile.PL options:
  MP_AP_PREFIX= /home/httpd
  MP_GENERATE_XS  = 1
  MP_INST_APACHE2 = 1
  MP_LIBNAME  = mod_perl
  MP_USE_DSO  = 1
  MP_USE_STATIC   = 1


*** /home/httpd/bin/httpd -V
Server version: Apache/2.0.40
Server built:   Sep 20 2002 15:43:46
Server's Module Magic Number: 20020628:0
Architecture:   32-bit
Server compiled with
 -D APACHE_MPM_DIR=server/mpm/prefork
 -D APR_HAS_SENDFILE
 -D APR_HAS_MMAP
 -D APR_HAVE_IPV6
 -D APR_USE_SYSVSEM_SERIALIZE
 -D APR_USE_PTHREAD_SERIALIZE
 -D SINGLE_LISTEN_UNSERIALIZED_ACCEPT
 -D APR_HAS_OTHER_CHILD
 -D AP_HAVE_RELIABLE_PIPED_LOGS
 -D HTTPD_ROOT=/home/httpd
 -D SUEXEC_BIN=/home/httpd/bin/suexec
 -D DEFAULT_PIDLOG=logs/httpd.pid
 -D DEFAULT_SCOREBOARD=logs/apache_runtime_status
 -D DEFAULT_LOCKFILE=logs/accept.lock
 -D DEFAULT_ERRORLOG=logs/error_log
 -D AP_TYPES_CONFIG_FILE=conf/mime.types
 -D SERVER_CONFIG_FILE=conf/httpd.conf


*** /usr/bin/perl -V
Summary of my perl5 (revision 5.0 version 8 subversion 0) configuration:
  Platform:
osname=linux, osvers=2.4.19-2mdkenterprise, archname=i386-linux-thread-multi
uname='linux no.mandrakesoft.com 2.4.19-2mdkenterprise #1 smp tue aug 13 00:17:42 
cest 2002 i686 unknown unknown gnulinux '
config_args='-des -Darchname=i386-linux -Dcc=gcc -Doptimize=-O3 
-fomit-frame-pointer -pipe -mcpu=pentiumpro -march=i586 -ffast-math 
-fno-strength-reduce -Dprefix=/usr -Dvendorprefix=/usr -Dsiteprefix=/usr -Dman3ext=3pm 
-Dcf_by=MandrakeSoft -Dmyhostname=localhost -Dperladmin=root@localhost -Dd_dosuid 
-Ud_csh -Duseshrplib -Dusethreads'
hint=recommended, useposix=true, d_sigaction=define
usethreads=define use5005threads=undef useithreads=define usemultiplicity=define
useperlio=define d_sfio=undef uselargefiles=define usesocks=undef
use64bitint=undef use64bitall=undef uselongdouble=undef
usemymalloc=n, bincompat5005=undef
  Compiler:
cc='gcc', ccflags ='-D_REENTRANT -D_GNU_SOURCE -fno-strict-aliasing 
-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -I/usr/include/gdbm',
optimize='-O3 -fomit-frame-pointer -pipe -mcpu=pentiumpro -march=i586 -ffast-math 
-fno-strength-reduce',
cppflags='-D_REENTRANT -D_GNU_SOURCE -fno-strict-aliasing -I/usr/include/gdbm'
ccversion='', gccversion='3.2 (Mandrake Linux 9.0 3.2-1mdk)', gccosandvers=''
intsize=4, longsize=4, ptrsize=4, doublesize=8, byteorder=1234
d_longlong=define, longlongsize=8, d_longdbl=define, longdblsize=12
ivtype='long', ivsize=4, nvtype='double', nvsize=8, Off_t='off_t', lseeksize=8
alignbytes=4, prototype=define
  Linker and Libraries:
ld='gcc', ldflags =' -L/usr/local/lib'
libpth=/usr/local/lib /lib /usr/lib
libs=-lnsl -lndbm -lgdbm -ldl -lm -lpthread -lc -lcrypt -lutil
perllibs=-lnsl -ldl -lm -lpthread -lc -lcrypt -lutil
libc=/lib/libc-2.2.5.so, so=so, useshrplib=true, libperl=libperl.so
gnulibc_version='2.2.5'
  Dynamic Linking:
dlsrc=dl_dlopen.xs, dlext=so, d_dlsymun=undef, ccdlflags='-rdynamic 
-Wl,-rpath,/usr/lib/perl5/5.8.0/i386-linux-thread-multi/CORE'
cccdlflags='-fpic', lddlflags='-shared -L/usr/local/lib'


Characteristics of this binary (from libperl): 
  Compile-time options: MULTIPLICITY USE_ITHREADS USE_LARGE_FILES PERL_IMPLICIT_CONTEXT
  Built under linux
  Compiled at Sep  6 2002 23:24:44
  %ENV:

PERLLIB=/home/category/lib/perl5/site_perl/5.8.0/i386-linux-thread-multi:/home/category/lib/perl5/site_perl/5.8.0
PERL_LWP_USE_HTTP_10=1
  @INC:

Re: Yahoo is moving to PHP ??

2002-10-30 Thread Bill Moseley
At 02:50 PM 10/30/02 -0500, Perrin Harkins wrote:
Mithun Bhattacharya wrote:

No it is not being removed but this could have been a very big thing
for mod_perl. Can someone find out more details as to why PHP was
preferred over mod_perl it cant be just on a whim.


Think about what they are using it for.  Yahoo is the most extreme 
example of a performance-driven situation.

I also wonder if it's cheaper/easier to hire and train PHP programmers than
Perl programmers.


-- 
Bill Moseley
mailto:moseley;hank.org



RE: [OTish] Version Control?

2002-10-30 Thread Bill Moseley
At 04:47 PM 10/30/02 -0500, Jesse Erlbaum wrote:
Web development projects can map very nicely into CVS.  We have a very
mature layout for all web projects.  In a nutshell, it boils down to this:

   project/
 + apache/
 + bin/

That requires binary compatibility, though.  I have a similar setup, but
the perl and Apache are built separately on the target machine since my
machines are Linux and the production machine is Solaris.

I only work on single servers, so things are a bit easier.  I always cvs co
to a new directory on the production machine and start up a second set of
servers on high ports.  That lets me (and the client) test on the final
platform before going live.  Then it's an apache stop && mv live old && mv
new live && apache start kind of thing, which is a fast way to update.

I'd love to have the Perl modules in cvs, though.  Especially mod_perl
modules.  It makes me nervous upgrading mod_perl on the live machine's perl
library.  Should make more use of PREFIX, I suppose.

Speaking of cvs, here's a thread branch:

I have some client admin features that they update via web forms -- some
small amount of content, templates, and text-based config settings.  I
currently log a history of changes, but it doesn't have all the features of
cvs.

Is anyone using cvs to manage updates made with web-based forms?



-- 
Bill Moseley
mailto:moseley;hank.org



Re: [OTish] Version Control?

2002-10-30 Thread Bill Moseley
At 03:21 PM 10/30/02 -0800, [EMAIL PROTECTED] wrote:
We check in all of our perl modules into CVS and it's a
_MAJOR_ life saver. Keeps everyone on the same path, so to
speak.

I think I confused two different things: perl module source vs. installed
modules.  Do you check in the source or the installed modules?

I keep the source of my perl modules under cvs, but not the perl library
i.e. the files generated from make install, which might include binary
components.

I use a PREFIX for my own modules, but I tend to install CPAN modules in
the main perl library.  My own modules get installed in the application
directory tree so that there's still a top level directory for the entire
application/site.

It does worry me that I'll update a CPAN module (or Apache::*) in the main
Perl library and break something some day.  (Although on things like
updating mod_perl I have copied /usr/local/lib/perl5 before make install.)


-- 
Bill Moseley
mailto:moseley;hank.org



Re: libapreq-1.0 Seg Faults

2002-09-06 Thread Bill

Sorry, this bounced from my Mac.com acct :P


On Friday, September 6, 2002, at 12:50 PM, William C (Bill) Jones wrote:

 This is a USELARGEFILES support issue.

 On Friday, September 6, 2002, at 12:16 PM, ODELL, TODD E (SWBT) wrote:

 ...
 Apache::Request it gives a 'segmentation fault (11)' in the error_log.


 Here, in Perl, it is defined:

 useperlio=undef d_sfio=undef uselargefiles=define usesocks=undef

 And here:

 Characteristics of this binary (from libperl):
   Compile-time options: USE_LARGE_FILES


 To avoid this issue please try the following:

 Leave PERL alone!

 Rebuild mod_perl -- for example:

 perl Makefile.PL \
 USE_APXS=1 \
 WITH_APXS=/usr/local/apache/bin/apxs \
 EVERYTHING=1 \
 USE_DSO=1

 #  build mod_php -
 ./configure --with-apxs=/usr/local/apache/bin/apxs \
   --enable-force-cgi-redirect \
   --enable-discard-path \
   --with-pear \
   --enable-safe-mode \
   --with-openssl \
   --enable-bcmath \
   --with-bz2 \
   --with-gdbm \
   --with-ndbm \
   --enable-dio \
   --enable-ftp \
   --with-ldap \
   --with-mysql=/usr/local/ \
   --with-pgsql \
   --enable-memory-limit

 # You may wish to remove those options which you might not already have
 installed - like maybe pgsql, openssl, or ldap...

 Please correct for your filesys layout.

 Also, you may wish to see http://www.apachetoolbox.com/

 HTH/Sx :]





mod_perl-based registration programs?

2002-06-13 Thread Bill Moseley

Before I start rewriting...

Anyone know of a mod_perl based program for registering people for events?

The existing system allows people to sign up and cancel for classes and
workshops that are offered at various locations and also for on-line
classes.  We have a collection of training workshops that are each offered
a number of times a year, and are taught by a pool of instructors.
Typically a few classes a week.

It mails reminders a few days before the classes, sends class lists to the
assigned instructor before their class, and normal database stuff for
displaying, searching and reporting. Currently, billing is by invoice, but
we would like an on-line payment option.

Anyone know of something similar?

Thanks,

Bill Moseley
mailto:[EMAIL PROTECTED]



FreeBSD Apache/mod_perl/OpenSRS/expat problem + solution

2002-06-11 Thread Bill O'Hanlon


(Apologies if you see this twice -- I sent it from an unsubscribed
email address first.)


Hi folks,

I just ran down a problem that was somewhat hard to find, and I didn't see any
mention of anything like it in the archives anywhere.  I thought it might be
helpful to mention the details in case someone else is ever in the same
situation.

I'm running FreeBSD 4.5, with perl 5.6.1 and Apache 1.3.24.  I had a working
installation of the regular OpenSRS perl code via cgi-bin, but I thought I'd
get it running under Apache::Registry in mod_perl.  To my surprise, the Apache
daemons would dump core whenever I tried to log in with manage.cgi.

It turns out that the current FreeBSD port of Apache uses its own internal
version of expat, which is an XML library of some kind.  This internal
version doesn't connect up well with the version that XML::Parser is expecting
to find.  Turning this off in the Apache build fixed the problem, and the
OpenSRS code runs very nicely under mod_perl now.  At this point, I don't
understand what functionality I've lost by not having the expat code built into
the Apache binary.

The configure option to leave out expat is --disable-rule=EXPAT.  In the
FreeBSD port, that's easily added to the CONFIGURE_ARGS variable in the
Makefile.

I don't know if this applies to any other platform.  My guess is that it could,
since I think the default for Apache is to use the internal version of expat.

Hope this helps someone!

--
Bill O'Hanlon   [EMAIL PROTECTED]
Professional Network Services, Inc. 612-379-3958
http://www.pro-ns.net




Re: Logging under CGI

2002-06-10 Thread Bill Moseley

At 10:30 PM 06/10/02 -0400, Sam Tregar wrote:
On Tue, 11 Jun 2002, Sergey Rusakov wrote:

 open(ERRORLOG, '/var/log/my_log');
 print ERRORLOG "some text\n";
 close ERRORLOG;

 This bit of code runs in every apache child.
 I worry about concurrent access to this log file under heavy apache
 load. Are there any problems on my way?

You are correct to worry.  You should use flock() to prevent your log file
from becoming corrupted.  See perldoc -f flock for more details.

Maybe it's a matter of volume.  Or size of string written to the log.  But
I don't flock, and I keep the log file open between requests and only
reopen if stat() shows that the file was renamed.  So far I've been lucky.
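For reference, the flock() approach Sam recommends looks like this. The open-per-call pattern is a deliberate simplification; filename handling via File::Temp is just to keep the sketch self-contained:

```perl
use strict;
use Fcntl qw(:flock);
use File::Temp qw(tempfile);

# Stand-in for /var/log/my_log so the sketch runs anywhere:
my (undef, $log) = tempfile();

sub log_line {
    my ($msg) = @_;
    open my $out, '>>', $log or die "open $log: $!";
    flock $out, LOCK_EX or die "flock: $!";   # serialize concurrent writers
    print {$out} "$msg\n";
    close $out;                               # close releases the lock
}

log_line("request served");
log_line("another request");

open my $in, '<', $log or die "read $log: $!";
my @lines = <$in>;
print scalar(@lines), "\n";   # 2
```

On most Unixes, small appends to a handle opened in append mode are atomic anyway, which is why the no-flock approach in the reply often gets away with it; flock makes the guarantee explicit.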


-- 
Bill Moseley
mailto:[EMAIL PROTECTED]



FreeBSD Apache/mod_perl/OpenSRS/expat problem + solution

2002-06-09 Thread Bill O'Hanlon


Hi folks,

I just ran down a problem that was somewhat hard to find, and I didn't see any
mention of anything like it in the archives anywhere.  I thought it might be
helpful to mention the details in case someone else is ever in the same
situation. 

I'm running FreeBSD 4.5, with perl 5.6.1 and Apache 1.3.24.  I had a working
installation of the regular OpenSRS perl code via cgi-bin, but I thought I'd
get it running under Apache::Registry in mod_perl.  To my surprise, the Apache
daemons would dump core whenever I tried to log in with manage.cgi.

It turns out that the current FreeBSD port of Apache uses its own internal
version of expat, which is an XML library of some kind.  This internal
version doesn't connect up well with the version that XML::Parser is expecting
to find.  Turning this off in the Apache build fixed the problem, and the
OpenSRS code runs very nicely under mod_perl now.  At this point, I don't
understand what functionality I've lost by not having the expat code built into
the Apache binary.

The configure option to leave out expat is --disable-rule=EXPAT.  In the
FreeBSD port, that's easily added to the CONFIGURE_ARGS variable in the
Makefile.

I don't know if this applies to any other platform.  My guess is that it could,
since I think the default for Apache is to use the internal version of expat.

Hope this helps someone!

-Bill

--
Bill O'Hanlon   [EMAIL PROTECTED]
Professional Network Services, Inc. 612-379-3958
http://www.pro-ns.net



Re: FreeBSD Apache/mod_perl/OpenSRS/expat problem + solution

2002-06-09 Thread Bill O'Hanlon

On Sun, Jun 09, 2002 at 12:43:38PM -0400, Perrin Harkins wrote:
  I just ran down a problem that was somewhat hard to find, and I didn't
 see any
  mention of anything like it in the archives anywhere.
 
 The expat issue has been discussed quite a bit on this list, and is
 documented here:
 http://perl.apache.org/guide/troubleshooting.html#Segfaults_when_using_X
 ML_Parser
 
 Sorry to hear you had trouble finding it.  That section of the guide is
 the first place you should look when you're having segfault problems.
 
 - Perrin
 


Eeek!  Thanks for the pointer to the warnings and errors guide.

I guess when I first tried to find out what was going on by STFW, I thought it
was something OpenSRS specific.  Later, when I'd found something that seemed to
work, I didn't go back and search for expat or XML_Parse.

It's reassuring to see that the fix that I'd guessed at is the right one to
use, though.

Thanks again,

-Bill


--
Bill O'Hanlon   [EMAIL PROTECTED]
Professional Network Services, Inc. 612-379-3958
http://www.pro-ns.net



RE: [OT] MVC soup (was: separating C from V in MVC)

2002-06-08 Thread Bill Moseley

At 12:13 PM 06/08/02 +0100, Jeff wrote:
The responsibility of the Controller is to take all the supplied user
input, translate it into the correct format, and pass it to the Model,
and watch what happens. The Model will decide if the instruction can be
realised, or if the system should explode.

I'd like to ask a bit more specific question about this.  Really two
questions.  One about abstracting input, and, a bit mundane, building links
from data set in the model.

I've gone full circle on handling user input.  I used to try to abstract
CGI input data into some type of request object that was then passed onto
the models.  But then the code to create the request object ended up
needing to know too much about the model.

For example, say for a database query the controller can see that there's a
query parameter and thus knows to pass the request to the code that knows
how to query the database.  That code passes back a results object which
then the controller can look at to decide if it should display the results,
a no results page and/or the query form again.

Now, what happens is that features are added to the query code.  Let's say
we get a brilliant idea that search results should be shown a page at a
time (or did Amazon patent that?).  So now we want to pass in the query,
starting result, and the page size.

What I didn't like about this is I then had to adjust the so-called
controller code that decoded the user input for my request object to
include these new features.  But really that data was of only interest to
the model.  So a change in the model forced a change in the controller.

So now I just have been passing in an object which has a param() method
(which, lately I've been using a CGI object instead of an Apache::Request)
so the model can have full access to all the user input.  It bugs me a bit
because it feels like the model now has intimate access to the user input.

And for things like cron I just emulate the CGI environment.

So my question is: Is that a reasonable approach?

My second, reasonably unrelated question is this: I often need to make
links back to a page, such as a link for page next.  I like to build
links in the view, keeping the HTML out of the model if possible.  But for
something like a page next link that might contain a bunch of parameters
it would seem best to build href in the model that knows about all those
parameters.

Anyone have a good way of dealing with this?
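One way to split the difference: the model exposes its paging state as plain data, and a small view-side helper turns it into the href, so the HTML-ish concern stays in the view. Parameter names here are invented, and a real version would URI-escape the values:

```perl
use strict;

# View helper: build a "next page" link from query state the model exposes.
# (Hypothetical names; real code should URI-escape each value.)
sub next_page_link {
    my ($base, %state) = @_;
    my $start = $state{start} + $state{page_size};
    return "$base?query=$state{query};start=$start;size=$state{page_size}";
}

print next_page_link('/search', query => 'mod_perl', start => 0, page_size => 10), "\n";
# -> /search?query=mod_perl;start=10;size=10
```

The model only has to know "where am I, how big is a page"; the view decides how that becomes a URL.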

Thanks,

P.S. and thanks for the discussion so far.  It's been very interesting.


-- 
Bill Moseley
mailto:[EMAIL PROTECTED]



[OT] MVC soup (was: separating C from V in MVC)

2002-06-06 Thread Bill Moseley

I, like many, find these discussion really interesting.  I always wish
there was some write up for the mod_perl site when all was said and done.
But I guess one of the reasons it's so interesting is that there's more
than one correct point of view.

My MVC efforts often fall apart in the C an M separation.  My M parts end
up knowing too much about each other -- typically because of error
conditions e.g. data that's passed to an M that does not validate.  And I
don't want to validate too much data in the C as the C ends up doing M's work.

Anyone have links to examples of MVC Perl code (mostly controller code)
that does a good job of M and C separation, and good ways to propagate
errors back to the C?  


-- 
Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Scope of Perl Special Variables

2002-05-05 Thread Bill Catlan

Stas Bekman wrote:

 This explains why by default %ENV is set for each request afresh.
 http://perl.apache.org/guide/performance.html#PerlSetupEnv_Off

Great.  Thank you, Stas.  Now I know /how/ that happens, but I don't know /why/
the existing instances' %ENV is not clobbered.

My guess is that a localized copy of the %ENV variable is created by the
above-referenced process, thus no clobbering of existing instances' %ENV occurs.
Would that be correct?

-Bill




Re: Scope of Perl Special Variables

2002-05-05 Thread Bill Catlan

 I thought that using 'local' would successfully scope those globals to
 within a sub, so you could, for example, slurp an entire file by doing:

 local $/ = undef;
 my $file = <FH>;

 Or am I wrong in that?  I use it frequently, and don't seem to have any
 troubles.

 --Jon R.

It is my understanding that that is correct.  I am a novice at mod_perl, but
your experience would seem to match up with my understanding of the Guide.

Local would scope it to within the enclosing block; so you could scope a
variable within a bare block so that it would be local to you package, but
shareable between subs.

# $/ equals default, global value
{
local $/ = undef;

sub { ... # $/ equals undef }
sub { ... # $/ equals undef }
sub { local $/ = "\n\n"; # localized value for sub }

# $/ back to undef
}
# $/ back to default, global value
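A runnable version of the slurp idiom from the quoted reply, showing that the localized $/ is restored once the sub returns (the temp file is only there to make the example self-contained):

```perl
use strict;
use File::Temp qw(tempfile);

# Make a small file to slurp:
my ($fh, $file) = tempfile();
print {$fh} "line one\nline two\n";
close $fh;

sub slurp {
    my ($path) = @_;
    local $/;                  # undef for the duration of this sub only
    open my $in, '<', $path or die "open $path: $!";
    return <$in>;              # one "line" == the whole file
}

my $all = slurp($file);
print scalar(my @lines = split /\n/, $all), "\n";   # 2
# Back here, $/ has its default "\n" value again.
```

Because local is dynamically scoped, any code called from slurp() would also see $/ as undef, which is exactly the mod_perl concern the thread started from.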

-Bill




Throttling, once again

2002-04-18 Thread Bill Moseley

Hi,

Wasn't there just a thread on throttling a few weeks ago?

I had a machine hit hard yesterday with a spider that ignored robots.txt.  

Load average was over 90 on a dual CPU Enterprise 3500 running Solaris 2.6.
 It's a mod_perl server, but has a few CGI scripts that it handles, and the
spider was hitting one of the CGI scripts over and over.  They were valid
requests, but coming in faster than they were going out.

Under normal usage the CGI scripts are only accessed a few times a day, so
it's not much of a problem have them served by mod_perl.  And under normal
peak loads RAM is not a problem.  

The machine also has bandwidth limitation (packet shaper is used to share
the bandwidth).  That combined with the spider didn't help things.  Luckily
there's 4GB so even at a load average of 90 it wasn't really swapping much.
 (Well not when I caught it, anyway).  This spider was using the same IP
for all requests.

Anyway, I remember Randal's Stonehenge::Throttle discussed not too long
ago.  That seems to address this kind of problem.  Is there anything else
to look into?  Since the front-end is mod_perl, it means I can use a
mod_perl throttling solution, too, which is cool.

I realize there's some fundamental hardware issues to solve, but if I can
just keep the spiders from flooding the machine then the machine is getting
by ok.
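The core of a spider throttle is a per-IP sliding window, which can be sketched independently of mod_perl. In a real handler the hash would be a shared or file-backed cache and a blocked request would get FORBIDDEN; all names here are illustrative:

```perl
use strict;

my %hits;                        # ip -> arrayref of recent request times
my ($max, $window) = (5, 60);    # at most 5 requests per 60 seconds per IP

sub allowed {
    my ($ip, $now) = @_;
    my $h = $hits{$ip} ||= [];
    @$h = grep { $_ > $now - $window } @$h;   # drop timestamps outside window
    return 0 if @$h >= $max;                  # over quota: throttle
    push @$h, $now;
    return 1;
}

# Eight rapid-fire requests from one IP: only the first five get through.
my $ok = 0;
$ok += allowed('10.0.0.1', 100 + $_) for 1 .. 8;
print "$ok\n";   # 5
```

This also answers the testing question in a small way: you can drive the function directly with synthetic timestamps before pointing ab at the real server.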

Also, does anyone have suggestions for testing once throttling is in place?
 I don't want to start cutting off the good customers, but I do want to get
an idea how it acts under load.  ab to the rescue, I suppose.

Thanks much,


-- 
Bill Moseley
mailto:[EMAIL PROTECTED]



Re: mod_perl and DB2

2002-04-12 Thread Bill McCabe

On 4/11/02 at 9:54 AM, [EMAIL PROTECTED] (David Shrewsbury) wrote:

 Hey gang. Couldn't find an answer to this in the archives. We have a
 DB2 database that we access via mod_perl scripts.  We have been
 getting errors in the Apache log files every morning whenever we
 first try to access the database. The error is:
 
 [Thu Apr 11 09:09:49 2002] null: DBD::DB2::db selectall_arrayref failed:
 [IBM][C
 LI Driver] CLI0108E  Communication link failure. SQLSTATE=40003 at
 /usr/local/ap
 achessl/perl/reports/trans_history.pl line 90
 
 It takes a restart of the web server to eliminate this problem.
 Is this the Morning Bug that I am encountering?  I looked at the
 DB2.pm module and it appears that the ping() method has been
 implemented so I would think that this would prevent the Morning
 Bug from showing up.  Should I reimplement the ping() method
 according to the Apache::DBI manpage?
 
 -David
 

I have many internal systems which use Apache::DBI and DBD::DB2. They're often
up for months without being restarted, and I don't use ping(). So, it is
possible. Are you the DBA too? A possibility is that the DB is being brought
down into offline mode during the night for backups. That'd kill your cached
connections.
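Independent of Apache::DBI's own ping() machinery, the generic defense against a stale overnight handle is "try, reconnect once on failure, retry". This sketch uses toy stand-ins for the DBI connect and the query so it is runnable anywhere; it is the pattern, not the Apache::DBI internals:

```perl
use strict;

# Run $work with a handle from $connect; if it dies (e.g. the cached
# connection was killed by a nightly backup), reconnect once and retry.
sub with_retry {
    my ($connect, $work) = @_;
    my $dbh = $connect->();
    my $result = eval { $work->($dbh) };
    if ($@) {
        $dbh = $connect->();     # assume the handle went stale
        $result = $work->($dbh);
    }
    return $result;
}

# Toy stand-ins: the first "connection" is dead, the second works.
my $attempts = 0;
my $connect = sub { return +{ alive => ++$attempts > 1 } };
my $work    = sub { my ($h) = @_; die "gone\n" unless $h->{alive}; "rows" };

print with_retry($connect, $work), "\n";   # rows
```

With real DBI, $connect would be a DBI->connect closure and $work would run the query; Apache::DBI's cached connections make the reconnect cheap when it does happen.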

I'll send a fuller answer when I have a chance to reexamine our set up.

Bill



RE: mod_perl Cook Book

2002-04-06 Thread Bill Moseley

At 09:59 AM 04/06/02 +0100, Phil Dobbin wrote:

It's definitely the book to buy _before_ the Eagle book.

No, buy both at the same time.  I think the Eagle gives a really good
foundation, and it's very enjoyable reading (regardless of what my wife
says!).

I still think the Eagle book is one of the best books on my bookshelf.  I
have a couple of Apache-specific books and I learned a lot more about
Apache from the Eagle than those.  The cook book has been a great addition.


-- 
Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Apache::VMonitor not showing requests (might be Apache::Scoreboard problem)

2002-04-04 Thread Bill Marrs

Maybe adding this to your httpd.conf will help:

ExtendedStatus On

?




RE: PDF generation

2002-04-04 Thread Bill McCabe

On 4/4/02 at 1:07 PM, [EMAIL PROTECTED] (Wilson, Allen) wrote:

 In reference to PDF::Create...
 
 Has anyone found any good documentation behind the module...
 
 I would like to print the results of a query to PDF and I'm not exactly
 sure whether I can use an array or concatenate the results into a
 string.
 
 Allen
 

AFAIK, the only docs are what comes with the distribution. For tabular reports,
I retrieve all my results as arrays, which get fed to a function that calculates
x,y pixel offsets and prints each item at its position. If you want to make a
string out of each row, you'll have to use a fixed-width font and calculate the
spaces needed to pad each column, and you'll still need to track the vertical
pixel position of each row, so that route didn't seem much less work to me.
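The offset calculation described above can be sketched without any PDF module at all; the column positions and page metrics here are assumptions, and the commented PDF::Create call is only an approximation of its API:

```perl
use strict;
use warnings;

my @col_x = (40, 200, 360);    # left edge of each column, in points (assumed)
my $top_y = 750;               # baseline of the first row
my $row_h = 14;                # leading between rows

my @rows = ( [ 'Alice', '42', 'open'   ],
             [ 'Bob',   '17', 'closed' ] );

for my $i (0 .. $#rows) {
    my $y = $top_y - $i * $row_h;
    for my $j (0 .. $#{ $rows[$i] }) {
        # with PDF::Create this would be something like
        # $page->string($font, 10, $col_x[$j], $y, $rows[$i][$j]);
        printf "(%d,%d) %s\n", $col_x[$j], $y, $rows[$i][$j];
    }
}
```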

Anyway, as I pointed out in my initial question, though PDF::Create has worked
flawlessly for me for the last two years, it definitely seems to have been
orphaned by its creator. If you're just getting started with perl-PDF
generation, why not choose one of those recommended in this thread? My
inclination would be use Matt Sergeant's PDFlib which seems well designed, but I
have a (probably unjustifiable) knee-jerk reaction against having to license
products. So, I'll give PDF::API2 a whack.

Bill



PDF generation

2002-04-03 Thread Bill McCabe

Hi All

I have a large number of mod_perl modules that connect to various databases and
generate workflow performance reports for my organization. I give the users 3
output options: HTML, Excel (Spreadsheet::WriteExcel), and PDF. For PDF output
I've been using PDF::Create, which has been at version 0.01 since 1999. It has
worked flawlessly for my purposes for a couple of years, but is very limited. In
fine form-follows-function fashion, the end users would now like the PDF output
gussied up with graphics, etc. Does anyone have any strong (positive or
negative) recommendations for which module(s) I should migrate to?


TIA,
Bill



Re: 'Pinning' the root apache process in memory with mlockall

2002-03-23 Thread Bill Marrs

At 10:53 PM 3/22/2002, Stas Bekman wrote:
top and libgtop use the same source of information, so it has nothing to 
do with these tools.

'top' has the ability to display SWAP on a per-process basis (you have to 
change the defaults to see it, but it's there).

I didn't find this per-process SWAP value in Gtop.pm anywhere.

If GTop.pm had it, I could fix GTopLimit's bug.

-bill




Re: 'Pinning' the root apache process in memory with mlockall

2002-03-22 Thread Bill Marrs

Stas,

Thanks for tracking that down.

So, the problem is our tools.  For me, that's GTopLimit (but also SizeLimit).

I would think it must be possible to cajole these two into realizing their 
error.  top seems to know how much a process has swapped.  If GTopLimit 
could know that, the number could be subtracted from the total used in 
calculating the amount of sharing (and the new unshared), then this bug 
would be resolved, right?

I looked, but didn't see anything in Gtop.pm that gives swap per process, 
though.  So, it's not going to be easy.

I guess I'll turn off my deswapper...

...and GTopLimit as well.  For now...

hmm,  maybe I could just avoid using the share-related trigger values in 
GTopLimit, and just use the SIZE one.  That would be an acceptable 
compromise, though not the best.

-bill




Re: Non-web use for Apache/mod_perl

2002-03-21 Thread Bill McCabe

Over the last year I've been slowly working on a similar system in my spare time
(of which I have none). To do systems monitoring and reporting I'm using
mod_perl on the front end and communicating with remote systems via XML::RPC.
The XML::RPC server on the remote system runs local commands via perl wrappers.
These wrappers are now returning raw command output, but my next step is to have
them convert it first to XML and then return it, where it can be stored in a DB
(I use DB2) and/or processed by AxKit. I decided to go this way so I could
delegate the customization of the local wrappers to other admins, because our
site uses so many different platforms (AIX, Sun, linux, BSD, all stripes of
Windows, OS/2, AS/400, OS/390, MacOS 9, MacOS X, etc etc). So far I've only been
monitoring and reporting disk usage, just to get up and running.



Bill



RE: loss of shared memory in parent httpd

2002-03-16 Thread Bill Marrs


The reason turning off swap works is because it forces the memory from
the parent process that was swapped out to be swapped back in.  It will
not fix those processes that have been sired after the shared memory
loss, as of Linux 2.2.15 and Solaris 2.6.  (I have not checked since
then for behavior in this regard, nor have I checked on other OSes.)

In my case, I'm using Linux 2.4.17, when I turn off swap and turn it back 
on again, it restores the shared memory of both the parent and the children 
Apache processes.

This seems counter-intuitive, as it would seem the kernel memory manager 
would have to bend over backwards to accomplish this re-binding of the 
swapped-out shared memory pages.

Thus, it leads one to wonder if some of our assumptions or tools used to 
monitor memory are inaccurate or we're misinterpreting them.

-bill




Re: Creating a proxy using mod_perl

2002-03-15 Thread Bill Moseley

At 05:11 PM 3/15/2002 +0300, Igor Sysoev wrote:
On Fri, 15 Mar 2002, Marius Kjeldahl wrote:

 I guess these all suffer from the fact that the parameters have to be 
 specified in httpd.conf, which makes it impossible to pass a url to 
 fetch from in a parameter, right?

So mod_rewrite with mod_proxy or mod_accel:

RewriteRule   /proxy_url=http://(.+)$   http://$1   [L,P]

Note that 'proxy?url=' is changed to 'proxy_url='.

Any concern about being an open proxy there?  I'd want to only proxy the
sites I'm working with.  

I'd rather cache the images locally, just in case you are working with a
slow site or if they do something silly like check referer on requests.
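One way to close that open-proxy hole is to gate the rewrite on a whitelist of hosts. A sketch, with hypothetical host names standing in for the sites actually being worked with:

```apache
# Only proxy for an approved list of hosts (the example.com names are
# placeholders), not for anything the client asks for.
RewriteEngine On
RewriteCond %{REQUEST_URI} ^/proxy_url=http://(www|images)\.example\.com/
RewriteRule /proxy_url=http://(.+)$ http://$1 [L,P]
```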



Bill Moseley
mailto:[EMAIL PROTECTED]



Re: [ANNOUNCE] The New mod_perl logo - results now in...

2002-03-15 Thread Bill Moseley

At 04:33 PM 03/15/02 -0500, Georgy Vladimirov wrote:
I actually like the logo without the underscore. I don't think an
underscore is very collaborative with art. The _ has always been
irritating me a little.

I know that there is history and nostalgia involved here but dropping
an underscore at least in the logo is a nice evolution IMHO.

I also agree with this, and it's one of the reasons (I think) I voted for
that design.  It's a graphic design so I don't see that it needs to follow
the Apache module naming convention exactly.  Nor perl identifier names,
either.  Many of the designs offered didn't use the underscore as well.
And the design that won didn't use one.  It's a design -- it doesn't have
to be accurate to the name.

Besides, if it changes does it mean that the winning design received no
votes? ;)


-- 
Bill Moseley
mailto:[EMAIL PROTECTED]



RE: loss of shared memory in parent httpd

2002-03-14 Thread Bill Marrs


It's copy-on-write.  The swap is a write-to-disk.
There's no such thing as sharing memory between one process on disk(/swap)
and another in memory.

agreed.   What's interesting is that if I turn swap off and back on again, 
the sharing is restored!  So, now I'm tempted to run a crontab every 30 
minutes that  turns the swap off and on again, just to keep the httpds 
shared.  No Apache restart required!

Seems like a crazy thing to do, though.

You'll also want to look into tuning your paging algorithm.

Yeah... I'll look into it.  If I had a way to tell the kernel to never swap 
out any httpd process, that would be a great solution.  The kernel is 
making a bad choice here.  By swapping, it triggers more memory usage 
because sharing is removed on the httpd process group (and thus multiplied)...

I've got MaxClients down to 8 now and it's still happening.  I think my 
best course of action may be a crontab swap flusher.

-bill




Re: Apache and Perl with Virtual Host

2002-03-14 Thread Bill Marrs

At 04:02 AM 3/14/2002, Matt Phelps wrote:
Forgive me if I'm posting to the wrong group. I've got apache 1.3.22 
running several virtual webs. I can get perl scripts to run under the 
default web but not in the others. All the webs point to the same script 
folder. If I try to run the script under a virtual web, all I get is text 
display. Any help would be great.

Well, I use mod_perl with VirtualHosts...  My config looks something like:

<VirtualHost gametz.com>
ServerAdmin [EMAIL PROTECTED]
DocumentRoot /home/tz/html
ServerName gametz.com
DirectoryIndex /perl/gametz.pl
# The live area
Alias /perl/ /home/tz/perl/
<Location /perl>
   AllowOverride None
   SetHandler perl-script
   PerlHandler Apache::RegistryBB
   PerlSendHeader On
   Options +ExecCGI
</Location>
</VirtualHost>

<VirtualHost surveycentral.org>
ServerAdmin [EMAIL PROTECTED]
DocumentRoot /projects/web/survey-central
ServerName surveycentral.org
DirectoryIndex /perl/survey.pl

Alias /perl/ /projects/web/survey-central/perl/
<Location /perl>
   SetHandler perl-script
   PerlHandler Apache::RegistryBB
   PerlSendHeader On
   Options +ExecCGI
</Location>
</VirtualHost>





Re: [OT]RE: loss of shared memory in parent httpd

2002-03-14 Thread Bill Marrs


How is it even remotely possible that turning off swap restores memory
shared between processes? Is the Linux kernel going from process to process
comparing pages of memory as they re-enter RAM? Oh, those two look
identical, they'll get shared?

This is a good point.  I really have no clue how the kernel deals with 
swapping/sharing, so I can only speculate.  I could imagine that it's 
possible for it to do this, if the pages are marked properly, they could be 
restored.  But, I'll admit, it seems unlikely.

...and, I had this thought before.  Maybe this apparent loss of shared 
memory is an illusion.  It appears to make the amount of memory that the 
httpds use grow very high, but perhaps it is a kind of shared-swap, and 
thus the calculation I'm using to determine overall memory usage would need 
to also factor out swap.  ...in which case, there's no problem at all.

But, I do see an albeit qualitative performance increase and CPU load 
lowering when I get the httpds to stay shared (and unswapped).  So, I think 
it does matter.

Though, if you think about it, it sort of makes sense.  Some portion of the 
shared part of the httpd is also not being used much, so it gets swapped 
out to disk.  But, if those pages really aren't being used, then there 
shouldn't be a performance hit.  If they are being used, then they'd get 
swapped back in.

...which sort of disproves my qualitative reasoning that swap/unshared is bad.

my head hurts, maybe I should join a kernel mailing list and see if someone 
there can help me (and if I can understand them).

-bill






Re: loss of shared memory in parent httpd

2002-03-13 Thread Bill Marrs

I just wanted to mention that the theory that my loss of shared memory in 
the parent is related to swapping seems to be correct.

When the lack of sharing occurs, it is correlated with my httpd processes 
showing a SWAP (from top/ps) of 7.5MB, which is roughly equal to the amount 
of sharing that I lose.

I've been lowering my MaxClients setting (from 25 to 10, so far) in hopes 
of finding a new balance where SWAP is not used, and more RAM is on order.

Thanks
-bill




loss of shared memory in parent httpd

2002-03-12 Thread Bill Marrs

I'm a heavy mod_perl user, running 3 sites as virtual servers, all with 
lots of custom Perl code.  My httpd's are huge (~50MB), but with the help of 
a startup file I'm able to get them sharing most of their 
memory (~43MB).  With the help of GTopLimit, I'm able to keep the memory usage 
under control.

But... recently, something happened, and things have changed.  After some 
random amount of time (1 to 40 minutes or so, under load), the parent httpd 
suddenly loses about 7-10MB of sharing between it and any new child it 
spawns.  As you can imagine, the memory footprint of my httpds skyrockets 
and the delicate balance I set up is disturbed.  Also, GTopLimit is no help 
in this case - it actually causes flailing because each new child starts 
with memory sharing that is out of bounds and is thus killed very quickly.

Restarting Apache resets the memory usage and restores the server to smooth 
operation.  Until, it happens again.

Using GTop() to get the shared memory of each child before and after 
running my perl for each page load showed that it wasn't my code causing 
the jump, but suddenly the child, after having a good amount of shared 
memory in use, loses a 10MB chunk and from then on the other children 
follow suit.

So, something I did on the server (I'm always doing stuff!) has caused this 
change to happen, but I've been pulling my hair out for days trying to 
track it down.  I am now getting desperate.  One of the recent things I did 
was to run tux (another web server) to serve my images, but I don't see 
how that could have any effect on this.

If anyone has any ideas what might cause the httpd parent (and new 
children) to lose a big chunk of shared memory between them, please let me 
know.

Thanks in advance,

-bill




Re: loss of shared memory in parent httpd

2002-03-12 Thread Bill Marrs

Thanks for all the great advice.

A number of you indicated that it's likely due to my apache processes being 
partially swapped to disk.  That seems likely to me.  I haven't had a 
chance to prove that point, but when it does it again and I'm around, I 
plan to test it with free/top (top has a SWAP column which should show if 
my apaches are swapped out at all).

I am in the process of getting a memory upgrade, so that should ease this 
problem.  Meanwhile, I can set MaxClients lower and see if that keeps me 
out of trouble as well.

I suspect adding the tux server disrupted the balance I had (apparently, I 
was tuned pretty close to my memory limits!)

Yes, I am running on Linux...

One more piece of advice: I find it easier to tune memory control with a 
single parameter.  Setting up a maximum size and a minimum shared size is 
not as effective as setting up a maximum *UNSHARED* size.  After all, it's 
the amount of real memory being used by each child that you care about, 
right?  Apache::SizeLimit has this now, and it would be easy to add to 
GTopLimit (it's just $SIZE - $SHARED).  Doing it this way helps avoid 
unnecessary process turnover.

I agree.  For me, with my ever more bloated Perl code, I find this unshared 
number to be easier to keep a lid on.  I keep my apache children under 10MB 
each unshared, as you say.  That number is more stable than the 
SIZE/SHARED numbers that GTopLimit offers.  But, I have the GTopLimit 
sources, so I plan to tweak them to allow for an unshared setting.  I think 
I bugged Stas about this a year ago and he had a reason why I was wrong to 
think this way, but I never understood it.

-bill




trouble with GTop and

2002-03-12 Thread Bill Marrs

When I install the recent Redhat 7.2 updates for glibc:

glibc-2.2.4-19.3.i386.rpm
glibc-common-2.2.4-19.3.i386.rpm
glibc-devel-2.2.4-19.3.i386.rpm

It breaks my Apache GTop-based Perl modules, in a way that I don't understand.

Here are the error messages from my httpd/error_log:

[Tue Mar 12 15:44:37 2002] [error] Can't load 
'/usr/lib/perl5/site_perl/5.6.0/i386-linux/auto/GTop/GTop.so' for module 
GTop: /usr/lib/libgdbm.so.2: undefined symbol: gdbm_errno at 
/usr/lib/perl5/5.6.1/i386-linux/DynaLoader.pm line 206.
  at /usr/lib/perl5/site_perl/5.6.0/i386-linux/GTop.pm line 12
Compilation failed in require at 
/usr/lib/perl5/site_perl/5.6.0/Apache/GTopLimit.pm line 144.
BEGIN failed--compilation aborted at 
/usr/lib/perl5/site_perl/5.6.0/Apache/GTopLimit.pm line 144.
Compilation failed in require at /home/httpd/startup.pl line 32.
BEGIN failed--compilation aborted at /home/httpd/startup.pl line 32.
Compilation failed in require at (eval 1) line 1.

Syntax error on line 1017 of /etc/httpd/conf/httpd.conf:
Can't load '/usr/lib/perl5/site_perl/5.6.0/i386-linux/auto/GTop/GTop.so' 
for module GTop: /usr/lib/libgdbm.so.2: undefined symbol: gdbm_errno at 
/usr/lib/perl5/5.6.1/i386-linux/DynaLoader.pm line 206.
  at /usr/lib/perl5/site_perl/5.6.0/i386-linux/GTop.pm line 12
Compilation failed in require at 
/usr/lib/perl5/site_perl/5.6.0/Apache/GTopLimit.pm line 144.
BEGIN failed--compilation aborted at 
/usr/lib/perl5/site_perl/5.6.0/Apache/GTopLimit.pm line 144.
Compilation failed in require at /home/httpd/startup.pl line 32.
BEGIN failed--compilation aborted at /home/httpd/startup.pl line 32.
Compilation failed in require at (eval 1) line 1.


Anyone have a clue about what I'd need to do to get this working?  I am 
able to force the old glibc rpms back on to fix the problem.  The previous 
versions that work for me are:

glibc-common-2.2.4-13
glibc-devel-2.2.4-13
glibc-2.2.4-13

-bill







[WOT] Google Programming Contest.

2002-02-07 Thread Bill Moseley

Sorry for the Way Off Topic, and sorry if I missed this on the list already:

http://www.google.com/programming-contest/

They say C++ or Java.  What, no Perl?


-- 
Bill Moseley
mailto:[EMAIL PROTECTED]



Re: New mod_perl Logo

2002-01-29 Thread Bill Moseley

At 07:29 PM 01/29/02 -0500, Chris Thompson wrote:
Well, I'd like to just throw one idea into the mix. It's something that's
bugged me for a long time, no better time than the present.

mod_perl is a lousy name.

I don't know about lousy, but I do agree.   I brought this up on the
docs-dev list:

  http://search.apache.org/archives/docs-dev/0236.html

During the week I posted that I had run into PHP programmers at a computer
show, more PHP programmers at a pub (2 in the afternoon -- more out of work
programmers), and ended up talking with a couple of Java programmers one
day.  The amazing thing was they all had a completely weird idea about what
mod_perl is or what it does.  And all thought it was slow, old, dead,
non-scalable technology.  And that was from programmers, not managers.  We
all know there is a lot of misinformation out there.
all know there is a lot of misinformation out there.

Marketing is not everything, but it's a lot!  What we know of mod_perl is
more than just perl+Apache, really.  It's a development platform, or
development suite.  It can be anything our marketing department says it is. ;)

In these tough economic times, repackaging might be helpful.  Who knows?

And for some of us we know that mod_perl is also something that makes up a
chunk of our livelihood.  So, the promotion of mod_perl is quite important,
unless we want to start spending more afternoons with those PHP programmers
down at the corner pub.

So how would a group like the mod_perl community promote itself in new
ways?  Well, other professionals often have professional organizations or
associations to represent and promote their members.  I wonder if there are
enough mod_perl programmers to support something like that.  Even if
there were, what could be done?  Run a few print ads in magazines that
system admins read?  Hire an ad firm for help in developing our brand?
mod_perl coffee mugs? (Tired of that old cup of Java?)  Free mod_perl
clinics?  Hard to imagine any of that actually happening, really.

So what's a group of programmers to do?

The new web site should help, to some degree, but I'm not sure it will
change any manager's mind on the technology they pick to run their
applications.

Of course, most people here have access to big pipes.  So, there's always
bulk mail ads.  I got mail just today saying that it's an effective way to
advertise.  In fact I got about ten of those today!


-- 
Bill Moseley
mailto:[EMAIL PROTECTED]



Re: META tags added as HTTP headers

2002-01-18 Thread Bill Moseley


At 01:20 AM 01/19/02 +0100, Markus Wichitill wrote:
which part of an Apache/mod_perl setup is responsible for extracting META
tags from generated HTML and adding them as HTTP headers (even with
PerlSendHeaders Off)?

That's lwp doing that, not Apache or mod_perl.

 HEAD http://www.apache.org
200 OK
Cache-Control: max-age=86400
Connection: close
Date: Sat, 19 Jan 2002 00:27:10 GMT
Accept-Ranges: bytes
Server: Apache/2.0.28 (Unix)
Content-Length: 7810
Content-Type: text/html
Expires: Sun, 20 Jan 2002 00:27:10 GMT
Client-Date: Sat, 19 Jan 2002 00:27:17 GMT
Client-Request-Num: 1
Client-Warning: LWP HTTP/1.1 support is experimental


-- 
Bill Moseley
mailto:[EMAIL PROTECTED]



Re: META tags added as HTTP headers

2002-01-18 Thread Bill Moseley

At 04:46 PM 01/18/02 -0800, ___cliff rayman___ wrote:
hmmm - you are still using lwp.

Right.  But lwp-request sends a GET request where HEAD sends, well, a HEAD
request.  So, even though LWP's default is to parse the head section,
there's no content to parse in a HEAD request, and thus the meta headers
don't show up.


-- 
Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Alarms?

2002-01-10 Thread Bill Moseley

At 06:56 PM 01/10/02 +0300, [EMAIL PROTECTED] wrote:
Hello!

I'm getting lots of errors in log:

[Thu Jan 10 18:54:33 2002] [notice] child pid 8532 exit signal Alarm clock
(14)

I hope I remember this correctly:

What's happening is you are setting a SIGALRM handler in perl, but perl is not 
correctly restoring Apache's handler when yours goes out of scope.

So when a non-mod_perl request times out, there's no handler and the process is killed.

Check:
http://thingy.kcilink.com/modperlguide/debug/Debugging_Signal_Handlers_SIG_.html

http://thingy.kcilink.com/modperlguide/debug/Handling_Server_Timeout_Cases_an.html
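The fix the guide describes boils down to scoping the handler with local() inside an eval, so the previous handler is restored when the block exits. A minimal sketch (the 1-second budget and the select()-as-work are illustrative):

```perl
use strict;
use warnings;

my $result = eval {
    local $SIG{ALRM} = sub { die "timeout\n" };   # restored when eval exits
    alarm 1;                                      # hypothetical 1-second budget
    select undef, undef, undef, 0.1;              # stand-in for the real work
    alarm 0;                                      # cancel before leaving scope
    'done';
};
alarm 0;    # belt and braces: never leave a pending alarm behind
print defined $result ? "ok: $result\n" : "failed: $@";
```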
-- 
Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Template-Toolkit performance tuning

2001-12-30 Thread Bill Moseley

At 05:17 PM 12/30/01 -0600, Ryan Thompson wrote:
   use Template;

   my %vars;

   $vars{foo} = "bar";   # About 30 scalars like this
   .
   .

   my $tt = new Template({INTERPOLATE => 1});

Cache your template object between requests.
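The caching pattern is just a file-scoped lexical plus ||=, so each child constructs the object once and reuses it on every subsequent request. A sketch with the (expensive) Template constructor mocked out by a hypothetical Renderer class, to show the object really is built only once:

```perl
use strict;
use warnings;

{
    package Renderer;                 # stand-in for Template (hypothetical)
    my $instances = 0;
    sub new       { $instances++; return bless {}, shift }
    sub instances { return $instances }
}

my $tt;                               # survives across calls, like a per-child global
sub handler {
    $tt ||= Renderer->new();          # constructed on the first request only
    return $tt;
}

handler() for 1 .. 3;                 # three "requests"
print "constructed: ", Renderer::instances(), " time(s)\n";
```

In a real handler, $tt would be `Template->new({INTERPOLATE => 1})` and live for the life of the child process.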


-- 
Bill Moseley
mailto:[EMAIL PROTECTED]



Searchable archives (was: [modperl site design challenge] and the winner is...)

2001-12-26 Thread Bill Moseley

At 02:13 PM 12/24/01 +0800, Stas Bekman wrote:
FWIW, we are having what seems to be a very productive discussion at 
docs-dev mailing list. Unfortunately no mail archiver seem to pick this 
list up, so only the mbox files are available:
http://perl.apache.org/mail/docs-dev/

Is anyone up to make the searchable archives available? We have a bunch 
of lists that aren't browsable/searchable :(
http://perl.apache.org/#maillists

Hi Stas,

Any reason to not use hypermail?  Do you have mbox files for all the lists
in question?

I could setup searchable archives like this example, if you like.

   http://search.apache.org/docs-dev/  (this URL is temporary!)



Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Can I use mod_perl to pass authentication details to apache from an HTML form?

2001-12-24 Thread Bill Moseley

At 08:49 AM 12/24/2001 -, Chris Thompson wrote:
I would like to set up a password-protected area within my website where my
web design clients can preview changes to their sites before they go live.
Although I know how to password protect directories using .htaccess files, I
would prefer to bypass the standard grey Authorization pop-up screen and
instead let users enter their username / password details through an HTML
form (which I think would look more professional).

Take a look at Apache::AuthCookie.

If possible, the system for authenticating / authorizing the user would also
redirect them to the appropriate directory for their site.

You can do that, or you can just have them go directly to their area, and
have the authentication system intercept the request.  This is what
Apache::AuthCookie does.  You might also look at Apache::AuthCookieURL if
there's a chance that your users might not have cookies enabled.



Bill Moseley
mailto:[EMAIL PROTECTED]



Re: [modperl site design challenge] and the winner is...

2001-12-19 Thread Bill Moseley


I'm throwing in my two cents a bit late, so it's a bit depreciated now (one
cent?).  But something to think about for the site.

I've worked with php a little lately -- not programming, but making minor
changes to a site.  I've used the php site http://www.php.net/ a few times,
and I've found it reasonably functional, but also quite easy for someone
new to php.  Maybe it seems that way because I know nothing about php and
it's geared toward my level.  But that's good.  How often to the mod_perl
pros need to read the mod_perl home page?

I'm sure all these elements will be added to the new mod_perl site in some
way, but I just wanted to note what I liked about the php site.  And I'm
not comparing mod_perl to php!

What the php site shows in a real obvious way is:

1) what is php (for someone that is brand new) with a link to some basic
examples.  It demystifies php in a hurry.  Makes someone think "Oh, I can
do that."

2) currently, it's showing Netcraft's usage stats, so I see that people are
using it in growing numbers -- it's not a dead-end for a new person to try
out.

3) it shows upcoming events.  That shows that there's a real support group
of real people to work with.  Links to discussion lists archives would be
good there.

All that makes it really easy for someone new to feel comfortable.

It would be nice to see license info, too, as someone new might want to be
clear on that right away, too.

You can also quickly see a list of supported modules.  This shows that it's
easy to extend, but also allows someone to see that it can do the thing
*they* might be interested in.  Sure, perl has CPAN, but I think it would
be good to show a list of commonly used modules for mod_perl, and what they
do, in a simple list.  If someone is just learning about mod_perl (or php)
the list doesn't need to be that big, as their needs will be reasonably basic.

Existing mod_perl (or php?) programmers might not like all that basic,
first-time user stuff right on the home page, and would rather have a more
functional site.  I don't know about anyone else, but I've got the links
I need bookmarked, and if not I go to perl.apache.org and ^F right to where
I want to go.

BTW -- At first I liked David's idea of using the ASF look.  That ties
mod_perl to apache well.  But, if the site is intended to bring in new
users, it might be good to be a bit more flashy.

<crazy idea>
Maybe as a community (of programmers not designers) we could hire a
professional designer to help develop our brand.  Cool web site.  Some
print ads in the trades.  What's a small amount in dues to the Association
of Mod_perl Programmers compared to increase of mod_perl work overall?
</crazy idea>


Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Comparison of different caching schemes

2001-12-18 Thread Bill Moseley
Ok, I'm a bit slow...

At 03:05 PM 12/12/01 +1100, Rob Mueller (fastmail) wrote: 
>>>>
Just thought people might be interested...

Seems like they were!  Thanks again.

I didn't see anyone comment on this, but I was a bit surprised by MySQL's good performance.  I suppose caching is key.  I wonder if things would change with 50 or 100 thousand rows in the table.

I always assumed something like Cache::FileCache would have less overhead than an RDBMS.  It's impressive.


>>>>
Now to the results, here they are.
Package C0 - In process hash
Sets per sec = 147116
Gets per sec = 81597
Mixes per sec = 124120
Package C1 - Storable freeze/thaw
Sets per sec = 2665
Gets per sec = 6653
Mixes per sec = 3880
Package C2 - Cache::Mmap
Sets per sec = 809
Gets per sec = 3235
Mixes per sec = 1261
Package C3 - Cache::FileCache
Sets per sec = 393
Gets per sec = 831
Mixes per sec = 401
Package C4 - DBI with freeze/thaw
Sets per sec = 651
Gets per sec = 1648
Mixes per sec = 816
Package C5 - DBI (use updates with dup) with freeze/thaw
Sets per sec = 657
Gets per sec = 1994
Mixes per sec = 944
Package C6 - MLDBM::Sync::SDBM_File
Sets per sec = 334
Gets per sec = 1279
Mixes per sec = 524
Package C7 - Cache::SharedMemoryCache
Sets per sec = 42
Gets per sec = 29
Mixes per sec = 32
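A miniature of the comparison quoted above can be run with nothing but core modules: a plain in-process hash versus Storable freeze/thaw round-trips (the iteration count and record shape are arbitrary, and the absolute numbers will of course differ by machine):

```perl
use strict;
use warnings;
use Storable  qw(freeze thaw);
use Benchmark qw(timethese);

my %cache;
my $record = { user => 'bill', hits => 42 };

timethese(20_000, {
    hash     => sub { $cache{k} = $record;         my $v = $cache{k} },
    storable => sub { $cache{s} = freeze($record); my $v = thaw($cache{s}) },
});
```

The gap between the two is the serialization cost that every out-of-process cache in the table pays on top of its storage overhead.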





Bill Moseley
mailto:[EMAIL PROTECTED] 

Re: [RFC] Apache::CacheContent - Caching PerlFixupHandler

2001-12-06 Thread Bill Moseley

At 08:19 AM 12/06/01 -0800, Paul Lindner wrote:

Ok, hit me over the head.  Why wouldn't you want to use a caching proxy?

BTW -- I think where the docs are cached should be configurable.  I don't
like the idea of the document root being writable by the web process.



Bill Moseley
mailto:[EMAIL PROTECTED]



Re: [RFC] Apache::CacheContent - Caching PerlFixupHandler

2001-12-06 Thread Bill Moseley

At 10:33 AM 12/06/01 -0800, Paul Lindner wrote:
On Thu, Dec 06, 2001 at 10:04:26AM -0800, Bill Moseley wrote:
 At 08:19 AM 12/06/01 -0800, Paul Lindner wrote:
 
 Ok, hit me over the head.  Why wouldn't you want to use a caching proxy?

Apache::CacheContent gives you more control over the caching process
and keeps the expiration headers from leaking to the browser.

Ok, I see.

Or maybe you want to dynamically control the TTL?

Would you still use it with a front-end lightweight server?  Even with
caching, a mod_perl server is still used to send the cached file (possibly
over 56K modem), right?



Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Hi

2001-12-04 Thread Bill Moseley

At 05:13 PM 12/04/01 -0500, Robert Landrum wrote:
If this guy is going to be sending us shit all night, I suggest we 
deactivate his account.

Now that would be fun!  Oh, you mean by unsubscribing him.  I was thinking
of something more sporting.  What's the collective bandwidth of the people
on this list?

Just kidding.




Bill Moseley
mailto:[EMAIL PROTECTED]



Re: [OT] log analyzing programs

2001-12-02 Thread Bill Moseley

At 10:09 AM 12/2/2001 +, Matt Sergeant wrote:

   PID USERNAME THR PRI NICE  SIZE   RES STATE    TIME    CPU COMMAND
 17223 operator   1  44    2  747M  745M cpu14   19.2H 45.24% wusage

Ouch. Try analog.

  PID USERNAME THR PRI NICE  SIZE   RES STATE    TIME    CPU COMMAND
17223 operator   1   0    2  747M  745M cpu14   27.1H 47.57% wusage

Well at least after another 8 hours of CPU it's not leaking ;)



Bill Moseley
mailto:[EMAIL PROTECTED]



[OT] log analyzing programs

2001-12-01 Thread Bill Moseley

Any suggestions for favorite ones?  wusage seems to require a lot of
resources -- maybe that's not unusual?  It runs once a week.  Here's
about six days' worth of requests.  Doesn't seem like that many.

%wc -l access_log
 1185619 access_log

  PID USERNAME THR PRI NICE  SIZE   RES STATE    TIME    CPU COMMAND
17223 operator   1  44    2  747M  745M cpu14   19.2H 45.24% wusage



Bill Moseley
mailto:[EMAIL PROTECTED]



Re: [OT] Re: search.cpan.org

2001-11-27 Thread Bill Moseley

At 12:55 PM 11/27/01 -0800, Nick Tonkin wrote:

Because it does a full text search of all the contents of the DB.

Perhaps, but it's just overloaded.

I'm sure he's working on it, but anyone want to offer Graham free hosting?
A few mirrors would be nice, too.

(Plus, all my CPAN.pm setups are now failing to work, too)



Bill Moseley
mailto:[EMAIL PROTECTED]



Re: [OT] Re: search.cpan.org

2001-11-27 Thread Bill Moseley

At 09:02 PM 11/27/01 +, Mark Maunder wrote:
I'm using it on our site and searching fulltext
indexes on three fields (including a large text field) in under 3 seconds
on over
70,000 records on a p550 with 490 megs of ram.


Hi Mark,

<plug>

Some day if you are bored, try indexing with swish-e (the development
version).
http://swish-e.org

The big problem with it right now is it doesn't do incremental indexing.
One of the developers is trying to get that working within a few weeks.
But for most small sets of files it's not an issue since indexing is so fast.

My favorite feature is it can run an external program, such as a perl mbox
or html parser or perl spider, or DBI program or whatever to get the source
to index.  Use it with Cache::Cache and mod_perl and it's nice and fast
from page to page of results.

Here's indexing only 24,000 files:

 ./swish-e -c u -i /usr/doc
Indexing Data Source: File-System
Indexing /usr/doc
270279 unique words indexed.
4 properties sorted.  
23840 files indexed.  177638538 total bytes.
Elapsed time: 00:03:50 CPU time: 00:03:16
Indexing done!

Here's searching:

 ./swish-e -w install -m 1
# SWISH format: 2.1-dev-24
# Search words: install
# Number of hits: 2202
# Search time: 0.006 seconds
# Run time: 0.011 seconds

A phrase:

 ./swish-e -w 'public license' -m 1
# SWISH format: 2.1-dev-24
# Search words: public license
# Number of hits: 348
# Search time: 0.007 seconds
# Run time: 0.012 seconds
998 /usr/doc/packages/ijb/gpl.html gpl.html 26002


A wild card and boolean search:

 ./swish-e -w 'sa* or java' -m 1
# SWISH format: 2.1-dev-24
# Search words: sa* or java
# Number of hits: 7476
# Search time: 0.082 seconds
# Run time: 0.087 seconds

Or a good number of results:

 ./swish-e -w 'is or und or run' -m 1
# SWISH format: 2.1-dev-24
# Search words: is or und or run
# Number of hits: 14477
# Search time: 0.084 seconds
# Run time: 0.089 seconds

Or everything:

 ./swish-e -w 'not dksksks' -m 1
# SWISH format: 2.1-dev-24
# Search words: not dksksks
# Number of hits: 23840
# Search time: 0.069 seconds
# Run time: 0.074 seconds


This is pushing the limit for little old swish, but here's indexing a few
more very small xml files (~150 bytes each)

3830016 files indexed.  582898349 total bytes.
Elapsed time: 00:48:22 CPU time: 00:44:01

</plug>

Bill Moseley
mailto:[EMAIL PROTECTED]



Re: [modperl-site design challenge]

2001-11-26 Thread Bill Moseley

At 11:14 AM 11/26/01 -0500, John Saylor wrote:
 * While the design might not be to cool from the designers point of view, I
 like it because it is simple, doesn't use HTML-tables, is small and fast
 (/very/ little HTML-overhead) and accessible to disabled people.

But that *is* cool. I think it's very well designed. To me, usability is
the main design goal.  Keep up the good work!

Does it need to render well in old browsers?  (e.g. netscape 4.08)

There's a lot of old browsers out there, but maybe anyone looking at
mod_perl would be a bit more up to date...



Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Apache::Registry HEAD request also return document body

2001-11-23 Thread Bill Moseley

At 11:43 AM 11/23/2001 +, Jean-Michel Hiver wrote:

PROBLEM HERE
A head request should * NOT * return the body of the document

You should check $r->header_only in your handler.

http://thingy.kcilink.com/modperlguide/correct_headers/3_1_HEAD.html
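A minimal sketch of that check in a mod_perl 1 content handler (the package name is illustrative, not from the original post):

```perl
package My::Handler;
use strict;
use Apache::Constants qw(OK);

sub handler {
    my $r = shift;

    $r->content_type('text/html');
    $r->send_http_header;

    # On a HEAD request, header_only is true: send headers, no body.
    return OK if $r->header_only;

    $r->print("<html><body>Hello</body></html>\n");
    return OK;
}
1;
```

Under Apache::Registry the same test works via Apache->request, since Registry scripts share the request object.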



Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Apache::AuthCookie login faliure reason

2001-11-23 Thread Bill Moseley

At 04:09 PM 11/23/2001 +1100, simran wrote: 

Hi All, 
  
I am having some trouble getting Apache::AuthCookie (version 3, which I
believe is the latest version) to do what I want:
  
What i want is: 
  
* To be able to give the user a reason if login fails
  - eg reason: * No such username
* Your password was incorrect
  
Has anyone else come across the same requirement/issue, and how have you
solved it? 


Apache::AuthCookieURL does that.  IIRC, it sets a cookie with the failure
reason that's returned from the authen_cred call.








Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Apache::Registry HEAD request also return document body

2001-11-23 Thread Bill Moseley

At 02:53 PM 11/23/01 +, Jean-Michel Hiver wrote:
  My only concern is that I thought that Apache::Registry was designed
  to act as a CGI emulator, allowing not so badly written CGIs to have
  mod_perl benefits without having to change them.

Right, sorry I completely missed the Registry part!

Try HEAD on this script.

#!/usr/local/bin/perl -w
use CGI;

my $q = CGI->new;

print $q->header, $q->start_html,
  join( "<BR>\n", map { "$_ : $ENV{$_}" } keys %ENV ),
  $q->end_html;



  If I have to use the $r object (and therefore the Apache module), then
  it means that the scripts won't be able to run as standalone CGIs...

Am I right?

Right, maybe that's a good thing ;)  (I actually mix mod_perl code in
applications that will run under both.)
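A common trick for scripts that must run both standalone and under Apache::Registry is to branch on the MOD_PERL environment variable. This is a hypothetical sketch, not code from the original post:

```perl
use strict;
use CGI;

my $q = CGI->new;
print $q->header('text/plain');

if ($ENV{MOD_PERL}) {
    # Running under mod_perl (e.g. Apache::Registry): the Apache
    # request object is available and mod_perl-specific calls are safe.
    my $r = Apache->request;
    print "Running under mod_perl, uri: ", $r->uri, "\n";
}
else {
    # Plain CGI: stick to portable CGI.pm calls only.
    print "Running as a standalone CGI\n";
}
```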



Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Apache::Registry HEAD request also return document body

2001-11-23 Thread Bill Moseley

At 03:21 PM 11/23/01 +, Jean-Michel Hiver wrote:
Duh... That's a lot of info for a head request :-)

Yes, and that's what I get for using HEAD to test!  Yesterday's holiday
doesn't help today's thinking.

How about patching Apache::Registry?

Oh, Stas, of course, just posted a better solution.  Maybe I'll have better
luck repairing my car today.



Bill Moseley
mailto:[EMAIL PROTECTED]



[OT] Re: Seeking Legal help

2001-11-22 Thread Bill Moseley

At 03:21 PM 11/21/01 -0800, Medi Montaseri wrote:
I did some work (about $25000 worth) for a customer and I'm having
problem collecting. 

This has been beaten to death on the list, but... (and I'm not a lawyer,
but I drink beer with one),

If you think they are going Chapter 11, then you may want to try to bargain
down to some amount to get something, so you are not on their list of
creditors.  

When they do file, if that's the case, they have to notify the court of
their creditors, and then the court is supposed to notify you.  You must then
file a proof of claim, and get in line with everyone else.  If you think
they might fail to list you as a creditor when they file, contact the court
every few weeks and check if they have already filed, and file your proof
of claim.  Then at least you might get a penny on the dollar...

$25K is a bad number, in that it's too big for small claims court, and it's
too little to get much help from lawyers in a law suit, I'd guess.  Ask
them if they want to pay partially in hardware and you might get a good
idea of their direction ;).

Good luck,



Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Cookie authentication

2001-11-15 Thread Bill Moseley
At 02:02 PM 11/15/01 -0600, John Michael wrote: 
This may seem off subject but, if you bear with me, I don't think it is.
I am interested in using the cookie-based system referred to in the
Programming the Apache API book, but I often wonder:
Can you count on everyone to use cookies?


Sometime back I wrote a module based on Apache::AuthCookie called Apache::AuthCookieURL that uses cookies, or falls back to munged URLs if cookies were not enabled.  It's on CPAN.

I wrote it for a site where people come in from public libraries.  The requirement was that it had to do sessions even if cookies were disabled (as it was common for the public libraries to have cookies disabled). 

It's been a while since I looked at it.  I had added a way to disable the authen requirement for areas of the site (or everywhere), so it could be used just for dealing with sessions.

Do be careful about session hijacking.





Bill Moseley
mailto:[EMAIL PROTECTED] 

Re: Cookie authentication

2001-11-15 Thread Bill Moseley

At 05:20 PM 11/15/01 -0600, John Michael wrote:
Thanks.
I did not know that you could verify that someone has cookies turned on.
Can you point me to where i can find out how to do this?  Is there a
variable that you can check?

You set a cookie and do a redirect (if you need the cookie right away).  If
it comes back with a cookie then they are enabled.
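The probe described above might look roughly like this as a CGI script (a sketch; parameter and cookie names are made up for illustration):

```perl
use strict;
use CGI;
use CGI::Cookie;

my $q = CGI->new;

if (defined $q->cookie('probe')) {
    # The test cookie came back: cookies are enabled.
    print $q->header, "Cookies enabled\n";
}
elsif ($q->param('probed')) {
    # We already redirected once and no cookie returned: disabled.
    print $q->header, "Cookies disabled\n";
}
else {
    # First visit: set a test cookie and bounce back to ourselves.
    my $c = CGI::Cookie->new(-name => 'probe', -value => 1);
    print $q->redirect(-uri => $q->url . '?probed=1', -cookie => $c);
}
```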



Bill Moseley
mailto:[EMAIL PROTECTED]



Re: how to install the XML::LibXSLT along with libxslt?

2001-11-14 Thread Bill Moseley
At 08:03 PM 11/14/01 -0800, SubbaReddy M wrote: 

Maybe a question for the libxml2 list instead of mod_perl?

So, while installing libxslt-1.0.6 
i am getting error atlast, that is  " checking for libxml libraries >= 2.4.7... ./configure: xml2-config: command not found "

Did you make install libxml2?

> which xml2-config
/usr/local/bin/xml2-config




Bill Moseley
mailto:[EMAIL PROTECTED] 

[OT] Data store options

2001-11-08 Thread Bill Moseley

Hi,

<verbose>
I'm looking for a little discussion on selecting a data storage method, and
I'm posting here because Cache::Cache often is discussed here (along with
Apache::Session).  And people here are smart, of course ;).

Basically, I'm trying to understand when to use Cache::Cache, vs. Berkeley
DB, and locking issues.  (Perrin, I've been curious why at etoys you used
Berkeley DB over other caching options, such as Cache::Cache).  I think
RDBMS is not required as I'm only reading/writing and not doing any kind of
selects on the data -- also I could end up doing thousands of selects for a
request.  So far, performance has been good with the file system store.

My specifics are that I have a need to permanently store tens of thousands
of smallish (5K) items.  I'm currently using a simple file system store,
one file per record, all in the same directory.  Clearly, I need to move
into a directory tree for better performance as the number of files increases.

The data is accessed in a few ways:

1) Read/write a single record
2) Read anywhere from a few to thousands of records in a request. This
   is the typical mod_perl-based request.  I know the record IDs that I
   need to read from another source.  I basically need a way to get some
   subset of records fast, by record ID.
3) Traverse the data store and read every record.

I don't need features to automatically expire the records.  They are
permanent.

When reading (item 2) I have to create a perl data structure from the data,
which doesn't change.  So, I want to store this in my record, using
Storable.pm.  That can work with any data store, of course.
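The Storable round trip is only a couple of calls. A minimal sketch (file name and record contents are made up):

```perl
use strict;
use Storable qw(nstore retrieve);
use File::Temp qw(tempdir);

my $dir    = tempdir(CLEANUP => 1);
my $file   = "$dir/record.sto";
my $record = { id => 42, tags => [qw(foo bar)] };

# nstore writes in network byte order, so the file is portable
# across machines; store() is the faster native-order variant.
nstore($record, $file);

my $copy = retrieve($file);   # returns a deep copy of the structure
print $copy->{id}, "\n";      # prints 42
```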

It's not a complicated design.  My choices are something like:

1) use Storable and write the files out myself.
2) use Cache::FileCache and have the work done (but can I traverse?)
3) use Berkeley DB (I understand the issues discussed in The Guide)

So, what kind of questions and answers would help me weigh the options?

With regard to locking, IIRC, Cache::Cache doesn't lock, rather writes go
to a temp file, then there's an atomic rename.  Last in wins.  If updates
to a record are not based on previous content (such as a counter file) is
there any reason this is not a perfectly good method -- as opposed to flock?
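The write-temp-then-rename idea can be sketched as below (a hedged example, not Cache::Cache's actual code). rename() is atomic on POSIX filesystems when source and target are on the same volume, so a concurrent reader sees either the old or the new contents, never a partial write:

```perl
use strict;
use File::Temp qw(tempfile tempdir);
use File::Basename qw(dirname);

sub atomic_write {
    my ($path, $data) = @_;
    # Create the temp file in the target directory so that rename()
    # never has to cross a filesystem boundary.
    my ($fh, $tmp) = tempfile(DIR => dirname($path), UNLINK => 0);
    print {$fh} $data;
    close $fh or die "close: $!";
    rename $tmp, $path or die "rename $tmp -> $path: $!";
}

my $dir    = tempdir(CLEANUP => 1);
my $target = "$dir/record.dat";
atomic_write($target, "hello\n");
```

As the post says, this is "last in wins": with no read-modify-write cycle (no counters), there's nothing for flock to protect.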

Again, I'm really looking more for discussion, not an answer to my specific
needs.  What issues would you use when selecting a data store method, and why?

</verbose>

Thanks very much,





Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Cache::* and MD5 collisions [was: [OT] Data store options]

2001-11-08 Thread Bill Moseley

At 10:54 AM 11/08/01 -0800, Andrew Ho wrote:
For example, say your keys are e-mail addresses and you just want to use
an MD5 hash to spread your data files over directories so that no one
directory has too many files in it. Say your original key is
[EMAIL PROTECTED] (hex encoded MD5 hash of this is RfbmPiuRLyPGGt3oHBagt).
Instead of just storing the key in the file
R/Rf/Rfb/Rfbm/RfbmPiuRLyPGGt3oHBagt.dat, store the key in the file
[EMAIL PROTECTED] Presto... collisions are impossible.

That has the nice side effect that I can run through the directory tree and
get the key for every file.  I do need a way to read every key in the
store.  Order is not important.
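The scheme described above, hashing only to pick the subdirectories while keeping the readable key as the filename, might be sketched like this (the fan-out depth and path layout are illustrative):

```perl
use strict;
use Digest::MD5 qw(md5_hex);

sub path_for_key {
    my $key  = shift;
    my $hash = md5_hex($key);    # 32 hex chars, used only for fan-out
    # Two levels of directories from the hash prefix; the original
    # key at the leaf, so walking the tree recovers every key.
    return join '/', substr($hash, 0, 1), substr($hash, 0, 2), $key;
}

print path_for_key('user@example.com'), "\n";
```

Since the filename is the key itself, collisions are impossible and a directory traversal yields the full key list, order unimportant.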



Bill Moseley
mailto:[EMAIL PROTECTED]



Re: [OT] search engine module?

2001-10-16 Thread Bill Moseley

At 02:04 PM 10/16/2001 +0100, Ged Haywood wrote:
  Plus lots of other stuff like Glimpse and Swish which interface to
C-based
  engines.
 
 I've had good luck with http://swish-e.org/2.2/

Please make sure that it's possible to do a plain ordinary literal
text string search.  Nothing fancy, no case-folding, no automatic
removal of punctuation, nothing like that.  Just a literal string.

Last night I tried to find perl -V on all the search engines
mentioned on the mod_perl home page and they all failed in various
interesting ways.

I assume it's how the search engine is configured.  Swish, for example,
lets you define what chars make up a word.  Not sure what you mean by literal
string.  For performance reasons you can't just grep words (or parts of
words), so you have to extract out words from the text during indexing.
You might define that a dash is ok at the start of a word, but not at the
end and to ignore trailing dots, so you could find -V and -V. (at the end
of a sentence).

Some search engines let you define a set of buzzwords that should be
indexed as-is, but that's more helpful for technical writing instead of
indexing code.

Finally, in swish, if you put something like perl -V in quotes to use a
phrase search, it will most likely find what you are looking for, even if
the dash is not indexed.



Bill Moseley
mailto:[EMAIL PROTECTED]



Re: Mod_perl component based architecture

2001-10-16 Thread Bill Moseley

I've been looking at OpenInteract, too.  I've got a project where about 100
people need to edit records in a database via a web-based interface.  And
I'd like history tracking of changes (something like CVS provides, where
it's easy to see diffs and to back out changes).  And I need access control
for the 100 people, along with tracking per user of how many changes they
make, email notification of changes, administrative and super-user type of
user levels, and bla, bla bla, and so on.  Normal stuff.

I'm just bored with html forms.  Seems like I do this kind of project too
often -- read a record, post, validate, update...  Even with good
templating and code reuse between projects I still feel like I spend a lot
of time re-inventing the (my) wheel.  Will an application framework bring
me bliss?  I'm sure this is a common type of project for many people.  What
solutions have you found to make this easy and portable from project to
project?



Bill Moseley
mailto:[EMAIL PROTECTED]



Re: [request] modperl mailing lists searchable archives wanted

2001-10-09 Thread Bill Moseley

Hi Stas,

I just updated the search site for Apache.org with a newer version of
swish.  The context highlighting is a bit silly, but that can be fixed.
I'm only caching the first 15K of text from each page for context
highlighting.

http://search.apache.org

It seems reasonably fast (it's not running under mod_perl currently, but
could -- if mod_perl was in that server ;).

It takes about eight or nine minutes to reindex ~35,000 docs on *.apache.org
so the mod_perl list (and others) shouldn't be too much trouble, I'd think,
with smaller numbers and smaller content.

It doesn't do incremental indexing at this point, which is a draw back, but
indexing is so fast it normally doesn't matter (and there's an easy
work-around for something like a mailing list to pickup new messages as
they come in during the day).

Swish-e can also call a perl program which feeds docs to swish.  That makes
it easy to parse the email into fields for something like:

  http://swish-e.org/Discussion/search/swish.cgi

which looks a lot like the Apache search site...

But, what would be needed is a good threaded mail archiver, of which there
are many to pick from, I'd expect.

Some 
archives are browsable, but their search engines simply suck. e.g. 
marc.theaimsgroup.com I think is the only one that archives 
[EMAIL PROTECTED], but if you try to search for a perl string like 
APR::Table::FETCH it won't find anything. If you search for
get_dir_config it will split it into 'get', 'dir', 'config' and give you 
a zillion matches when you know that there are just a few.

On swish you could say : and _ are part of words, and those would index
as full words.  Or, just search for the phrase get_dir_config and it
would search for the phrase get dir config, which would probably find
what you want.

Maybe : and _ are ok in words, but you have to think carefully about
others.  It's more flexible to split the words and use phrases in many cases.



Bill Moseley
mailto:[EMAIL PROTECTED]



Phase for controlling network input?

2001-09-26 Thread Bill McGonigle

I'm hoping this is possible with mod_perl, since I'm already familiar 
with it and fairly allergic to c, but can't seem to figure out the right 
phase.

I've been seeing log files recently that point to a certain DDOS attack 
brewing on apache servers.  I want to write a module that keeps a timer 
for the interval from when the apache child gets a network connection to 
when the client request has been sent.

I need a trigger when a network connection is established and a trigger 
when apache thinks it has received the request (before the response).

PerlChildInitHandler seems too early, since the child may be a 
pre-forked child without a connection.  PerlPostReadRequest seems too 
late, since I can't be guaranteed to be called if the request isn't 
complete, which is the problem I'm trying to solve.  I could clear a 
flag in PerlPostReadRequest, but that would imply something is 
persisting from before that would be able to read the flag.

Maybe I'm thinking about this all wrong.  Any suggestions?

Thanks,
-Bill



