Shoulda thought about your answer first, Doug. :-)
I see this type of message ("error at /dev/null") when my mod_perl scripts
give warnings -w style instead of via $r->warn. For example, HTML::Embperl, or
Apache::Registry both do this.
The nature of the error message sez to me there is a
You are not setting your Content-Type correctly.
The response contains:
Content-Type: text/plain
This needs to be
Content-Type: text/html
to be rendered as HTML.
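In a handler, that means setting the type before the header block goes out; a minimal sketch for mod_perl 1.x (assuming $r is the usual request object):

```perl
# sketch: set the type, then send the headers, then the body
$r->content_type('text/html');
$r->send_http_header;
$r->print("<html><body>now rendered as HTML</body></html>");
```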
-Original Message-
From: Guido Moonen [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, November 07, 2000 3:06 AM
To: Modperl;
I like the idea of Apache::SizeLimit, to no longer worry about
setting MaxRequestsPerChild. That just seems smart, and might
get maximum usage out of each Apache child.
What I would like to see though is instead of killing the
child based on VmRSS on Linux, which seems to be the
because then all of your hard work before goes RIGHT out the window,
and I'm talking about a 10-15 MB difference between JUST FINE and
DEATH SPIRAL, because we've now just crossed that horrible, horrible
threshold of (say it quietly now) swapping! shudder
That won't happen if you use a
On Tue, 9 Jan 2001, Rob Bloodgood wrote:
OK, so my next question about per-process size limits is this:
Is it a hard limit???
As in,
what if I alloc 10MB/per and every now and then one of my
processes spikes
to a (not unreasonable) 11MB? Will it be nuked in mid-process? Or just
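For what it's worth, it is not a hard limit: Apache::SizeLimit does its check in a per-request cleanup, after the response is finished, so an 11MB spike completes its request and the child then exits gracefully. A startup.pl sketch (the thresholds are made-up numbers; check the Apache::SizeLimit POD for your version):

```perl
# startup.pl -- a sketch, assuming Apache::SizeLimit under mod_perl 1.x
use Apache::SizeLimit;

$Apache::SizeLimit::MAX_PROCESS_SIZE       = 12000;  # KB; child exits after the request
$Apache::SizeLimit::CHECK_EVERY_N_REQUESTS = 5;      # amortize the /proc read
```

and then in httpd.conf: PerlFixupHandler Apache::SizeLimit.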
RB Alright, then to you and the mod_perl community in general, since
RB I never saw a worthwhile resolution to the thread "the edge of
RB chaos,"
The resolution is that the machine was powerful enough. If you're
running your mission critical service at "the edge of chaos" then
you're not
You simply cannot come forward and say, "look, I've got this big-assed
linux box, why is my site sucking?" We don't know, and it's neither our
granted. never my intention.
i described the box only to illustrate that i (should) have sufficient HW.
The very, very best minds in production
I think that the problem here is that we've asked for more info
and he hasn't
supplied it. He's given us generics and as a result has gotten generic
answers.
I haven't been fishing for a handout of doing the work for me. I've been
trying to see what people have done. The reason for the
In my PerlAuthenHandler I need to send back the WWW-Authenticate-line.
I use $r->headers_out("WWW-Authenticate" => 'basic realm = "MyName"').
But if i returned from the Handler with "return AUTH_REQUIRED" , Apache
doesn't send this line in the header.
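The usual work-around is to put the field on the *error* headers table, since headers_out is discarded on a non-OK return; a sketch for mod_perl 1.x:

```perl
# headers_out is dropped when the handler returns AUTH_REQUIRED,
# so set the error header instead
$r->err_header_out('WWW-Authenticate' => 'Basic realm="MyName"');
return AUTH_REQUIRED;

# or let Apache build the header from the configured AuthName:
# $r->note_basic_auth_failure;
# return AUTH_REQUIRED;
```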
This is (one of) the relevant sections in
I've been getting these occasional errors from libapreq, 1 every couple
days:
[Thu Jan 25 15:54:33 2001] [error] [client 64.12.102.22] [libapreq]
unknown content-type: `applicationontent-Type:
application/x-www-form-urlencoded\'
Alright, I'm gonna toss my $.02 into this:
This
So, in my mod_perl app, I run thru each request, then blast a UDP packet to
a process on the local machine that collects statistics on my traffic:
sub send_packet {
my $r = shift;
my $packet = shift;
$r->warn("send_packet: No packet, not transmitting")
if $debug && !$packet;
return
wm looks like a home directory. The default perms on the home
directory are usually 700. Try changing that to something like 755
or even 744 (it may not need execute).
Actually, the x bit on directory perms means "accessible," meaning if you
KNOW the name of the file, U can reach it at
execute (or access
for directories) (x)
drwx-----x   3 rlandrum devel    4096 Jan 30 14:14 public_html
(701, Forbidden)
that's not what I meant, I should have been more clear.
755 on public_html
701 on ~user
so ~user is still "hidden" from general eyes
but
Thanks for the clarification. It worked perfectly.
drwx-----x  12 rlandrum rlandrum 4096 Feb  6 14:05 rlandrum
drwxr-xr-x   3 rlandrum devel    4096 Jan 30 14:14 rlandrum/public_html
Rob
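The 701/755 recipe is easy to sanity-check from the shell (the paths below are scratch placeholders, not the real home directory):

```shell
# recreate the layout in a scratch directory ("rlandrum" here is just
# the example user from the thread)
mkdir -p /tmp/permdemo/rlandrum/public_html
chmod 701 /tmp/permdemo/rlandrum              # traversable but not listable
chmod 755 /tmp/permdemo/rlandrum/public_html  # world-readable docroot
stat -c '%a' /tmp/permdemo/rlandrum           # prints 701
stat -c '%a' /tmp/permdemo/rlandrum/public_html  # prints 755
rm -rf /tmp/permdemo
```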
execute (or access
for directories) (x)
drwx-----x   3 rlandrum devel
I've been using HTML::Mason under mod_perl on my site for awhile, using
0.89, and I like it lots. :-) So when the new 1.0 came out, I went to go
upgrade, and broke EVERYTHING.
Not only that, but, I haven't been able to make sense out of what Mason
wants for its dir hierarchy, anyway:
First,
I'm currently a developer for an on-line publication using Apache /
mod_perl / Mason. We currently have about six developers working on the
project and I have been running into problems with concurrent work on the
Perl libraries that power our site.
Just a few days ago, somebody suggested
On Wed, 14 Mar 2001, Perrin Harkins wrote:
On Wed, 14 Mar 2001, Issac Goldstand wrote:
I still think that the above line is confusing: It is
because mod_perl is
not sending headers by itself, but rather your script must provide the
headers (to be returned by mod_perl). However,
Thanks for the pointers, unfortunately I've got a problem with the Shared
cache in that I need IPC::ShareLite, no problem, except it won't test ok,
I get:
PERL_DL_NONLAZY=1 /usr/bin/perl -Iblib/arch -Iblib/lib
-I/usr/lib/perl5/i386-linux -I/usr/lib/perl5 test.pl
1..8
ok 1
ok 2
Version: Apache/1.3.12 (Unix) mod_perl/1.24
What: PerlAuthenHandler returns headers without WWW-Authenticate field
Work-around: set with $r->err_header_out
It looks like you haven't fully read the book/docs/manpages/samples for auth
handling.
*All* of the code for Basic auth (i.e. browser
From the mod_perl guide:
syntax error at /dev/null line 1, near "line arguments:"
Execution of /dev/null aborted due to compilation errors.
parse: Undefined error: 0
There is a chance that your /dev/null device is broken. Try:
% echo > /dev/null
This is exactly the problem
I'm trying to handle an exception using an internal_redirect. I
can get it to work by redirecting to a static page, but when I try to
redirect to a modperl handler, I run into problems.
Here are the two versions of code (BTW, the handler works fine when I
access it directly via the
Rob, thanks for pointing me in the right direction. Your advice
helped me find a solution that works for my situation.
You're welcome!
I'm working on an API that sits between an Oracle DB and bunch of web
application programmers. Unfortunately, the programmers run their
apps under a
HOWEVER, whenever the module is actually invoked, %SECRET_KEYS is empty!
Here's the BEGIN{} block:
BEGIN {
my @keyfile_vars = grep {
$_ =~ /DBI_SecretKeyFile$/
} keys %{ Apache->server->dir_config() };
foreach my $keyfile_var ( @keyfile_vars ) {
OK, more examination reveals that:
At the time this BEGIN block is running, this call:
my @keyfile_vars = grep {
$_ =~ /DBI_SecretKeyFile$/
} keys %{ Apache->server->dir_config() };
is returning EMPTY.
Meaning it's evaling too early to see the dir_config???
What if you want to explicitly zap the KeepAlives but not terminate
the child. Example -- http chat scripts. Basically it amounts to
having KeepAlives off for the particular script but on for everything
else. How does one accomplish this?
$r->header_out(Connection => 'close');
L8r,
Rob
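Expanded into a handler, that one-liner might look like this (a sketch for mod_perl 1.x; KeepAlive stays on server-wide, and only this response tells the client to close):

```perl
use Apache::Constants qw(OK);

sub handler {
    my $r = shift;
    $r->header_out(Connection => 'close');  # disable keep-alive for this response only
    $r->content_type('text/plain');
    $r->send_http_header;
    $r->print("chat payload...\n");
    return OK;
}
```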
I'd like to use Apache::Cookie, but I'm doing some tricky things with
cookie data, which requires that I do the encoding myself. However,
every time I 'bake' a cookie object, it tries to encode stuff for me. I
don't like this.
For example, if I've got cookie data that looks like 'foo%21',
Yes ... basically we want to track which company sent us the
reference when customers subscribe.
the ref=xxx where xxx will = some company id
I want to tack this on to every click so that when the user
finally submits an application we
can credit the company that gave us the customer
I
Or at the very least, two segments thereof:
domain=.org.tld
Which would be sent to any of these hosts:
www.org.tld
some.obscure.server.org.tld
even.here.org.tld
BUT NOT TO
ord.tlg
Thank you very four-borking-days-lost-forever much.
So, patient gurus
I'm having similar problems, but we think it's directly related to
Oracle. Basically, a connection is made to the Oracle database, a
transaction is started and finished, but the connection to the
database doesn't go away and the statement (at least from the oracle
side) never seems to
The way I've set up the whole thing is like this: a script named restart is
called with some parameters telling it to reload one or all of the
developers' environments, or the testing copy. This script would
have some
environment variables called SITE_USER and SITE_USER_PORT that will give
The
I had intended this to CC: to the list... sigh
<Location /foo*>
    AuthName "foo control"
    AuthType Basic
    PerlAuthenHandler Apache::OK
    PerlAuthzHandler WW_authz
    PerlSetVar Mask Geek
    require user maskgeeky
</Location>
I have a similar setup, and my
That, unfortunately doesn't tell me what causes a USR2 signal to
be sent to
Apache. Or when it's caused. I only want to reload the file when
said file
has changed. Am I supposed to do some checking against the file -M time
myself, and then send a USR2 signal myself?
USR2 only fires when
When I start getting this error, I can shutdown the httpd server, and the
machine and it will still give this error. If I wait a while (sometimes
hours,
sometimes days) it will come
back. Sometimes it is a few hours. Sometimes it is days. I have installed
Apache::DBI in hopes of a possible
So, like many of you, I've got a signup system in place for bringing on new
customers.
My signup script is reasonably straightforward. I use CGI::Validate to make
my parameters pass muster (along with a little judicious JavaScript on the
signup form), Apache::Session::Oracle to maintain state
A really simple trick would be rather than to use a cookie, if
you are saving state to DB anyway. Set a flag in the DB and test
for its existence.
sub handler{
my $s = session->new();
$s->continue();
my $flag = $s->get('flag');
if ($flag) {
# do something
if you can reproduce at will, use gdb:
% gdb httpd
(gdb) source mod_perl-x.xx/.gdbinit
(gdb) b Perl_croak
(gdb) run -X
run request that causes error ...
(gdb) where
stack printed here ...
(gdb) curinfo
perl filename:linenumber printed here ...
In my AuthenHandler, I run the following snippet:
# validation successful
$apr->subprocess_env(REMOTE_PASSWORD => $pass);
my $args = $apr->args || '';
$apr->args( $args . ( length $args ? '&' : '' ) . "pid=$pid" )
unless $args =~ /pid=\d+/;
return OK;
[snip]
I'm using Apache::Request, for the sole
purpose
of having easier access to the parameters. Except that it turns out
Apache::Request's param() method does NOT support *setting* parameters,
only
*getting* them. sigh
the
$apr->param('foo' => [qw(one two three)]);
example in the
maybe storing 'last-access-time' on the server, instead of in
the client-side, via cookie, would solve this snafu?
But if you want to give out a new cookie on every request ?
How would you prevent them from copying or tampering with the contents?
a MD5-hash would stop them from changing
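A sketch of that idea in plain Perl (all names here are made up for illustration; a modern deployment would use an HMAC rather than a bare hash):

```perl
use Digest::MD5 qw(md5_hex);

my $secret = 'server-side-secret';   # assumption: kept on the server, never sent out

sub sign_cookie {
    my $value = shift;
    return $value . ':' . md5_hex($value . $secret);
}

sub verify_cookie {
    my ($value, $sig) = split /:/, shift, 2;
    return undef unless defined $sig;
    return md5_hex($value . $secret) eq $sig ? $value : undef;
}
```

A client can read the value but cannot recompute the signature without the secret, so any tampering is detected on the way back in.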
Changing:
warn "\$r->uri is undef" unless defined $r->uri; # debugging?!?!?
my $subr = $r->lookup_uri($r->uri); # uri is relative to doc root
To:
$uri = $r->uri;
warn "\$uri is undef" unless defined $uri; # debugging?!?!?
my $subr = $r->lookup_uri($uri); # uri is
me, on the other hand, i don't see the problem with
on incoming request
if has-cookie 'session'
{
update serverside 'accesstime' for session[this] to NOW
Oh yeah? HOW???
if not-modified-since
Hi, I'm building a new box intended to be a mod_perl/database machine, and
in the interests of making it as up-to-date as possible, I installed RedHat
7.1, then upgraded to perl 5.6.1.
Next step, of course, is to hit CPAN and install the basics, starting with
Bundle::CPAN.
But Net::Telnet barfs
Here's what I did:
I had many scripts in one dir that shared many things; subroutines, global
variables and modules. I wanted to clean things up, so I created a module
called global.pm structured like this:
snip
The custom stuff scripts all end in 1;, and are loaded with my custom
Jay Jacobs wrote:
I don't see any glue-sniffing symptoms from choosing
embedded html in perl over embedded perl in html.
Unless, of course, you're the graphic artist and you've been tasked
with changing the look and feel of the application using embedded
perl (which you, as the
As for SQL, I just wish people would expand their horizons a little
and start doing a bit of reading. There are so many different ways
to avoid embedding SQL in application code and I sincerely wish
programmers would THINK before just coding... it's what
differentiates scripters from
However, now my logs are loaded with a ton of subroutine redefined
warnings
(which is normal I suppose?). I can certainly live with this in a
development environment, but thought I would check to see if it is
expected,
and if it can be turned off while still enabling Reload.
Well, first of
only if you code it the way you did below, which isn't terribly portable.
see http://perl.apache.org/guide/perl.html#use_require_do_INC_and
Ahem, PerlModule is a wrapper around the perl builtin require(). One
presumes that perl knows where it lives if perl can successfully require()
it.
Well, it should be documented somewhere in the guide, or
presumably in
Apache::DBI.pod, that one should *only*
PerlModule Apache::DBI
Since it's pointless in startup.pl (right?).
I think you need to think that one through a bit more :)
I disagree. I *did*
startup.pl cannot be run from the command line when it
contains apache server specific modules.
But you can put those (Apache specific) modules in your httpd.conf instead
as
PerlModule Apache::DBI Apache::Status
and avoid compilation warnings in startup.pl.
But you should clearly note this,
AB Untrue. We ship mod_perl in Solaris 8 as a DSO, and it works
fine.
I apologize. Let me qualify my original statement. In general, you
want to compile mod_perl statically on Solaris 2.6 or 2.7 because
in many instances, it core dumps when built as a DSO. FWIW, my
particular
No need for an apology :-) The trick is to build perl using the
Solaris malloc (-Dusemymalloc as a flag to Configure), then apache,
mod_perl and perl all agree on who manages memory.
Might I suggest that this golden piece of information find its
way into the guide? It's so rare
-Original Message-
From: Rob Bloodgood [mailto:[EMAIL PROTECTED]]
Sent: Thursday, August 16, 2001 11:20 AM
To: Stas Bekman
Cc: mod_perl
Subject: RE: Children dying
sigh... I didn't see the other thread that spawned from my original post...
rendering this reply redundant. Apologies.
On Wed, Aug 22, 2001 at 09:42:59AM -0400, Perrin Harkins wrote:
Are you using Apache::DBI? Are you opening a connection in
the parent
process (in startup.pl or equivalent)?
Yes, yes.
Don't open a connection during startup. If you do, it will be
shared when
Apache forks, and
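With Apache::DBI the usual alternative is to defer the connect to each child's init phase instead of making it in the parent; a startup.pl sketch (DSN and credentials are placeholders):

```perl
# startup.pl -- a sketch
use Apache::DBI ();

# don't call DBI->connect here: the handle would be shared by every
# forked child.  Register the connect to run per-child instead:
Apache::DBI->connect_on_init(
    'dbi:Oracle:SID', 'user', 'password',
    { RaiseError => 1, AutoCommit => 0 },
);
```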
cp apaci/perl_config ../apache_1.3.20/src/modules/perl/perl_config
^^
the file is copied all right
[snip]
Creating Makefile
Creating Configuration.apaci in src
+ id: mod_perl/1.26
+ id: Perl/v5.6.0 (linux)
So, once upon a time, I bought the Eagle and realized I had purchased a
small slice of heaven.
One of the shiny golden nuggets I received from said slice was a shared
memory cache. It was simple, it was elegant, it was perfect. It was also
based on IPC::Shareable. GREAT idea. BAD juju.
The
One of the shiny golden nuggets I received from said slice was a
shared memory cache. It was simple, it was elegant, it was
perfect. It was also based on IPC::Shareable. GREAT idea. BAD
juju.
Just use Cache::Cache. It's faster and easier.
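For reference, the Cache::Cache interface is about this simple (namespace and TTL below are made-up values):

```perl
# a sketch using Cache::Cache's file backend
use Cache::FileCache;

my $cache = Cache::FileCache->new({
    namespace          => 'myapp',
    default_expires_in => 600,       # seconds
});

$cache->set('motd', 'hello, world');
my $motd = $cache->get('motd');      # undef after expiry or on a miss
```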
Now, ya see...
Once upon a time, not many
my sample code, from my last message, was incomplete... you should be sure
to
return OK;
when the authentication is successful... sigh
L8r,
Rob
The _session_id is used as the seed for the locking semaphore.
*IF* I understood the requirements correctly, the _session_id has
to be the same FOR EVERY PROCESS in order for the locking to work
as desired, for a given shared data structure.
Only if you want to lock the whole thing,
Uhh... good point, except that I don't trust the Cache code. The AUTHOR
isn't ready to put his stamp of approval on the locking/updating.
That sort of hesitancy is typical of CPAN. I wouldn't worry about it. I
think I remember Randal saying he helped a bit with that part. In my
Attempt to free unreferenced scalar during global destruction.
Attempt to free unreferenced scalar during global destruction.
Attempt to free unreferenced scalar during global destruction.
Attempt to free unreferenced scalar during global destruction.
Out of memory!
Callback called exit.
Also, look into the MaxServers settings, and memory calculations in the
Guide:
http://perl.apache.org/guide/config.html#MinSpareServers_MaxSpareServers_
And especially
http://perl.apache.org/guide/performance.html#Choosing_MaxClients
GOOD LUCK!
L8r,
Rob
i think you may have to mount it
mount -t smbfs -o username=user,password=pass //ntserver/disk7 \
/mnt/smbshare
then just add /mnt/smbshare to doc root!
Except that, to the best of my knowledge, Samba can only mount to regular
mount points on Linux.
Rob
#!/usr/bin/perl -w
use Disclaimer
We have a couple openings doing intense and interesting mod_perl work
here at Red Hat. Formal description is below. Key skills are perl,
mod_perl, apache, and DBI (especially Oracle). Must relocate to
Research Triangle Park, North Carolina.
If only Red Hat was in Oregon... sigh.
L8r
Rob
#set the content type
$big_r->content_type('text/html');
$big_r->no_cache(1);
# some more code
return OK;
You *are* remembering to do
$r->send_http_header();
somewhere in (some more code), aren't you?
L8r,
Rob
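Filled out, the skeleton being described is roughly this (a sketch for mod_perl 1.x, reusing the $big_r name from the original post):

```perl
use Apache::Constants qw(OK);

sub handler {
    my $big_r = shift;

    $big_r->content_type('text/html');
    $big_r->no_cache(1);
    $big_r->send_http_header;      # without this line nothing renders
    $big_r->print("<html><body>...</body></html>");

    return OK;
}
```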
#!/usr/bin/perl -w
use Disclaimer
You must include code to deal with the fact that you may have already
opened a popup window. Something like this:
That is simply not true. window.open() with a named window ('popupwin', in
your example) ALWAYS reuses that window, on every browser I've ever been
able to test. The second call
Does anyone know where I can find documentation to install
and configure
Apache::AuthCookie? The docs that come with it are thin and
do not provide
much information.
you're kidding, right?
[geoff@jib Apache-AuthCookie-2.011]$ perldoc AuthCookie.pm | wc -l
462
Verbiage and
How I expected the ErrorDocument directive to behave was as
follows: WHEN there was an error 401 (i.e. the user had logged in 3
times and failed) there would be an error page shown (in this case
it would be /error/401). But instead what seems to be happening as
soon as a user goes to an
: ) No problem, I guess I am unsure if this is the proper way
to setup an
Access, Authen, Authz handler. When I use this configuration my
'handler()'
method does not get called and I get an error in the logs:
This is *not* the correct way to invoke it.
Directory
for example if the protected url was http://www.site.com/ the user
would be redirected to http://www.site.com/error/401 for the error
message... and because it's protected wouldn't display the custom error
page, instead displaying the following error: Additionally, a 401
Authorization Required
Another powerful tool for tracking down performance problems is perl's
profiler combined with Devel::DProf and Apache::DProf. Devel::DProf
is bundled with perl. Apache::DProf is hidden in the Apache-DB package
on CPAN.
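Wiring it up is a single line of configuration; each child then writes a profile that dprofpp can read (the output path is the module's documented default, but check the Apache::DProf POD for your version):

```
# httpd.conf -- a sketch
PerlModule Apache::DProf

# afterwards, per-child output lands under the server root, e.g.:
#   % dprofpp logs/dprof/12345/tmon.out
```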
Ya know the place in my original comment where I was optimizing a
un/re subscribed to a different addy, THIS IS JUST A TEST!
Uhh... the platypus, the wombat, the tazmanian devil, and the emu.
-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, January 30, 2002 1:54 PM
To: [EMAIL PROTECTED]; [EMAIL PROTECTED]
Subject: Re: New mod_perl Logo
In a message dated
2. If the answer to the above question is YES: the
Handler will add headers/footers for everything. What
do I need to do to apply the handler logic just to the
requested page and return the remaining files that are
needed to complete the requested page as they are?
In the Eagle book (as well
Ditto here. Working quite well on fairly high volume servers.
Hrmm how interesting. My Apache is built with PHP (with DOM, MySQL, and
Postgres) and mod_perl. With mod_gzip enabled it simply segfaults on
every single request.
have you looked at the work at http://www.apachetoolbox.com/ ?
I am using Apache::Session with Postgresql. Unfortunately I had
never worked with a huge amount of data before I started to program
something like a (little) web application. I happily packed
everything in the session(s-table) that might be of any use. It
hit me hard that it takes a
more information is at:
http://httpd.apache.org/apreq/
Am I the only one that noticed that the web page thinks 1.0 was released 4
months before 0.33? :-)
News
February 21, 2001 - libapreq-1.0 was released.
June 19, 2001 - libapreq-0.33 was released.
December 23, 2000 - libapreq-0.31_03
Answering my own question: I stupidly forgot that I had a TransHandler up
above mucking my URLs before the Location directives got a chance
to try to match. So my /foo location block was never seeing a /foo URL.
Still, I'm glad to see that the old system of post to a public list and
then
Perrin Harkins wrote:
Rafael Caceres wrote:
I'm facing a dilemma here. We are testing an Oracle 9iAS installation
(Apache 1.3.19, mod_ssl 2.8.1, mod_perl 1.25 as DSO, Perl 5.005_03) on
Red Hat Linux 7.2, which itself came with Perl 5.6.0, and from your
comments, that's bad..
First of
I've always used DBI along with DBD::Oracle for Database access, and I
intend to use them along Oracle 9iAS's other capabilities.
So if I'm following you correctly, the steps involved are:
-get the 5.6.1 RPM (which doesn't seem to be in Red Hat's site anyway)
-get the Apache 1.3.19 sources
I'm running a Mason based website, and I use Emacs when I write code.
My web designers use Dreamweaver. I've designed the site so that my web
guys have to reserve me one table cell (or more than one depending on where
in the site, but you get the point) where I put a single dispatch component
At 11:30 AM -0800 3/14/02, Rob Bloodgood wrote:
The problem is, concurrency. Dreamweaver has versioning built
in... but emacs has no way to recognize it. So when I make a fix
to a file, if the designers aren't explicitly instructed to
refresh-from-the-website-via-ftp, my changes get hosed
Stas Bekman wrote:
Moreover the memory doesn't get unshared when the parent pages are
paged out, it's the reporting tools that report the wrong
information and of course mislead the size-limiting modules
which start killing the processes.
Apache::SizeLimit just reads /proc on
I've determined that it isn't the redirect causing the cookies not
to be set. If I take out the redirect, and just try to set a cookie
w/o a redirect, it still doesn't set the cookies in IE. Does M$
have any docs on how IE6 handles cookies that I can look this up on?
YES, they do.
You have
We have a mod_perl server that's under constant heavy load. In
our Apache
config we have switched HostnameLookups off using
HostnameLookups off
and for the most part, it seems to work. However, any check of
the logs or
/server-status shows that the server is *still* doing
So my question narrows down to :
How to flush to disk the cache of a tied DBM (DB_File) structure
in a way that any concurrent process accessing it in *read only* mode
would automatically get the new values as soon as they
are published (synchronisation)?
Isn't that just as simple as
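(The reply is cut off in the archive; presumably it pointed at the sync method on the tied object. The writer side, as a sketch assuming DB_File, with a placeholder path:)

```perl
use DB_File;
use Fcntl qw(O_RDWR O_CREAT);

my %hash;
# keep the object tie() returns so sync() can be called on it
my $db = tie %hash, 'DB_File', '/tmp/shared.db', O_RDWR|O_CREAT, 0644, $DB_HASH
    or die "tie: $!";

$hash{last_update} = time;
$db->sync;           # push dirty pages out to the file

undef $db;           # drop the extra ref before untie to avoid the
untie %hash;         # "untie attempted while references exist" warning
```

Whether a concurrent read-only tie sees the flushed pages without re-tieing depends on the Berkeley DB version, so this is only the writer half of the answer.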
So I got the advisory about the Apache servers having a security hole, so I
decided to upgrade some servers. I've been on v1.25 for awhile, so decided
to upgrade to 1.27 while I was at it... big mistake.
NONE of my notes/pnotes were getting thru, on the new version.
It took me 8 or 10
-Original Message-
From: George Valpak [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, October 16, 2002 3:26 PM
To: Vegard Vesterheim
Cc: [EMAIL PROTECTED]
Subject: Re: AuthCookieDBI help please (more info)
I am still having trouble with Apache::AuthCookieDBI.
I tried moving
(sorry about the blank reply a minute ago)
I am looking into the more advanced paypal instant notification
stuff for the next version of my sw, but version one is using a
simpler approach to get it out the door. Even that paypal sw
wouldn't solve my problem, which is to make sure that the
I would like to know any such standalone servers that could
process the perl requests offline (taking requests from a file or
queue end).
I definitely would like to get fancier as my requirement is
immediate. Upon finding a server that could process the requests
away from
Wednesday, June 18, 2003, 2:13:46 AM, you wrote:
SB I've uploaded 1.03's release candidate. If nobody finds any faults, I'll
SB upload it tomorrow on CPAN. (libapreq needs to rely on 1.03 fixes to release
SB its 1.2's version).
SB Please try it out:
SB