Ah yes, but don't forget that to get this speed, you are sacrificing
memory. You now have another locally scoped variable for perl to keep
track of, which increases memory usage and general overhead (allocation
and garbage collection). Now, those, too, are insignificant with one
use, but
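The speed-versus-memory trade-off being discussed is easy to put numbers on with the core Benchmark module. A minimal sketch (the hash, its values, and the iteration count are invented for illustration):

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

my %conf = (timeout => 42);

cmpthese(50_000, {
    # look the hash value up every time it is needed
    direct => sub { my $x = $conf{timeout} + $conf{timeout} + $conf{timeout} },
    # pay for one extra lexical, read the hash only once
    cached => sub { my $t = $conf{timeout}; my $x = $t + $t + $t },
});
```

Whichever way the numbers fall on a given perl, the point above stands: the lexical costs a little memory and allocation overhead each time the scope is entered, so it only pays off when the lookup is repeated often enough.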
Issac Goldstand wrote:
Ah yes, but don't forget that to get this speed, you are sacrificing
memory. You now have another locally scoped variable for perl to keep
track of, which increases memory usage and general overhead (allocation
and garbage collection). Now, those, too, are
I know there is some good reference material for mod_perl out there, just
can't remember where. Anybody?
http://perl.apache.org/guide/
Best regards,
Axel
Hi list,
I had problems with a script that went nuts and took 65MB of memory and
a lot of CPU. To track this script down I thought Apache::VMonitor would
be perfect; unfortunately I ran into some weird problems (it said there
was an error in mod_perl.h) and I know gcc might be broken on this
machine
2 quick notes.
Have you seen the epigone archives? I'm sure I've seen mention
of SIGPIPE in this scenario some time before.
Upgrade! You are using old versions of apache, perl and mod_perl.
-----Original Message-----
From: Balazs Rauznitz [mailto:[EMAIL PROTECTED]]
Sent: Friday, January 25,
Jon Molin wrote:
Hi list,
I had problems with a script that went nuts and took 65MB of memory and
a lot of CPU. To track this script down I thought Apache::VMonitor would
be perfect; unfortunately I ran into some weird problems (it said there
was an error in mod_perl.h) and I know gcc might be
This project's idea is to give straight numbers for some definitely bad
coding practices (e.g. map() in the void context), and things which vary
a lot depending on the context, but are interesting to think about (e.g.
the last example of caching the result of ref() or a method call)
I
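For the map()-in-void-context case, a sketch of the kind of "straight numbers" comparison meant here (data size and loop bodies invented; note that perls of that era did not optimize void-context map, while later perls partially do):

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

my @in = (1 .. 1000);

cmpthese(1_000, {
    # map in void context still constructs a return list, then throws it away
    void_map => sub { my @out; map { push @out, $_ * 2 } @in; },
    # a plain for loop does the same work without building the discarded list
    for_loop => sub { my @out; for (@in) { push @out, $_ * 2 } },
});
```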
Stas Bekman wrote:
It's actually easy, take a look at the Apache::SizeLimit or
Apache::GTopLimit, look at the cleanup handler that they register. Now
take this handler and dump whatever you need to the file or error_log
when you find that the process was taking too much memory.
Take a
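A sketch of that idea: a per-request handler that registers a cleanup which dumps diagnostics when the process has grown too big. The package name, threshold, and logged fields are all invented; GTop is the same library Apache::GTopLimit relies on, and this only runs inside mod_perl 1:

```perl
package My::MemWatch;
use strict;
use Apache::Constants qw(OK);
use GTop ();

use constant LIMIT => 60 * 1024 * 1024;   # 60MB -- pick your own threshold

sub handler {
    my $r = shift;
    # cleanups run after the response is sent, so logging here is cheap
    $r->register_cleanup(sub {
        my $size = GTop->new->proc_mem($$)->size;
        warn sprintf "[%d] %s grew to %d bytes\n", $$, $r->uri, $size
            if $size > LIMIT;
        1;
    });
    return OK;
}
1;
```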
Jon Molin wrote:
Stas Bekman wrote:
It's actually easy, take a look at the Apache::SizeLimit or
Apache::GTopLimit, look at the cleanup handler that they register. Now
take this handler and dump whatever you need to the file or error_log
when you find that the process was taking too much
[snip]
Another question: do you (or anyone else for that matter) know how the
access log works, and why it works the way it does? It seems it prints
after the request is done; otherwise it could easily be used for
checking the parameters, and not only for logging.
You probably
Hello,
I'm using mod_perl 1.21 on a host where I don't have the option of upgrading
mod_perl. Is there an alternative way to use PerlSetVar to simulate the
effect of PerlAddVar? I want to create a variable, namely MasonCompRoot,
that has two entries in it.
Thanks,
Vlad
The point is that I want to develop a coding style which tries hard to
do early premature optimizations.
We've talked about this kind of thing before. My opinion is still the same
as it was: low-level speed optimization before you have a working system is
a waste of your time.
It's much
Vladislav Shchogolev wrote:
Hello,
I'm using mod_perl 1.21 on a host where I don't have the option of upgrading
mod_perl. Is there an alternative way to use PerlSetVar to simulate the
effect of PerlAddVar? I want to create a variable, namely MasonCompRoot,
that has two entries in it.
I
On Fri, 25 Jan 2002, Geoffrey Young wrote:
I think I just read in the eagle book the other day that suggested something like
PerlSetVar MasonCompRoot foo:bar
my @roots = split /:/, $r->dir_config('MasonCompRoot');
or whatever...
Except that the code that read the dir_config is part of the
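Since the dir_config-reading code is inside Mason itself, the usual escape hatch (assuming the host at least allows a custom handler script, and a Mason version whose ApacheHandler accepts comp_root directly) is to skip PerlSetVar entirely and hand HTML::Mason::ApacheHandler an array of component roots from Perl. Paths and root labels below are invented:

```perl
# handler.pl sketch -- bypasses MasonCompRoot in httpd.conf entirely
package My::Mason;
use strict;
use HTML::Mason::ApacheHandler;

my $ah = HTML::Mason::ApacheHandler->new(
    comp_root => [
        [ main   => '/var/www/comps'  ],   # searched first
        [ shared => '/var/www/shared' ],   # fallback root
    ],
);

sub handler { my $r = shift; $ah->handle_request($r) }
1;
```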
Hi All,
A big debate is raging on the Bricolage development list WRT CVS
configuration and application testing.
http://www.geocrawler.com/mail/thread.php3?subject=%5BBricolage-Devel%5D+More+on+Releases&list=15308
It leads me to a question about testing. Bricolage is a monster
application, and
Is anyone familiar with how to go about setting up a test suite for a
web UI -- without spending an arm and a leg? (Remember, Bricolage is an
OSS effort!).
Yes, it's very easy. We did this using student labor, because it is
an excellent project for students and it's probably cheaper. It's
There are many web testers out there. To put it bluntly, they don't
let you write maintainable test suites. The key to maintainability is
being able to define your own domain specific language.
Have you tried webchat? You can find webchatpp on CPAN.
On Fri, 2002-01-25 at 10:12, Perrin Harkins wrote:
Have you tried webchat? You can find webchatpp on CPAN.
Looks interesting, although the documentation is rather sparse. Anyone
know of more examples than come with it?
Thanks,
David
--
David Wheeler
On Fri, 2002-01-25 at 09:08, Perrin Harkins wrote:
snip /
It's much better to build your system, profile it, and fix the bottlenecks.
The most effective changes are almost never simple coding changes like the
one you showed, but rather large things like using qmail-inject instead of
SMTP,
Have you tried webchat? You can find webchatpp on CPAN.
Just had a look. It appears to be a rehash of chat (expect) for the
web. Great stuff, which is really needed and demonstrates the power
of Perl for test scripting.
But...
This is a bit hard to explain. There are two types of XP
On 25 Jan 2002, David Wheeler wrote:
On Fri, 2002-01-25 at 09:08, Perrin Harkins wrote:
snip /
It's much better to build your system, profile it, and fix the bottlenecks.
The most effective changes are almost never simple coding changes like the
one you showed, but rather large things
This may sound strange, but bear with me. I want to create an ApacheHandler
that will pull all the files in a virtualhost, not from the filesystem, but
from an RDBMS (built on PostgreSQL). This includes .htaccess files, binary
files (e.g. pdf and images) and text files (e.g. html and xml). I'm
On Fri, 25 Jan 2002 21:15:54 + (GMT)
Matt Sergeant [EMAIL PROTECTED] wrote:
With qmail, SMTP generally uses inetd, which is slow, or daemontools,
which is faster, but still slow, and more importantly, it anyway goes:
perl -> SMTP -> inetd -> qmail-smtpd -> qmail-inject.
So with going
On Fri, 2002-01-25 at 13:15, Matt Sergeant wrote:
With qmail, SMTP generally uses inetd, which is slow, or daemontools,
which is faster, but still slow, and more importantly, it anyway goes:
perl -> SMTP -> inetd -> qmail-smtpd -> qmail-inject.
So with going direct to qmail-inject, your
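Going direct means opening a pipe to qmail-inject instead of speaking SMTP at all; qmail-inject reads the message on stdin and takes recipients from the headers. A minimal sketch, assuming the stock /var/qmail path and with invented addresses:

```perl
use strict;
use warnings;

# Compose a message and pipe it to a local injector program.
sub inject_message {
    my ($inject, %m) = @_;
    my $msg = "From: $m{from}\nTo: $m{to}\nSubject: $m{subject}\n\n$m{body}\n";
    open my $pipe, '|-', $inject or die "can't fork $inject: $!";
    print $pipe $msg;
    close $pipe or die "$inject exited nonzero: $?";
    return $msg;
}

# Only attempt real delivery where qmail actually lives:
if (-x '/var/qmail/bin/qmail-inject') {
    inject_message('/var/qmail/bin/qmail-inject',
        from    => 'webmaster@example.com',
        to      => 'user@example.com',
        subject => 'no SMTP round-trip',
        body    => 'Delivered straight into the local queue.');
}
```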
Paul Mineiro wrote:
right. i probably should've mentioned earlier that CGAT x 5 is
really fast in both mod_perl and command line.
if anybody wants my actual $seq data, please let me know.
i neglected to mention something big: the production version is
identical but using perl
Stas Bekman [EMAIL PROTECTED] writes:
I even have a name for the project: Speedy Code Habits :)
The point is that I want to develop a coding style which tries hard to
do early premature optimizations.
I disagree with the POV you seem to be taking wrt write-time
optimizations. IMO,
Rob Mueller (fastmail) wrote:
I recently had a similar problem. A regex that worked fine in sample code
was a dog in the web-server code. It only happened with really long strings.
I tracked down the problem to this from the 'perlre' manpage.
WARNING: Once Perl sees that you need one of
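That warning is presumably the one about $&, $` and $': once any of them appears anywhere in the program, every pattern match in every module pays for copying the matched string, which is exactly the kind of cost that only shows up on really long strings. The @- / @+ offset arrays recover the same text without the program-wide penalty; a small sketch:

```perl
use strict;
use warnings;

my $line = "user=stas";
if ($line =~ /=(\w+)/) {
    # substr with @-/@+ gives what $& would, without making perl
    # save a copy of the match for every regex in the program:
    my $matched = substr $line, $-[0], $+[0] - $-[0];
    print "$matched\n";   # prints "=stas"
}
```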
I suppose it depends on what you want out of testing.
Frequently, unit testing is OK in simple applications. But in an
application whose job it is to communicate with a mainframe or back-end
databases, frequently the tests you might perform are based on some
previous persistent state of the
Gunther Birznieks writes:
the database to perform a test suite, this can get time consuming and
entails a lot of infrastructural overhead.
We haven't found this to be the case. All our database operations are
programmed. We install the database software with an RPM, run a
program to build
On Sat, 26 Jan 2002, Gunther Birznieks wrote:
I agree that testing is great, but I think it is quite hard in practice.
Also, I don't think programmers should be the main people writing tests
for their own code. It is OK for programmers to write their own tests, but
frequently it is the user
Joe Schaefer wrote:
mod_perl specific examples from the guide/book ($r->args vs
Apache::Request::param, etc)
Well, I've complained about that one before, and since the
guide's text hasn't changed yet I'll try saying it again:
Apache::Request::param() is FASTER THAN Apache::args(),
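The difference is easy to picture: $r->args in list context re-splits the query string in pure Perl on every call, roughly like the simplified sketch below (no %XX unescaping, '&' separators only), while Apache::Request::param() parses once in C, caches the result, and also covers POST bodies:

```perl
use strict;
use warnings;

# Roughly what $r->args does in list context (deliberately simplified).
sub parse_args {
    my $qs = shift;
    return map { (split /=/, $_, 2) } split /&/, $qs;
}

my %args = parse_args('user=stas&page=2');
print "$args{user} $args{page}\n";   # prints "stas 2"
```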
Gunther Birznieks writes:
the database to perform a test suite, this can get time consuming
and
entails a lot of infrastructural overhead.
We haven't found this to be the case. All our database operations are
programmed. We install the database software with an RPM, run a
program to
Perrin Harkins wrote:
The point is that I want to develop a coding style which tries hard to
do early premature optimizations.
We've talked about this kind of thing before. My opinion is still the same
as it was: low-level speed optimization before you have a working system is
a waste of
David Wheeler wrote:
Hi All,
A big debate is raging on the Bricolage development list WRT CVS
configuration and application testing.
http://www.geocrawler.com/mail/thread.php3?subject=%5BBricolage-Devel%5D+More+on+Releases&list=15308
It leads me to a question about testing. Bricolage
On Sat, 26 Jan 2002 00:23:40 -0500
Perrin Harkins [EMAIL PROTECTED] wrote:
But what about the actual data? In order to test my $product->name()
method, I need to know what the product name is in the database. That's
the hard part: writing the big test data script to run every time you
want
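One way out of the "big test data script" is to have each test file load its own tiny, known fixture before asserting, so nothing depends on leftover database state. Everything below (the class, the column names) is invented for the sketch:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Test::More tests => 2;

# Invented fixture: in real life this would be created through the
# application's own API (or SQL) at the top of the test file.
my %fixture = ( id => 1, name => 'Widget' );

my $product = My::Product->load(%fixture);
is $product->name, 'Widget', 'name matches the fixture we just loaded';
is $product->id,   1,        'id matches too';

# Stand-in class so the sketch is self-contained.
package My::Product;
sub load { my ($class, %row) = @_; bless {%row}, $class }
sub name { $_[0]{name} }
sub id   { $_[0]{id} }
```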
stas        02/01/25 00:17:58

  Modified:    t/apache .cvsignore
  Log:
  - ignore file adjst.

  Revision  Changes    Path
  1.4       +2 -0      modperl-2.0/t/apache/.cvsignore

  Index: .cvsignore
  ===================================================================
  RCS file: