On Jun 14, 2006, at 6:40 PM, Hardly Armchair wrote:

Hello All,

I was wondering if it is more efficient (in terms of speed and processor load) to have two different scripts of approximately the same size called to handle two different functions, or to have one large script handle all cgi functions using subroutines. Or perhaps these situations are equivalent.

I asked a similar question a few months back ("How big is too big?").

After learning a lot from the responses, and from where they led me, I started looking more closely at CGI::Application.

The general theory I take from this framework (as it applies to your question) is that, to help with managing subroutines, you should create modules that group together subroutines performing related tasks. No more than ten subroutines per module was the rule of thumb, as I recall. A rough sketch of the idea follows below.
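For anyone who hasn't seen CGI::Application, here's a minimal sketch of how it groups related subroutines ("run modes") into a module. The package name, run modes, and method bodies are just made-up examples for illustration, not anything from the framework itself:

    package MyApp;
    use strict;
    use warnings;
    use base 'CGI::Application';

    # Map run modes (chosen via the 'rm' CGI parameter by default)
    # to the methods that handle them.
    sub setup {
        my $self = shift;
        $self->start_mode('list');
        $self->run_modes(
            list => 'show_list',
            edit => 'edit_item',
        );
    }

    # Each run mode returns its page output as a string.
    sub show_list {
        my $self = shift;
        return "<html><body>item list goes here</body></html>";
    }

    sub edit_item {
        my $self = shift;
        my $q = $self->query;  # the underlying CGI.pm query object
        return "<html><body>editing item " . $q->param('id') . "</body></html>";
    }

    1;

The instance script that the web server actually runs stays tiny:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use MyApp;

    MyApp->new->run;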

Someone here mentioned that a Perl/CGI script that contains 1000 lines is probably about as big as you'd want one to get. The script I'm refactoring to use CGI::Application is now over 10,000 lines (with comments). It still performs pretty well, but it never sees a huge number of requests.

I completely agree with Ovid's comment, "do NOT worry about performance unless you have an extremely good reason to do so." That's one reason my script got so big. Performance still is not an issue for me, but management is becoming one.

The "One big one versus many small ones" question seems best answered by personal preference, up to a point. For me, management was getting to be a pain.

Now I'd strongly recommend CGI::Application to anyone working on a Perl/CGI app that will grow bigger than that previously suggested 1000-line maximum, or that needs features easily provided by the framework and its plug-ins.

Kindest Regards,

--
Bill Stephenson
417-546-8390

