On 25/11/2002 23:52:21 Jacob Schroeder wrote:

>I'm new to perl, but I've been working on a script that will parse a large
>amount of text and, while it's going through it, store data in a few
>different hashes I have (a one-dimensional hash and two two-dimensional
>hashes).  Once I read in all the data, I then sort the hashes and output
>all of this to a set of log files.
>

Pay attention, folks (Schwern, Vladi)!

>
>Here's the main chunk of my code that starts the text coming in...
>
># Build up the command string appropriately, depending on what options
># have been set.
>my $command = ($rlog_module ne "") ? "cvs -n -d $cvsdir rlog $rlog_module"
>                                   : "cvs log";
>print "Executing \"$command\"\n" if $debug;
>
>open (CVSLOG, "$command |") || die "Couldn't execute \"$command\": $!";
>while (<CVSLOG>)
>{
>....

    You are adding data to hashes as you read. Hashes live in memory, so
    if the log is huge, the script will use a huge amount of memory.
>....
>}
>
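If the in-memory hashes are the problem, one way out is to stream each parsed record to disk as you go and let an external sort do the ordering at the end. A minimal sketch of that idea (the file names and the parse pattern below are made up for illustration, not taken from the original script):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Stream each parsed record to a temporary file instead of stuffing it
# into an in-memory hash.  File names here are hypothetical.
open my $tmp, '>', 'records.tmp' or die "Can't write records.tmp: $!";

while (my $line = <STDIN>) {
    # Parse one log line; emit a tab-separated record on disk.
    print $tmp "$1\t$2\n" if $line =~ /^(\S+)\s+(\S+)/;
}
close $tmp or die "Can't close records.tmp: $!";

# sort(1) spills to its own temp files, so Perl's memory footprint
# stays flat no matter how big the log is.
system('sort', '-o', 'records.sorted', 'records.tmp') == 0
    or die "sort failed: $?";
```

The trade-off is that you give up random access while reading, but for a
parse-then-sort-then-report job like this one, a flat record stream is
usually all you need.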

Perl will happily consume all available memory in the system.
If your process is capped at 20 MB, that's far too small to be a physical
limit. Are you running on a Unix-like OS with something like ulimit in
effect?
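Conversely, you can impose such a limit on purpose before starting the script, so a runaway process fails fast instead of dragging the whole machine into swap. A sketch (the script name is hypothetical, and the exact `ulimit` flags vary by shell and OS):

```shell
# Cap virtual memory for this shell and its children at ~200 MB
# (ulimit -v takes kilobytes).  A perl process that exceeds the cap
# dies with "Out of memory!" instead of exhausting the system.
ulimit -v 204800
# perl cvs_log_parser.pl    # hypothetical script name
```

Run `ulimit -a` first to see what limits are already in effect; an
inherited limit from a login script would explain a mysterious 20 MB cap.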

--
Csaba Ráduly, Software Engineer                           Sophos Anti-Virus
email: [EMAIL PROTECTED]                        http://www.sophos.com
US Support: +1 888 SOPHOS 9                     UK Support: +44 1235 559933

