I'm having a memory problem with Perl in Panther that
I didn't have in Jaguar.  When I try to read a large
text file using the line input operator (<>), I get an
"out of memory" error:

*** malloc: vm_allocate(size=8421376) failed (error code=3)
*** malloc[5576]: error: Can't allocate region
Out of memory!

The strange thing is that I'm not slurping up the whole
file; I'm reading it line by line. Yet Perl seems to pull
the whole thing into memory before it even processes the
first line. Even so, the file isn't large enough to exhaust
physical and virtual memory. I know it isn't a matter of
the wrong line-ending character, because the same script
runs fine in Jaguar and processes the same text file line
by line without a problem. Here's the structure of my code:

open (FILE1, "file1.txt");   # 300MB Mac text file
$/ = "\r";
foreach my $line (<FILE1>) {
    chomp $line;
    # (do something...)
}
close FILE1;

open (FILE2, "file2.txt");   # 300MB Mac text file
$/ = "\r";
foreach my $line (<FILE2>) {
    chomp $line;
    # (do something)
}
close FILE2;

Here's how top displays Perl's memory usage after
opening the first 300MB file, on a machine with 2 gigs
of RAM and a hundred gigs of free HD space:

  PID COMMAND      %CPU     TIME   #TH #PRTS #MREGS  RPRVT   RSHRD  RSIZE  VSIZE
 5576 perl        63.2%  2:56.60     1    13    364  1.38G-  1.37M  397M+  1.57G

After processing all the lines of the first file (during
which memory usage increases only slightly), the script
tries to read the second file and dies with the
out-of-memory error before processing its first line.

Is this a bug in Perl 5.8.1?  Is there a way to force
Perl to not slurp up the whole file at once, or at
least to release the memory used up by the first file
before reading the second?
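
In case it matters, I've been wondering whether the foreach
loop itself is the culprit, since I understand foreach
evaluates the filehandle in list context and so reads every
line up front, whereas a while loop reads one record at a
time. Something like this is what I had in mind (same file
and filehandle as above, just a sketch):

open (FILE1, "file1.txt") or die "can't open file1.txt: $!";
$/ = "\r";                    # Mac (CR) line endings, as before
while (my $line = <FILE1>) {  # scalar context: one record per iteration
    chomp $line;
    # (do something...)
}
close FILE1;

Would switching to while be the right fix here, or is
something else eating the memory under Panther?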

Thanks!

XB


        
                