Rob Das wrote at Thu, 03 Oct 2002 15:01:31 +0200:

> Hi all:
> 
> I have a script that is merging many files into a new file, by reading one
> at a time and adding to the new one. The error I received was "Out of memory
> during request for 1016 bytes, total sbrk() is 1186891848 bytes!" (I did not
> add the "!"...). This is on a Sun box with 4 Gig, typically between 1 and 2
> Gig free. Is there anything I can do to reduce memory usage? None of my
> error messages were displayed, so I'm not even sure which statement caused
> the error (I check for errors on all but the "closedir" and
> "close(INFILE)"). I should mention that I'm processing 100,000 files of an
> average of about 6 K each. (Plan B is to reduce the number of files being
> processed.) Here is the code (stripped of the error checking):
> 
> opendir(INDIR, $indir);
> @logfile=grep(/$mask/i, readdir INDIR); 
> closedir(INDIR);

Here the whole directory listing of INDIR is read in at once.
I assume the resulting list of file names is still small enough,
so the problem must be somewhere else.
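
Not the culprit here, but for reference: readdir can also be consumed one
entry at a time (scalar context) in a while loop, so the full name list is
never held in memory. A rough sketch using your $indir and $mask:

opendir INDIR, $indir or die "Can't open $indir: $!";
while ( defined( my $name = readdir INDIR ) ) {   # one entry per iteration
    next unless $name =~ /$mask/i;
    # ... handle $name here ...
}
closedir INDIR;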

> $nbrfiles=@logfile; # number of files matching mask
> open(OUTFILE, ">$outdir$outfile");
> for ( $ctr=0; $ctr<$nbrfiles; $ctr++ ) {
>     open(INFILE, "$indir$logfile[$ctr]");
>     print OUTFILE <INFILE>;

That's just shorthand for

my @temp = (<INFILE>); # reads the whole file at once !!
print OUTFILE @temp;
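
The reason is context: print puts <INFILE> in list context, so it returns
every remaining line at once. In scalar context it would return only the
next line:

my @all_lines = <INFILE>;   # list context: whole file in memory
my $one_line  = <INFILE>;   # scalar context: only the next line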

>     close(INFILE);
> } 
> close(OUTFILE);

I would suggest copying line by line instead:

foreach my $lfile (@logfile) {
   open INFILE, "$indir$lfile" or die "Can't open $lfile: $!";
   while (<INFILE>) {   # only one line in memory at a time
      print OUTFILE;
   }
   close INFILE;
} 
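
If some of the log files can contain very long lines (or no newlines at
all), another option is a buffered, chunk-wise copy via the standard
File::Copy module; something along these lines should also keep memory
flat:

use File::Copy;

foreach my $lfile (@logfile) {
    # copy() reads and writes in fixed-size buffers, never the whole file;
    # the second argument may be an already-open filehandle
    copy( "$indir$lfile", \*OUTFILE )
        or die "Copy of $lfile failed: $!";
}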


Greetings,
Janek

