Thanks for the suggestions. I found out that the user (my boss!) was trying
to use this for a really large file... 

Is there a reasonable check for memory I can use to see what's available
compared to the file size?
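
Something like this might do as a pre-flight check -- untested, and it
assumes the same $indir/$mask variables as the merge script; the "free
memory" side is platform-specific, so here it's just a threshold you
pick yourself:

```perl
# Untested sketch: total the bytes about to be merged before starting.
opendir(INDIR, $indir) or die "Can't open $indir: $!";
my $total = 0;
$total += -s "$indir$_" for grep /$mask/i, readdir INDIR;
closedir(INDIR);

# Querying free memory portably from Perl is awkward; on Solaris one
# crude route is shelling out to vmstat and reading its "free" column.
# Here I just compare against a made-up hard limit (~1 GB):
my $limit = 1_000_000_000;
die "Refusing to merge: $total bytes exceeds $limit byte limit\n"
    if $total > $limit;
```

Though if the copy loop never holds more than a line (or a buffer) in
memory at a time, the check becomes mostly unnecessary.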

Rob

-----Original Message-----
From: Gary Stainburn [mailto:[EMAIL PROTECTED]]
Sent: Thursday, October 03, 2002 11:25 AM
To: Janek Schleicher; [EMAIL PROTECTED]
Subject: Re: Out of memory on file merge


On Thursday 03 Oct 2002 5:20 pm, Janek Schleicher wrote:
> Rob Das wrote at Thu, 03 Oct 2002 15:01:31 +0200:
> > Hi all:
> >
> > I have a script that is merging many files into a new file, by reading
> > one at a time and adding to the new one. The error I received was "Out of
> > memory during request for 1016 bytes, total sbrk() is 1186891848 bytes!"
> > (I did not add the "!"...). This is on a Sun box with 4 Gig, typically
> > between 1 and 2 Gig free. Is there anything I can do to reduce memory
> > usage? None of my error messages were displayed, so I'm not even sure
> > which statement caused the error (I check for errors on all but the
> > "closedir" and
> > "close(INFILE)"). I should mention that I'm processing 100,000 files of
> > an average of about 6 K each. (Plan B is to reduce the number of files
> > being processed.) Here is the code (stripped of the error checking):
> >
> > opendir(INDIR, $indir);
> > @logfile=grep(/$mask/i, readdir INDIR);
> > closedir(INDIR);
>
> Here the whole INDIR is read at one time.
> I assume that the file still is little enough,
> so the problem must be somewhere else.
>
> > $nbrfiles=@logfile; # number of files matching mask
> > open(OUTFILE, ">$outdir$outfile");
> > for ( $ctr=0; $ctr<$nbrfiles; $ctr++ ) {
> >     open(INFILE, "$indir$logfile[$ctr]");
> >     print OUTFILE <INFILE>;
>
> That's only a shorthand version for
>
> my @temp = (<INFILE>); # reads the whole file at once !!
> print OUTFILE @temp;
>
> >     close(INFILE);
> > }
> > close(OUTFILE);
>
> I would suggest a line by line copying instead:
>
> foreach my $lfile (@logfile) {
>    open INFILE, "$indir$lfile" or die "Can't open $lfile: $!";
>    while (<INFILE>) {
>       print OUTFILE;
>    }
>    close INFILE;
> }

Also, I see no reason from this code to read the dir into an array, nor do I
see the need to loop over that array.  Why not simply loop directly over
readdir?  Something like (not tested):

opendir(INDIR, $indir);
open(OUTFILE, ">$outdir$outfile");
foreach (grep(/$mask/i, readdir INDIR)) {
  open(INFILE, "$indir$_");
  print OUTFILE while (<INFILE>);
  close(INFILE);
} 
close(OUTFILE);
closedir(INDIR);
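
One caveat with line-by-line copying: a "line" is only as small as its
newlines, so a huge file with few or no newlines gets slurped anyway.
Reading fixed-size chunks with read() bounds memory no matter what the
input looks like. Untested sketch, same variable names as above:

```perl
# Untested: copy in 64K chunks so memory stays bounded even if an
# input file contains no newlines at all.
opendir(INDIR, $indir) or die "opendir $indir: $!";
open(OUTFILE, ">$outdir$outfile") or die "open $outdir$outfile: $!";
foreach my $file (grep /$mask/i, readdir INDIR) {
    open(INFILE, "$indir$file") or die "open $file: $!";
    my $buf;
    print OUTFILE $buf while read(INFILE, $buf, 65536);
    close(INFILE);
}
close(OUTFILE);
closedir(INDIR);
```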


>
>
> Greetings,
> Janek

-- 
Gary Stainburn
 
This email does not contain private or confidential material as it
may be snooped on by interested government parties for unknown
and undisclosed purposes - Regulation of Investigatory Powers Act, 2000     

