memory issues reading large files

2002-02-07 Thread Brian Hayes

Hello all.  I need to read through a large (150 MB) text file line by
line.  Does anyone know how to do this without my process swelling to
300 megs?
 
I have not been following the list, so sorry if this question has
recently come up.  I did not find it answered in the archives.
 
Thanks,
Brian 

-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]




Re: memory issues reading large files

2002-02-07 Thread Brian Hayes

> You should be using something like
> 
> open(FILE, $file) or die "$!\n";
> while (<FILE>) {
>     ## do something
> }
> close FILE;
> __END__

This is what I am doing, but the whole text file is read into memory
before any of it is processed.  The only solution I can think of is to
break the text file into smaller parts and read through each one, but I
would like to avoid this.  I was hoping someone knew how Perl handles
memory here and how to keep it from reading the whole file at once.

My main concern is scalability.  This file will continue to grow daily,
and in a year I don't want my app taking up a gig of mem.

If nothing can be done about reading a file, what about piping output
from another program?  I use artsases (anyone familiar with the arts++
package?) to generate the 150 MB file, but I could easily pipe the
results to my Perl program instead of building the file.  As far as
memory and filehandles are concerned, what differences are there between
reading from a text file vs. another program?
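For what it's worth, a pipe is opened almost exactly like a file, and it
streams the same way.  Here is a minimal sketch; the `printf` command
stands in for the real artsases invocation, whose flags I don't know:

```perl
use strict;
use warnings;

# List-form pipe open ('-|'): fork the command and read its stdout
# through a filehandle, one line at a time, just like a file on disk.
# ('printf' is only a stand-in for the real artsases command.)
open(my $pipe, '-|', 'printf', "line1\nline2\nline3\n")
    or die "can't start command: $!\n";

my $count = 0;
while (my $line = <$pipe>) {
    chomp $line;
    $count++;    # process one line; only this line is in memory
}
close $pipe;
print "read $count lines\n";
```

As far as memory goes there should be no difference: in both cases
`while (<$fh>)` pulls one line per iteration, whether the filehandle
points at a file or at another process.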

Thanks for your help,
Brian






Re: memory issues reading large files

2002-02-07 Thread Brian Hayes

It appears the problem was using a foreach statement instead of while.
I have not tested this extensively, but with foreach the whole text
file (or output of a pipe) is read into memory before the loop starts,
whereas with while each line is processed as it is read.  (Perl's for
is a synonym for foreach, so it behaves the same way.)
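The difference above comes down to context: foreach evaluates the
readline operator in list context (slurping everything), while
evaluates it in scalar context (one line per pass).  A small sketch,
using a temporary file as a stand-in for the real 150 MB one:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Build a small stand-in for the large file.
my ($out, $file) = tempfile();
print $out "line $_\n" for 1 .. 5;
close $out;

# foreach evaluates <$in> in LIST context: the entire file is read
# into a list before the first iteration runs -- hence the doubled
# memory footprint.
open(my $in, '<', $file) or die "$!\n";
my $foreach_count = 0;
foreach my $line (<$in>) {    # whole file in memory here
    $foreach_count++;
}
close $in;

# while evaluates <$in> in SCALAR context: one line per iteration,
# so memory stays flat no matter how large the file grows.
open($in, '<', $file) or die "$!\n";
my $while_count = 0;
while (my $line = <$in>) {    # only the current line in memory
    $while_count++;
}
close $in;

print "foreach: $foreach_count, while: $while_count\n";
```

Both loops see the same lines; only the memory behavior differs.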

Thanks for all your help,
Brian
