> You should be using something like
> 
> open(FILE, $file) or die "$!\n";
> while (<FILE>) {
>     ## do something
> }
> close FILE;
> __END__

This is what I am doing, but before any of the file is processed, the
whole text file is moved into memory.  The only solution I can think of
is to break the text file apart and read through each smaller piece, but
I would like to avoid that.  I was hoping someone knew how Perl interacts
with memory and how to trick it into not reading the whole file at once.
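
For reference, here is roughly what my read loop looks like at the moment;
the filename and the per-line work below are placeholders rather than my
actual script:

    use strict;
    use warnings;

    my $file = 'flows.txt';   # placeholder name for the 150 MB file

    # Three-arg open with a lexical filehandle.
    open(my $fh, '<', $file) or die "Cannot open $file: $!\n";

    # In scalar context, <$fh> is supposed to return one line per call,
    # so only the current line should be held in memory, unlike
    # "my @lines = <$fh>" or "for (<$fh>)", which read the whole file
    # into a list first.  Even so, my process keeps growing.
    while (my $line = <$fh>) {
        chomp $line;
        ## do something with $line
    }

    close $fh or die "Cannot close $file: $!\n";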

My main concern is scalability.  This file will continue to grow daily,
and in a year I don't want my app taking up a gigabyte of memory.

If nothing can be done about reading from a file, what about piping output
from another program?  I use artsases (is anyone familiar with the arts++
package?) to generate the 150 MB file, but I could easily pipe the
results to my Perl program instead of building the file.  As far as
memory and filehandles are concerned, what differences are there between
reading from a text file vs. reading from another program?
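
Something like the following piped open is what I have in mind; the
artsases invocation and its flag are placeholders, since the exact command
line does not matter here:

    use strict;
    use warnings;

    # Placeholder command: however artsases would be invoked to write
    # the flow records to stdout instead of building the 150 MB file.
    my @cmd = ('artsases', '--placeholder-flag');

    # '-|' opens a read pipe from the command's standard output, so the
    # data never has to land on disk at all.
    open(my $pipe, '-|', @cmd) or die "Cannot run @cmd: $!\n";

    while (my $line = <$pipe>) {
        chomp $line;
        ## same per-line processing as before
    }

    # For a piped open, close() returns false if the child exited
    # non-zero; its status is in $?.
    close $pipe or warn "@cmd exited with status $?\n";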

Thanks for your help,
Brian

