Hello all. I need to read through a large (150 MB) text file line by
line. Does anyone know how to do this without my process swelling to
300 megs?
I have not been following the list, so sorry if this question has
recently come up. I did not find it answered in the archives.
Thanks,
Brian
> You should be using something like
>
> open(FILE, $file) or die "$!\n";
> while (<FILE>) {
>     ## do something with $_
> }
> close FILE;
> __END__
This is what I am doing, but before any of the file is processed, the
whole text file is moved into memory. The only solution I can think of
is to break the file into smaller pieces.
It appears the problem was using the foreach statement instead of while.
I have not tested this extensively, but with foreach the whole text
file (or output of a pipe) is read into memory before the loop starts,
whereas with while (and probably for) each line is processed as it is
read.
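A minimal sketch of the difference described above (the file name and
loop body are just placeholders):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical file name, for illustration only.
my $file = 'big.txt';

open(my $fh, '<', $file) or die "Can't open $file: $!\n";

# In a while condition, <$fh> is evaluated in scalar context and
# returns one line per iteration, so only the current line is held
# in memory -- a 150 MB file stays cheap to scan.
while (my $line = <$fh>) {
    chomp $line;
    # process $line here
}
close $fh;

# By contrast, foreach evaluates <$fh> in list context, which slurps
# every remaining line into a list before the first iteration runs --
# that is what balloons the process:
#
#   foreach my $line (<$fh>) { ... }   # whole file read up front
```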
Thanks for all your help,
Brian