A lot depends on what "some task;" actually does.  If the task does not
accumulate data in memory on each iteration, then the script is simply taking
a long time.  If it does accumulate data, then you can easily run out of
memory.
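
For example (a rough sketch; the file name and the filter pattern are just
placeholders), a loop that only inspects each line runs in constant memory,
while one that stashes every line in a hash grows with the file:

open my $FILE, '<', 'Log_file.txt' or die "open failed: $!";

# Constant memory: each line is examined and then thrown away.
while (my $line = <$FILE>) {
    chomp $line;
    print "$line\n" if $line =~ /ERROR/;
}

# Memory grows with the file: every distinct line is kept around.
my %seen;
seek $FILE, 0, 0;    # rewind for a second pass
while (my $line = <$FILE>) {
    chomp $line;
    $seen{$line}++;
}

close $FILE;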

Also, a lot depends on the OS.  For instance, the default for filesystems in
AIX is (or at least used to be) to use a read-ahead buffer.  Filling that
buffer took priority over everything else, so even a simple wc -l bigfile
could eat all the memory and make it hard to ssh into the box (that was a
lot of fun to track down).

A good practice when working with really large files is to give the user some
form of feedback so they can see things aren't completely hung.  Try this:

my $i = 0;
while (<$FILE>) {
    chomp;
    some task;
    ...
} continue {
    # Report progress every 10,000 lines so you can tell the loop is alive.
    if ((++$i % 10_000) == 0) {
        print "handled $i records\n";
    }
}

This will print a message every 10,000 lines.  If the script does blow up, at
least you will know roughly how many records had been processed, and you may
be able to either split the file or restart from a saved position (see the
tell and seek functions).
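
Something along those lines might look like this (a rough sketch; the
checkpoint file name is an invention and error handling is minimal).  Every
10,000 records it saves the byte offset returned by tell, and on a rerun it
seeks back to that offset before resuming:

use strict;
use warnings;

my $checkpoint = 'Log_file.txt.offset';

open my $FILE, '<', 'Log_file.txt' or die "open failed: $!";

# Resume from the last saved offset, if there is one.
if (-e $checkpoint) {
    open my $CP, '<', $checkpoint or die "open failed: $!";
    my $offset = <$CP>;
    close $CP;
    seek $FILE, $offset, 0 if defined $offset;
}

my $i = 0;
while (<$FILE>) {
    chomp;
    # some task;
} continue {
    if ((++$i % 10_000) == 0) {
        print "handled $i records\n";    # counts restart from zero after a resume

        # Save the current byte offset so a rerun can pick up here.
        open my $CP, '>', $checkpoint or die "open failed: $!";
        print {$CP} tell($FILE);
        close $CP;
    }
}

close $FILE;
unlink $checkpoint;    # finished cleanly, no restart needed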

On Thu, May 24, 2018 at 11:00 AM SATYABRATA KARAN <satyabratakara...@gmail.com> wrote:

> Hello,
> Can anyone guide me on how to handle a huge file (>50GB) in Perl?
>
> I tried the open method this way, but it's hanging the terminal. I think it
> is an out-of-memory error internally.
>
> open my $FILE, "<", "Log_file.txt";
>
> while (<$FILE>) {
>  chomp;
>  some task;
>  ...
> }
>
> close $FILE;
>
>
> Thanks,
> Satya
>
