Bertrand Baesjou wrote:
> Hi,

> I am trying to read data from a file, which I do with the "while (<FILE>) { $line ... }" construction.
>
> However, with files roughly bigger than 430MB it seems to crash the script :S The syntax seems fine (perl -wc -> syntax OK).

How does your script crash? What are the symptoms?

> I was thinking that maybe it was overflowing a 32-bit counter (but that would be 536MB, right?). Can anybody offer another solution to work with such large files and Perl?

People have read files of several gigabytes with Perl. The problem is more
likely to lie with what you do with the data once you have read it. To prove
this for yourself, run this code against the same file:

my $lines = 0;
open FILE, '<', 'yourfile.dat' or die "Cannot open file: $!";  # substitute your real file name
while (<FILE>) {
  ++$lines;
}
close FILE;
print "$lines\n";

and I am pretty sure that won't crash.

Then try to simplify your code by removing stuff from the loop until the problem
goes away. The last thing you removed contains the cause of the crash.
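
For what it is worth, a very common cause of this kind of crash is building up all of the data in memory inside the loop, rather than the reading itself. I don't know what your loop actually does, so the following is only a hypothetical illustration of the difference:

# Memory-hungry: keeps every line of the 430MB file in memory at once
my @all_lines;
while (<FILE>) {
  push @all_lines, $_;
}

# Line-at-a-time: only the current line is held in memory
while (my $line = <FILE>) {
  chomp $line;
  # process $line here; it is discarded on the next iteration
}

If your real loop stores each line in an array, a hash or a growing string, the memory use will climb with the size of the file, and rewriting it to process and discard each line as you go should cure the crash.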

If you need to post again please give us comprehensive details of the crash.

HTH,

Rob
