If you are reading the entire file and stuffing it into a hash (or
other data structure, even after transformations), it still consumes
memory proportional to the file size (or at least to the final size of
$key and $value in your example). When Michael said to read one line at
a time, I think he also meant to process each line at a time, giving
Perl the opportunity to let go of the bits it doesn't need anymore.
For example,
my ($key, $value);
while (<FILE>) {
    ($key, $value) = $_ =~ /(\w+): (.*)/;
    print "key: $key, value: $value\n";
}
does not have this problem, because as soon as the print is done, the
objects in $_, $key, and $value can be replaced by others, keeping few
objects in memory at any time. The same goes for:
my $maxkey = '';
my $sum = 0;
while (<FILE>) {
    my ($key, $value) = $_ =~ /(\w+): (.*)/;
    $sum += $value;
    $maxkey = $key if $maxkey lt $key;
}
because the information is being aggregated into a few variables (in
this case, $maxkey and $sum). (Of course, the two examples above may
not resemble a real-world application, but they illustrate the issue.)
If your algorithm needs the hash constructed by
    $hash{$key} = $value;
to hold considerable amounts of the info provided in the original
file, you need the memory anyway. Some simple algorithms don't scale
well, which becomes apparent when the input data grows at a fast pace.
Maybe you should consider alternative approaches that break your
calculations into manageable bits that fit in the computer's memory.
If it runs on another OS version, the differences may be blamed on
different memory consumption between those versions, differences
between the compiled Perl interpreters, and whatever else. Try to look
at the patterns of memory requirements of the Perl script on the two
machines. ETL scripts can be hogs, and database issues can also come
into play.
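If the hash really must hold most of the file's contents, one option
(my suggestion, not something already in this thread) is to tie the
hash to an on-disk database with the standard DB_File module, so the
data lives on disk rather than in RAM. A rough sketch, assuming the
same "key: value" line format as the examples above and a hypothetical
file name data.db:

use strict;
use warnings;
use Fcntl;     # for O_RDWR, O_CREAT
use DB_File;   # ties the hash to a Berkeley DB file on disk

my %hash;
tie %hash, 'DB_File', 'data.db', O_RDWR|O_CREAT, 0644, $DB_HASH
    or die "Cannot tie hash to data.db: $!";

open(FILE, "file.txt") or die "Cannot open file.txt: $!";
while (<FILE>) {
    my ($key, $value) = $_ =~ /(\w+): (.*)/ or next;
    $hash{$key} = $value;   # stored on disk, not kept in memory
}
close FILE;
untie %hash;

Access is slower than an in-memory hash, of course, but the memory
footprint stays roughly constant regardless of the file size.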
Regards,
Adriano.
On 7/8/05, Eduardo Mattos Ramos Murad [EMAIL PROTECTED] wrote:
Michael,
I'm not attempting to read in the entire file.
In an older version of my AIX O.S. this "Out Of Memory" message wasn't
occurring, with the same machine and the same program.
It happens when:
open (FILE, "file.txt");
while (<FILE>) {
    # ...
    $hash{$key} = $value;
}
close FILE;
De: Michael G Schwern via RT [mailto:[EMAIL PROTECTED]
[EMAIL PROTECTED] - Fri Mar 19 05:13:55 2004]:
Everytime I try to work with Large Files I get the message Out Of
Memory, is
Are you, perhaps, attempting to read in the entire file into memory at
once? Something like: