Hi,

I want to parse large log files (several GB each),

and I am reading 2-3 such files into a hash of arrays.

But the hash becomes very big and the script runs out of memory.

What other approaches can I take?


Example code:

open my $INFO, '<', $file or die "Cannot open $file: $!\n";
while (<$INFO>) {
    my (undef, undef, undef, $time, $cli_ip, $ser_ip, undef, $id,
        undef) = split /\|/;
    push @{ $time_table{"$cli_ip|$id"} }, $time;
}
close $INFO;
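For context, here is a sketch of one way to cut the memory use: instead of keeping one array ref per key (each Perl array carries noticeable per-element and bookkeeping overhead), append the times to a single string per key and split only when a key is actually needed. The field layout and the sample lines below are assumptions based on the split() call above.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my %time_table;

while (my $line = <DATA>) {
    chomp $line;
    # Same field positions as in the original split call.
    my (undef, undef, undef, $time, $cli_ip, $ser_ip, undef, $id) =
        split /\|/, $line;
    # One growing string per key instead of an array of scalars.
    $time_table{"$cli_ip|$id"} .= "$time,";
}

# Recover the list for a key only on demand.
my @times = split /,/, $time_table{"10.0.0.1|42"};
print scalar(@times), "\n";    # prints 2 for the sample data

__DATA__
a|b|c|12:00:01|10.0.0.1|10.0.0.9|x|42|y
a|b|c|12:00:05|10.0.0.1|10.0.0.9|x|42|y
```

If even the packed strings do not fit, the usual next step is to spill the hash to disk with a tied DBM file (see perltie and the DB_File module), or to process one input file at a time and merge the per-file results afterwards.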


In the above code $file is very big (several GB), so I am running out
of memory!

