The files I am dealing with can be 42,000 MB and larger. I
tried fancier ways of processing them, with arrays and such, and blew out
my memory every time.
I finally had to re-write my entire script to process line by line,
holding almost nothing in memory. It is not pretty and not fancy, but it works.
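The shape of it is just a plain read-a-line, write-a-line loop, roughly like this (the file names and the per-line work below are placeholders, not my actual processing):

#!/usr/bin/perl
use strict;
use warnings;

# Only the current line (plus any running totals) is ever held in memory.
open my $in,  '<', 'huge_input.txt'  or die "Cannot open input: $!";
open my $out, '>', 'huge_output.txt' or die "Cannot open output: $!";

while (my $line = <$in>) {
    chomp $line;
    # per-line processing goes here
    print {$out} $line, "\n";
}

close $out;
close $in;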
[EMAIL PROTECTED] wrote:
Can anyone think of a less memory-intensive way of doing this
(besides chopping the file up into smaller chunks -- which is my next
step unless I receive a better option)?
The txt file is 335MB.
The column and row delimiter can change, but needs to be an
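A sketch of the "smaller chunks" idea mentioned above -- reading the file one fixed-size block at a time with read() -- where the 1 MB buffer size, file name, and byte tally are only illustrative:

use strict;
use warnings;

my @count;
my $buf;
open my $fh, '<', 'bigfile.txt' or die "Cannot open: $!";
binmode $fh;
while (read($fh, $buf, 1024 * 1024)) {     # 1 MB per read; size is arbitrary
    $count[ord $_]++ for split //, $buf;   # tally every byte in this block
}
close $fh;
printf "%3d = %d\n", $_, defined($count[$_]) ? $count[$_] : 0 for 0 .. 255;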
I don't see a benefit in reading the whole file into memory. Try this:
my $del = qr{\|\|~};    # row/column delimiter to strip (pipes escaped so they match literally)
binmode(STDIN);
s{$del}{}g, map($c[ord($_)]++, split(//, $_)) while (<STDIN>);
foreach (0..255) {
    printf "%3d = %d\n", $_, defined($c[$_]) ? $c[$_] : 0;
}
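Saved as, say, count.pl and fed the file on STDIN (perl count.pl < file.txt), this prints a usage count for each byte value 0-255 while only ever holding one line of the file in memory.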
Tobias
--- Senior Programmer Analyst --- WGO
[mailto:[EMAIL PROTECTED]]
Sent: Tuesday, June 01, 2004 11:21 AM
To: Steven Manross; [EMAIL PROTECTED]
Subject: RE: parsing large file for character usage