I'm trying to optimize a script used for processing large text log files
(around 45MB).  I think I've got all the processing fairly well optimized,
but I'm wondering if there's anything I can do to speed up the initial
loading of the file.

Currently, I'm processing the file one line at a time with a
"while (<FILE>)" loop.  I push the important lines into an array so that
all further processing happens in memory, but that initial pass over the
file is rather time-consuming.  Is there a more efficient way to work
with large text files?
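For reference, here is a minimal sketch of the pattern described above. The "important line" filter is a hypothetical regex match on "ERROR" (the post doesn't say what the real criterion is), and the demo reads from an in-memory string rather than a real 45MB log so it is self-contained:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Demo input standing in for the real log file; in practice this would be
# open my $fh, '<', $path or die "open: $!";
my $log = "ok 1\nERROR disk full\nok 2\nERROR timeout\n";
open my $fh, '<', \$log or die "open: $!";   # in-memory filehandle

# Read one line at a time, keeping only the "important" lines in memory.
my @important;
while (my $line = <$fh>) {
    chomp $line;
    push @important, $line if $line =~ /ERROR/;   # hypothetical filter
}
close $fh;

print scalar(@important), "\n";   # prints 2 for the demo input
```

Line-at-a-time reading like this keeps memory use low; the cost is mostly I/O plus the per-line filter, so any speedup has to come from reading in larger chunks or doing less work per line.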

Thanks.

Jason

