On Tue, Jan 22, 2008 at 05:52:35PM -0500, Chas. Owens wrote:
> On Jan 22, 2008 2:58 PM, lerameur <[EMAIL PROTECTED]> wrote:
> > Hello,
> >
> > I wrote a short Perl script (65 lines) simply to read some log file
> > timestamps and take the average. The script takes 4 minutes to run
> > and goes through about 8 million lines. I would like to know if I
> > can make it run faster. Why? If I use the command 'wc -l filename',
> > I get the number of lines in about a minute, that is three minutes
> > less than the small script. Am I right in thinking the script can be
> > reprogrammed so it can process the file faster?
> snip
>
> Since we can't see your code, we can't tell if it can be done faster.
> Please note that wc -l is doing something that in Perl could be
> accomplished in 1 line:
>
>     perl -nle 'END { print $c } $c++' file
>
> or the slightly more efficient
>
>     perl -ne 'BEGIN { $/ = \4196 } END { print "$c\n" } $c += tr/\n//' file
>
> so a 65-line program is significantly more complex, and trying to
> compare the two is not a good idea. You should also make sure that you
> are not seeing the effects of caching when comparing the two programs.
Yes, give wc something to do and it will probably take a little longer.
For example, on my machine here

    wc -w big_file

takes about twice as long as

    perl -nle '$w++ while /\S+/g; END { print $w }' big_file

which just goes to prove ... er ... hmmm ... well, not very much, really.

--
Paul Johnson - [EMAIL PROTECTED]
http://www.pjcj.net