I have this code that loops through a series of files and, for each one, counts the number of unique IP addresses seen over the past 20 minutes, then rewrites the file's contents.
It seems pretty simple - each of the 5 $site files has at most 100 or so IP entries in it - many files have only 0-5 rows of data at any given time. So all in all, the 5 files are small. Yet, as I monitor my server's CPU and memory usage, top often shows this script as the #1 offender. It runs from cron every 10 minutes, all day long. Can anyone help me figure out why it is so bad in terms of CPU and memory according to "top"? Thanks!

#!/usr/local/bin/perl

my $DEBUG=0;
my $HOME=qq(/home/site);

use Date::Manip;
use CGI;

my $cgi = new CGI;
my $then=&ParseDate("20 minutes ago");

# Get list of all sites for which IP data is being logged
my @todo=`ls $HOME/www/cgi-bin/js-ip.*`;

while (my $site=shift(@todo)) {
    chomp $site;
    $site=~s/.*js-ip.//g;
    my $ipfile=qq($HOME/www/cgi-bin/js-ip.$site);

    open (JF, "$ipfile");
    my @data=<JF>;
    close JF;

    my %haveip=();
    my $new="";
    my $online=0;

    # Look at the IP data and count the number of unique IPs
    # Save only those which have appeared in the past 20 minutes
    while (my $row=shift(@data)) {
        chomp $row;
        my ($date, $ip)=split("-", $row);
        my $date1=&ParseDate($date);
        my $cmp=&Date_Cmp($date1, $then);
        if ($cmp>-1 && ! $haveip{$ip}) {
            $new.=qq($date-$ip\n);
            $haveip{$ip}=1;
            $online++;
        }
    }

    # The array is empty already, but what the hey
    @data=();

    # Write over the file with the latest IP data
    open (JF1, ">$ipfile");
    print JF1 qq($new);
    close JF1;

    # Write the online count for this site to file for display
    open (JC, ">$HOME/www/cgi-bin/online-now.$site");
    print JC qq($online);
    close JC;
}
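
For context, each js-ip.$site file holds one "date-ip" row per visit; the date part can't itself contain a hyphen, since each row gets split on "-". A row looks roughly like this (the date format and IP here are made up for illustration):

    Apr 2 2009 10:05:00-10.1.2.3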