>    That "last;" statement makes certain of it. So, the second
>email should be coming from a second instance of this script.
>You need some mechanism to allow each instance of the script to
>communicate with past instances or you'll need a better test
>than elapsed time.

>HTH,

>Charles K. Clarkson
>--
>Mobile Homes Specialist
>254 968-8328


--
BEGIN DEREK's RESPONSE

Yes, I know, and this is proving to be a mind-boggler.  Thanks...
As Tim suggested in a previous email:

* What about using a hash to keep track of which strings you've already
gotten?

* What about keeping the line number and starting your reading at the
next line? (For example, use the $. variable, copy it to $currentLine
at the end of the current read, then do a "next if $. <=
$currentLine;" at the beginning.)
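Both of Tim's ideas can be sketched together. Note this is illustrative only: the state-file name "lastline.dat", the log path, and the processing stub are my own assumptions, not part of the original script.

```perl
#!/usr/bin/perl
# Sketch of both suggestions: a %seen hash for duplicate strings, and a
# saved line-number bookmark so a later instance can resume reading.
use strict;
use warnings;

my %seen;          # strings already handled during this run
my $bookmark = 0;  # last line number handled by a previous run

# Restore the bookmark if an earlier instance of the script saved one.
if (open my $state, '<', 'lastline.dat') {
    $bookmark = <$state>;
    chomp $bookmark if defined $bookmark;
    $bookmark ||= 0;
    close $state;
}

open my $log, '<', '/asmcat/logs/samfs_dump.log'
    or die "could not open log: $!";
while (my $line = <$log>) {
    next if $. <= $bookmark;   # skip lines a previous run already read
    next if $seen{$line}++;    # skip strings already seen this run
    # ... process $line here ...
}
my $last = $.;                 # $. holds the number of the last line read
close $log;

# Save the bookmark for the next instance of the script.
open my $out, '>', 'lastline.dat' or die "could not save state: $!";
print {$out} "$last\n";
close $out;
```

The %seen hash handles duplicates within one run; the bookmark file is what lets separate instances of the script communicate, as Charles pointed out above.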

I was attempting to build a hash of arrays, but the data came out
garbled; what I was looking for was not there.  Using a hash, as Tim
suggested, makes sense to me as a way of keeping track of what has
already been read.  I need to do this for two different log files, so I
will start with the more critical one.
thank you
derek


Here is my code:

#!/usr/bin/perl
require 5.6.1;

#use strict;
use warnings;
use diagnostics;
use File::Copy;
use MIME::Lite ();
use Logfile::Rotate ();
use Readonly;
Readonly::Scalar my $FIVE_MINUTES => '300';
$ENV{"PATH"}    = qq(/asmcat/logs:/usr/bin:/usr/local/log);

my $now         = time();
my $sfsdlog     = qq(/asmcat/logs/samfs_dump.log);
my $fujiO       = qq(/fuji/original);
my $fujiC       = qq(/fuji/clinical);
my $hrtlab      = qq(/heartlab);
my $lv          = qq(/lanvision);
my (%strngs,$s) = ();
my @lines       = ();

sub dateme {
    my ($month,$day,$year) = (localtime)[4,3,5];
    sprintf ("%02d/%02d/%02d", $month+1,$day,$year % 100);
}
my $date = dateme();

open (LOG, "+<$sfsdlog") || die "could not open file: $sfsdlog $!";
while (<LOG>) {
    unless (/samfsdump:/) {
        push @lines, $_;
    }
}
close LOG;

#print @lines;

    foreach my $i (@lines) {
       push @{ $strngs{$i} }, $i;  # push the line itself; a bare push stores nothing
    }

    foreach $s (keys %strngs) {
       print "@{ $strngs{$s} }\n";
    }
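For what it's worth, part of the scrambling is likely that keys %strngs returns keys in Perl's internal hash order, not the order the lines were read. A minimal sketch (with made-up sample data standing in for @lines) that drops duplicate lines while preserving first-seen order:

```perl
use strict;
use warnings;

# Sample lines standing in for the contents of @lines above.
my @lines = ("Errors: 0\n", "Files: 10\n", "Errors: 0\n");

my (%seen, @ordered);
for my $line (@lines) {
    # Record each distinct line once, in the order it first appeared.
    push @ordered, $line unless $seen{$line}++;
}
print @ordered;   # prints "Errors: 0" then "Files: 10"
```

Pairing an array with the hash keeps the dedup test O(1) while the array preserves reading order, which a plain hash of arrays cannot do on its own.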

My goal is to update a system log and then send an email when "Errors:"
reaches a value greater than zero, and only from today's date on.
Here is the output from just the unless construct.  The output from the
print "@{ $strngs{$s} }\n"; statement is the data at the very bottom, but
it is garbled.

 __OUTPUT__

Starting InodeDump for /fuji/clinical : 02/02/06 12:00:00
++++++++++++++++++++++++++++++++++++++++++++++++++++++++

samfsdump statistics:
    Files:              885079
    Directories:        1012785
    Symbolic links:     0
    Resource files:     0
    File segments:      0
    File archives:      2534408
    Damaged files:      0
    Files with data:    0
    File  warnings:     545
    Errors:             0
    Unprocessed dirs:   0
    File data bytes:    0

dump Completed: 02/03/06 09:00:00

Starting Compressing and Copying: 02/02/06 09:00:00
++++++++++++++++++++++++++++++++++++++++++++++++
C and C Completed: 02/03/06 09:00:00

All Dumps Completed: 02/03/06 09:00:00
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++

 Starting Inode Dumps: 02/03/06 15:00:00
+++++++++++++++++++++++++++++++++++++++++++++++++++++++

Starting InodeDump for /slh1 : 02/03/06 15:00:00
++++++++++++++++++++++++++++++++++++++++++++++++++++++++

dump Completed: 02/03/06 15:00:00

Starting InodeDump for /slh2 : 02/03/06 15:00:00
++++++++++++++++++++++++++++++++++++++++++++++++++++++++

dump Completed: 02/03/06 15:00:00

Starting InodeDump for /heartlab : 02/03/06 15:00:00
++++++++++++++++++++++++++++++++++++++++++++++++++++++++

samfsdump statistics:
    Files:              1314880
    Directories:        141368
    Symbolic links:     0
    Resource files:     4
    File segments:      0
    File archives:      2628853
    Damaged files:      0
    Files with data:    0
    File  warnings:     442
    Errors:             0
    Unprocessed dirs:   0
    File data bytes:    0

dump Completed: 02/03/06 15:00:00

Starting Compressing and Copying: 02/03/06 15:00:00
++++++++++++++++++++++++++++++++++++++++++++++++
C and C Completed: 02/03/06 15:00:00

Starting InodeDump for /fuji/original : 02/03/06 15:00:00
++++++++++++++++++++++++++++++++++++++++++++++++++++++++


__END_OUTPUT__


__BEGIN_OUTPUT__ from  print "@{ $strngs{$s} }\n";

Starting Compressing and Copying: 02/01/06 15:00:00

++++++++++++++++++++++++++++++++++++++++++++++++

    File archives:      2534408

    Directories:        140749

    Files:              1308553

Starting InodeDump for /slh1 : 02/02/06 12:00:00

    Directories:        1017911

Starting InodeDump for /fuji/original : 02/02/06 18:00:00

Starting InodeDump for /heartlab : 02/02/06 00:00:00

Starting Compressing and Copying: 02/03/06 00:00:00

    Resource files:     3

Starting InodeDump for /fuji/clinical : 02/01/06 15:00:00

    Files:              883963

Starting InodeDump for /slh1 : 02/03/06 15:00:00

All Dumps Completed: 02/02/06 15:00:00

 Starting Inode Dumps: 02/01/06 15:00:00

    Directories:        1011988

Starting InodeDump for /heartlab : 02/03/06 03:00:00

    File  warnings:     395

    Files:              891176

Starting Compressing and Copying: 02/02/06 06:00:00

Starting InodeDump for /fuji/clinical : 02/02/06 18:00:00

    File archives:      2556420

    File  warnings:     69

 Starting Inode Dumps: 02/03/06 00:00:00

dump Completed: 02/01/06 15:00:00

All Dumps Completed: 02/03/06 06:00:00

    Directories:        1018045

    File  warnings:     40

 Starting Inode Dumps: 02/02/06 06:00:00

    Errors:             0

dump Completed: 02/03/06 00:00:00

    File  warnings:     272

    Files:              1312341


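The "Errors greater than zero, today only" goal might be sketched like this. The sample __DATA__ lines, the date override, and the commented-out mail call are my own placeholders; the real script would read $sfsdlog and fill in real addresses.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Same date format as the dateme() sub in the script above.
sub dateme {
    my ($month, $day, $year) = (localtime)[4, 3, 5];
    return sprintf "%02d/%02d/%02d", $month + 1, $day, $year % 100;
}

my $today   = @ARGV ? shift @ARGV : dateme();  # date override for testing
my $current = 0;                               # true once today's entries begin
my @errors;

while (my $line = <DATA>) {
    # Entries after a "Starting ...: <today's date>" stamp belong to today.
    $current = 1 if $line =~ /Starting .*: \Q$today\E /;
    push @errors, $line
        if $current and $line =~ /Errors:\s+(\d+)/ and $1 > 0;
}

if (@errors) {
    # A MIME::Lite message could be built and sent here, for example:
    # MIME::Lite->new(From => '...', To => '...', Subject => 'samfsdump errors',
    #                 Data => join('', @errors))->send;
    print "errors found for $today:\n", @errors;
}

__DATA__
Starting InodeDump for /fuji/clinical : 02/02/06 12:00:00
    Errors:             3
dump Completed: 02/03/06 09:00:00
```

The $current flag gates on the first block stamped with today's date, so older entries earlier in the log never trigger a mail even though the whole file is read.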
-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
<http://learn.perl.org/> <http://learn.perl.org/first-response>

