Andrew, thanks for sharing your scripts; however, I would bet that few list
members will actually see them.

Log entries:
==========
11/07/2004 01:00:46 Qe43e56af00464c1b MIME file: Scripts.zip [base64; Length=5925 Checksum=655492]
11/07/2004 01:00:46 Qe43e56af00464c1b Banning .ZIP file with cmd extension.
11/07/2004 01:00:47 Qe43e56af00464c1b Scanned: Banned file extension. [MIME: 2 11189]
11/07/2004 01:00:47 Qe43e56af00464c1b From: [EMAIL PROTECTED] To: [EMAIL PROTECTED]
11/07/2004 01:00:47 Qe43e56af00464c1b Subject: RE: [Declude.JunkMail] LOG Levels
==========

I just happened to retrieve the Q&D files from my virus folder so I could
view the message.  For future reference, it's best to change the extension
of .cmd files to .txt for delivery, with a note asking recipients to change
the extension back to .cmd once they have received the message.
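Something as simple as this before attaching (and the reverse on receipt)
does the trick; the file name here is just an example:

ren Scripts.cmd Scripts.txt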

Bill

----- Original Message ----- 
From: "Colbeck, Andrew" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Sunday, November 07, 2004 12:55 AM
Subject: RE: [Declude.JunkMail] LOG Levels


> Hey, fun-seekers, I was feeling left out.
>
> Necessity being the mother of invention, I cobbled a bunch of scripts
> together that I find useful.  I just extended one a bit to do what Serge
> was looking for.
>
> I make good use of the GNU Utilities that Bill has advised us on.  Thanks,
> Bill!
>
> Often, I just care about the weight lines, or the from lines, or the
> subject lines, so I've got 3 scripts that pull just those lines out into
> weight.txt, from.txt and subject.txt, and, just because, another one
> called build3.cmd that builds all three of those files.  The count is
> output; the discrepancy between the line counts comes from the repetition
> of lines in the log when there are multiple recipients.
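> The guts of those are nothing fancy; roughly this, assuming the gnutils
> are on the PATH (the log name and the match strings are illustrative,
> and depend on your LOGLEVEL):
>
> grep "Subject:" dec1107.log > subject.txt
> grep "From:" dec1107.log > from.txt
> grep "Total weight" dec1107.log > weight.txt
> wc -l subject.txt from.txt weight.txt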
>
> There's a 4th script that I don't use much, called Action, that does a
> count of the actions I care about.  I'm also including a script that Bill
> put forward here, called MessagesPerHour, which does what you'd expect.
> I use it for those "are we getting a lot of mail?" questions.
>
> I found that for "Help Desk calls", it was usually a matter of finding:
>
> "User X reports that they don't get email from [garbled name]"
>
> or
>
> "Company X reports that some of their mail doesn't get to our users"
>
> So I took the next step and wrote ShowFrom and ShowTo.  They do what
> you'd expect: they filter the From: lines, but they also go a step
> further and show you the Last Action for each of those messages, putting
> that action early in the columns so that it's easy to spot.
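> The trick is joining on the Qxxxx column; a rough one-pass sketch with
> gawk (the Qxxxx id is the third column in my logs, and the patterns for
> the From: and action lines are only illustrative):
>
> gawk "/From:/ { from[$3] = $0 } /Deleting/ { act[$3] = $4 } END { for (q in from) print act[q], from[q] }" dec1107.log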
>
> For Serge, I added ShowAll, which takes some snippet of a Declude log
> and, based on the Qxxxx column, finds all other lines in a different
> file (presumably the full decMMDD.log).
>
> Saving the output of a ShowTo and using it as input to ShowAll would be
> quite useful.
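> The core of ShowAll is just pulling the Qxxxx column out and feeding it
> back to grep; something like this (the column number matches the log
> format above, and the file names are illustrative):
>
> gawk "{print $3}" snippet.txt | sort | uniq > qids.txt
> grep -f qids.txt dec1107.log > showall.txt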
>
> Likewise, for work on new or old tests, I have ShowWeight.  It outputs
> the Total Weight lines that include a certain test like SORBS.  Because
> I pass the command line straight through to the gnutils, it's regexp
> friendly (YMMV ... I always use capitals).  You can add an extra
> parameter that specifies the action, which lets you, say, find all lines
> that matched SPAMCOP for which the action was IGNORE.
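> In batch terms it boils down to something like this (a sketch, with %1
> as the test name, %2 as the optional action, and the log name and
> "Total weight" string as assumptions about your setup):
>
> @echo off
> if "%2"=="" (
>   grep "Total weight" dec1107.log | grep "%1"
> ) else (
>   grep "Total weight" dec1107.log | grep "%1" | grep "%2"
> )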
>
> I also use 2 little batch files that call textpad (my preferred text
> editor) with a D*.SMD value, and copy the ?*.SMD files from the spam
> folder back to the queue.  They work for me because I simply mouse the
> "*" part right off the screen of my command line session.  I tell myself
> that I'll get around to parsing the input, and taking the right action
> if a whole Qxxxxx is passed instead of the xxxx part... they're called
> T and Q.
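> Each is a one-liner, give or take (the spool paths here are made up;
> substitute your own):
>
> rem T.cmd - open a message in textpad
> textpad D:\spool\spam\D%1.SMD
>
> rem Q.cmd - copy a message from the spam folder back to the queue
> copy D:\spool\spam\?%1.SMD D:\spool\proc\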
>
> Lastly, I should mention that I find it too slow to work on the files at
> the server, and too slow to work on them over a file share, so I pull
> them over to a temp folder on my desktop with RoboCopy from the Microsoft
> Windows Server Resource Kit.  So I've got two scripts that parse the date
> and pull down the correct decMMDD.log (or sysMMDD.txt) for today, and
> another for yesterday.  They're called Today and Yesterday :)
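> The date parsing is the only fiddly bit; on a US-locale box it looks
> roughly like this (a sketch; %date% formatting varies by locale, and the
> server share is made up):
>
> rem Today.cmd - pull down today's Declude log
> for /f "tokens=2-3 delims=/ " %%a in ("%date%") do set MMDD=%%a%%b
> robocopy \\mailserver\logs %temp% dec%MMDD%.log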
>
> Enjoy!
>
> Andrew 8)
>
> -----Original Message-----
> From: Bill Landry [mailto:[EMAIL PROTECTED]
> Sent: Saturday, November 06, 2004 3:27 PM
> To: [EMAIL PROTECTED]
> Subject: Re: [Declude.JunkMail] LOG Levels
>
>
> ----- Original Message ----- 
> From: "Serge" <[EMAIL PROTECTED]>
>
> > Sorry, I may not have expressed myself clearly.
> >
> > I need to
> > grep %variable% ...
> >
> > where the variable takes all the values generated by the first grep:
> >
> > grep "MAIL FROM:[EMAIL PROTECTED]" D:\log1104.txt | gawk "{print $5}" | uniq
> >
> > Should I use some kind of FOR instruction in a Windows batch file?
> > Or is there a way to achieve this with the unix utils?
> >
> > Suppose the first grep gives
> > (71c80106004a8af1)
> > (7202010b004a8b02)
> > (7206010d004a8b05)
> > (72b70136004a8b35)
> > (72f300fb004c8b48)
> > (732f015e067a8b5a)
> > (736c00f5002a8b6e)
> > (74d201f4069c8bbc)
> > (7587038a063c8beb)
> > (758b0181067a8bed)
> >
> > How do I automate "grepping" all the lines for the above sessions from
> > the log files, without manually running a grep for each one?
>
> Oops, disregard my last post; I accidentally included some of my own path
> info in it.  Instead (two commands, the second feeding on the first):
>
> grep "MAIL FROM:[EMAIL PROTECTED]" D:\log1104.txt | gawk "{print $5}" | cut -b 6- | uniq > temp.txt
> grep -f temp.txt D:\log1104.txt > results.txt
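> (If you'd rather use the batch FOR you asked about, the rough equivalent
> is the line below, though grep -f does it in one pass.  Use %q instead
> of %%q at an interactive prompt.)
>
> for /f %%q in (temp.txt) do grep %%q D:\log1104.txt >> results.txt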
>
> Bill
>

---
[This E-mail was scanned for viruses by Declude Virus (http://www.declude.com)]

---
This E-mail came from the Declude.JunkMail mailing list.  To
unsubscribe, just send an E-mail to [EMAIL PROTECTED], and
type "unsubscribe Declude.JunkMail".  The archives can be found
at http://www.mail-archive.com.
