Unfortunately everyone wants to use email for everything..... the proper way would
be to use a facility with a proven track record, like syslogd(8) on each
box. As you know, syslogd allows you to route a message to a remote box. So, you
designate one of your boxes as the final or central point. All messages are
routed to this box automatically for you. There you can have a process, a person,
or a file receive the accumulated messages, and you have the opportunity to
review the last 100 messages and look for correlations.
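For example, a line like the following in /etc/syslog.conf on each of the five
boxes would forward everything to the central box (just a sketch; "loghost" is an
assumed hostname, and you can narrow the selector to whatever facility your error
handler logs to):

    # forward all messages to the central log host (assumed name: loghost)
    # note: older syslogds want a TAB between selector and action
    *.*     @loghost

On the central box, syslogd has to accept remote messages (the classic Linux
sysklogd needs the -r flag for that; some other syslogds accept them by default).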

The point is, process these reports first, prioritize and then inform.
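A minimal sketch of that idea, assuming the central log is /var/log/messages and
mailx is available (the recipient address is made up): a cron job that strips the
timestamps, tallies identical messages, and mails one digest instead of one mail
per error.

    # strip the 16-character syslog timestamp, count identical messages,
    # and mail the 100 most frequent ones as a single digest
    cut -c17- /var/log/messages | sort | uniq -c | sort -rn | head -100 \
        | mailx -s "error digest" ops@example.com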

By the way, your exact problem has already been solved by syslogd....
Try the following on your Unix box.....

Find an entry in your /etc/syslog.conf that writes to /var/log/messages or some
other file. I will use the cron facility at the info level (cron.info), then do

logger -p cron.info -t Test "this is a test"
logger -p cron.info -t Test "this is a test"
logger -p cron.info -t Test "this is a test"
etc, etc

You'll see that syslog will say
last message repeated 4 times
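If you don't have a matching entry yet, something like this would do (a sketch;
the facility, level, and file path are assumptions, and syslogd needs a HUP after
the edit so it rereads its configuration):

    # /etc/syslog.conf -- send cron messages to the shared log file
    # (use a TAB between selector and action on older syslogds)
    cron.info       /var/log/messages

    # tell syslogd to reread its configuration (pid file location varies)
    kill -HUP `cat /var/run/syslogd.pid`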

Cheers....

Benjamin Elbirt wrote:

Wow,

I never expected the response I got!  Well, let's assume that I were to go with
the shared memory option anyway... what would the pitfalls / concerns be?  The
truth is, I don't want a separate system (as per the e-mail about having an
error handling server), and I don't want to have to manage the e-mail on the
receiving end, because I'm not the only person who receives it (I didn't
mention that before, but I guess it's important).  Further, I have no control
over the mail server that handles the incoming mail, so I'd have to handle it
on the mail client (Outlook / Netscape Mail), resulting in the same problem I
have now.

Thanks,

Ben

Perrin Harkins wrote:

> Andrew Ho wrote:
> > Your error handlers on your five load-balanced boxes send an HTTP request
> > to this error handling box.
>
> That sounds kind of slow, since it requires a network connection to be
> created every time and some synchronous processing on the other end.  It
> also depends on that box always staying up.  I think e-mail is actually
> a good approach, since it's a robust message queuing system and if you
> use something like qmail-inject to send the e-mail it takes almost no
> time at all for mod_perl to finish with it and move on.  You just need
> to process those messages on the other end instead of looking at the raw
> output, i.e. use Mail::Audit to keep track of the current state and
> remove duplicate messages.
>
> Matt posted something about PPerl yesterday, which could make a
> Mail::Audit script more efficient by keeping it persistent.
>
> - Perrin
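(For completeness, a minimal sketch of the Mail::Audit de-duplication Perrin
describes, assuming the Subject line is the dedup key; the cache file and mailbox
paths are made up:)

    #!/usr/bin/perl
    # drop error mails whose subject we have already seen once
    use strict;
    use Mail::Audit;
    use DB_File;

    my $mail    = Mail::Audit->new;           # reads the message from STDIN
    my $subject = $mail->subject || '';

    tie my %seen, 'DB_File', '/var/tmp/error-subjects.db';

    if ($seen{$subject}++) {
        $mail->ignore;                        # duplicate: discard it
    } else {
        $mail->accept('/var/mail/errors');    # first occurrence: deliver it
    }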

-- 
-------------------------------------------------------------------------
Medi Montaseri                               [EMAIL PROTECTED]
Unix Distributed Systems Engineer            HTTP://www.CyberShell.com
CyberShell Engineering
-------------------------------------------------------------------------
 

