This worked (although I switched the order of the "select(STDOUT);" and
"select(STDERR);" statements so that STDOUT would be select()'ed at the
end) -- now I get stdout and stderr statements in the right order.
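For reference, the reordered unbuffering looks like this (a minimal sketch; select() makes a handle the default for print, and $| applies to whichever handle is currently select()'ed, so STDOUT goes last):

```perl
# Unbuffer STDERR first, then STDOUT, so that STDOUT is left as the
# currently select()'ed default handle when the setup is done.
select(STDERR);
$| = 1;              # unbuffer STDERR
select(STDOUT);
$| = 1;              # unbuffer STDOUT; STDOUT remains selected
```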
I'm still trying to solve a problem though: I want the jobs to generate
output under normal functioning, so I can trace through and see what they
did. *And* I want to be alerted if one of them produces some output sent

to stderr, so I can see what went wrong. *And* when that happens, I want
the stdout and stderr output to be in the correct order, so I can figure
out when the error occurred.
None of the following methods achieve that:
1) Run the jobs with stdout sent to one file and stderr sent to another
file. That way, I can easily see which jobs produced error output, and
even when they don't, I'll have the stdout output if I want to look at
it. The problem is that there's no way to tell in which order the program
printed the statements sent to stdout and stderr. Even if the output to
stdout were printed with a timestamp before each line, the output to stderr
would not be, so there would be no way to merge the output in the two files
into one file where everything was listed in the order in which it
happened.
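A hypothetical workaround for (1) would be to timestamp every line on both streams yourself, so the two files could later be merged with sort(1). A sketch, where the file names and the one-second timestamp resolution are illustrative (lines written within the same second would still merge ambiguously):

```perl
#!/usr/bin/perl
# Sketch: write timestamped lines to separate stdout/stderr logs so
# the two files can be merged in order later. Names are illustrative.
use strict;
use warnings;
use POSIX qw(strftime);

sub stamp { strftime('%Y-%m-%dT%H:%M:%S', localtime) }

open(my $out, '>>', 'out.log') or die "out.log: $!";
open(my $err, '>>', 'err.log') or die "err.log: $!";

# Route warnings through the error log with the same timestamp format.
$SIG{__WARN__} = sub { print {$err} stamp(), ' ', @_ };

print {$out} stamp(), " copying down some page\n";
warn "something follows nothing in regexp\n";

close $out;
close $err;
```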
2) (What I'm doing now) -- send stdout and stderr both to the terminal, so
that when they're run as scheduled jobs, the output gets emailed to
me. The problem here is, I want to distinguish between jobs that generated
error output, and jobs that didn't -- without reading through each email to
see if any error output is shown. Right now I have my email filter set to
detect messages containing the text "Use of uninitialized", since almost
all of my warning messages contain that text, but that's not an ideal
solution; I want to catch all kinds of messages sent to stderr, not just
"Use of uninitialized variable" warnings.
3) Run the jobs with stdout and stderr directed into the same log file --
has the same problem as (2), no easy way to detect jobs that generated
error output.
Any suggestions? Can STDERR be redirected to two different places at the
same time (without piping it through a script that prints input to two
separate outputs)? In that case, I could have both STDOUT and STDERR
printed to one log file (where everything is stored in the correct order),
and have STDERR also sent to a separate log file where I get alerted if any
errors are stored -- or just have STDERR sent to the terminal so it gets
emailed to me. But is there anything simpler, any standard solution to
this kind of problem?
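For what it's worth, the duplication can be approximated inside Perl itself, without an external tee process, by installing a __WARN__ handler that writes each message to both a combined log and an errors-only log. A sketch, with illustrative file names; note it only catches output routed through warn(), not raw prints to STDERR or stderr from child processes:

```perl
#!/usr/bin/perl
# Sketch: one combined log with everything in the right order, plus a
# second log that receives only the error-stream output. File names
# are illustrative; only warn()-generated output is duplicated.
use strict;
use warnings;

open(my $combined, '>>', 'combined.log') or die "combined.log: $!";
open(my $errors,   '>>', 'errors.log')   or die "errors.log: $!";

# Unbuffer both logs so the interleaving reflects program order.
select((select($combined), $| = 1)[0]);
select((select($errors),   $| = 1)[0]);

select($combined);   # ordinary print() now goes to the combined log

$SIG{__WARN__} = sub {
    print {$combined} @_;   # keep errors in sequence with stdout...
    print {$errors}   @_;   # ...and also collect them separately
};

print "normal progress output\n";   # combined.log only
warn  "something went wrong\n";     # combined.log AND errors.log
```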
-Bennett
At 11:43 AM 5/18/2001 +0100, Martin Moss wrote:
>Hi,
>
>It looks like you've not enabled unbuffered output on your STDERR &
>STDOUT filehandles. Try this:-
>
>select(STDOUT);
>$| = 1;
>select(STDERR);
>$| = 1;
>
>I would suggest, as a rule of thumb, that you shouldn't write ANY cron tasks
>which print output to STDOUT. If you rely on those email messages sent to
>root, perhaps you could have a success/failure message printed to STDOUT
>when the script has completed. I always find it more practical to redirect
>STDOUT & STDERR to a log file. Then you can keep all your script's logging
>information in one place.
>
>Regards
>
>Marty
>
> > -----Original Message-----
> > From: [EMAIL PROTECTED]
> > [mailto:[EMAIL PROTECTED]]On Behalf Of
> > Bennett Haselton
> > Sent: Friday 18 May 2001 11:25
> > To: [EMAIL PROTECTED]
> > Subject: perl output from cron jobs mixing stdout and stderr
> >
> >
> > [slightly OT; more UNIX than Windows; send flames by postcard to 1600
> > Pennsylvania Ave, Washington, D.C.]
> >
> > I think this is probably an issue with cron jobs in general and not just
> > perl scripts that are run as cron jobs, but --
> >
> > I have some perl scripts that run as cron jobs and usually generate a lot
> > of output sent to stdout, plus some error output sent to stderr. The
> > output gets stored as an email message in the mail account file of the
> > user that owns the job.
> >
> > Except that the output almost always gets garbled when this happens --
> > the error output is stuck somewhere in the middle of all the stuff that
> > got sent to standard out, usually overwriting some characters. E.g. I
> > finally found the error text inserted in a long block of standard output:
> >
> > >>>
> > [...]
> > copying down http://www.backcountrystore.com/backpack_mystery.htm
> > Length of page string being searched: 47703
> > Found match after downloading page
> > copying down http://www./+U2/: ?+*{} follows nothing in regexp at
> > /home/bhaselto/web/html/queryparse-simplified.pl line 477.
> > gillenpestcontrol.com/
> > Length of page string being searched: 1482
> > [etc.]
> > >>>
> >
> > where "/+U2/: ?+*{} follows nothing in regexp at
> > /home/bhaselto/web/html/queryparse-simplified.pl line 477." is the error
> > text.
> >
> > When I run smaller test cron jobs that print stuff to stderr and stdout,
> > the stuff printed to stderr always comes first in the email, followed by
> > the stuff sent to stdout (regardless of whether the stderr or stdout
> > stuff is printed first in the code). Any idea why, for large jobs, the
> > stderr stuff gets stuck in the middle of all the stdout stuff, and how
> > to fix it?
> >
> > -Bennett
> >
[EMAIL PROTECTED] http://www.peacefire.org
(425) 649 9024
_______________________________________________
Perl-Win32-Users mailing list
[EMAIL PROTECTED]
http://listserv.ActiveState.com/mailman/listinfo/perl-win32-users