Shawn H Corey <shawnhco...@gmail.com> writes:

> On Wed, 28 Aug 2013 10:42:30 -0400
> Harry Putnam <rea...@newsguy.com> wrote:
>
>> Thanks to all other posters.. lots of good input.
>
> It seems to me that recording the same information in many places is a
> design flaw. If you have the same information in two or more places, it
> will get out of sync. Write the program that will detect and correct
> this, now, not later.
Good thinking, thanks. It might not really apply here, though. I'm no kind of data manager... just a homeboy hillbilly.

What I had in mind is writing to a single log file, dated in the file name, for each run of the program. Sooner or later that file will be deleted by another part of the script, which tries to leave the 5 most recent logs.

I was going to just cat the fresh log onto an accumulative log after each run, so the info would stick around a while longer in case some kind of problem came up requiring research into previous runs older than the 5 dated logs. So the info is really only duplicated for a few weeks.

That accumulative file might grow quite a bit. I find it handier to look through a few dated logs that are much smaller... but if it comes down to it and the dated logs don't go back far enough, then I can find it in the accumulative log. That too will get trimmed back eventually.

That sounds quite a bit like what cron could do with this... hmmmmm.

-- 
To unsubscribe, e-mail: beginners-unsubscr...@perl.org
For additional commands, e-mail: beginners-h...@perl.org
http://learn.perl.org/
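For what it's worth, the scheme described above (one dated log per run, keep the 5 newest, append each run onto an accumulative log) can be sketched in a few lines. This is not the poster's actual script; it's a hypothetical sketch in Python rather than Perl, and the names `rotate_logs`, `run-*.log`, and `accumulative.log` are made up for illustration:

```python
import glob
import os
import time

KEEP = 5  # the "5 most recent" dated logs from the post


def rotate_logs(log_dir, fresh_log_text, when=None):
    """Write one dated log for this run, append its contents to the
    accumulative log, then delete all but the KEEP newest dated logs."""
    # Dated file name; lexicographic order == chronological order.
    stamp = time.strftime("%Y-%m-%d_%H%M%S", time.localtime(when))
    dated = os.path.join(log_dir, "run-%s.log" % stamp)
    with open(dated, "w") as fh:
        fh.write(fresh_log_text)

    # "cat the fresh log onto an accumulative log after each run"
    with open(os.path.join(log_dir, "accumulative.log"), "a") as acc:
        acc.write(fresh_log_text)

    # Trim: remove everything except the KEEP most recent dated logs.
    dated_logs = sorted(glob.glob(os.path.join(log_dir, "run-*.log")))
    for old in dated_logs[:-KEEP]:
        os.remove(old)
```

A cron job could call something like this once per run; trimming the accumulative log itself (the "that too will get trimmed back eventually" part) is left out of the sketch.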