<[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
> thanks for the reply,
> I have used another method to solve my problem, i.e.:
> 1) get the total line count of the first file
> 2) write this total count to a file, e.g. basecnt
> 3) get the file again later and get its total count, e.g. filecnt
> 4) if filecnt > basecnt, read in the values from file[basecnt:filecnt]
> 5) if filecnt < basecnt, overwrite original basecnt and start over
> again.
>
> Basically, the problem is that I want to review the most recent
> records from a log file every 3 hours, and this log file keeps
> growing.
>

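The counting scheme quoted above could be sketched in Python roughly like this (a minimal sketch only; `read_new_lines` and the file names are made up for illustration, and it re-reads the whole file each time, which is fine for modest logs):

```python
def read_new_lines(logfile, bookmark):
    """Return the lines appended to `logfile` since the last call.

    The previous total line count is kept in the `bookmark` file
    (step 2 of the quoted recipe); if the log shrank, assume it was
    rotated/truncated and start over from the beginning (step 5).
    """
    # Load the previously stored line count (0 on the first run).
    try:
        with open(bookmark) as f:
            basecnt = int(f.read().strip())
    except (IOError, ValueError):
        basecnt = 0

    with open(logfile) as f:
        lines = f.readlines()
    filecnt = len(lines)

    if filecnt < basecnt:
        # Log was truncated: re-read everything (step 5).
        new = lines
    else:
        # Only the lines appended since the last run (step 4).
        new = lines[basecnt:]

    # Store the new count for the next run.
    with open(bookmark, "w") as f:
        f.write(str(filecnt))
    return new
```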
I did this:

            fp = os.popen('/usr/sbin/logtail /var/log/syslog')
            loglines = fp.readlines()

            .... pyparsing ... stuff .... from loglines
;-)

Python is maybe overkill too - have "cron" call "logtail" and pipe the
output wherever?

PS:

"logtail" is very simple: it works by maintaining a "bookmark" from the
last read that is updated each time the file is read (i.e. on each
call). It would probably be very easy to implement in Python. On
Linux/UNIX, syslog + logutils can do a lot of the work through
configuration alone (but you did not say you are on unix).


-- 
http://mail.python.org/mailman/listinfo/python-list
