"John Purser" <[EMAIL PROTECTED]> wrote in message > I'm writing a system admin script in python that checks recently > accessed files for keywords like "failed, denied, error,..." etc. I'm > using popen to call grep -F <LIST> <LOG_FILE> but it's VERY slow. Can > anyone suggest a faster method to do this?
Are you trying to locate the actual strings in the files, or only
testing whether the strings occur? If the latter, you could use
something like:

files = [f for f in glob(pattern) if mystring in open(f).read()]

or, for multiple patterns, construct a single regex:

files = [f for f in glob(pattern) if regex.search(open(f).read())]

That might be slightly faster than spawning subprocesses with popen.
But it won't be a radical difference, and if you actually want the line
numbers of the matches I suspect grep will be at least as fast as
Python.

HTH,

--
Alan G
Author of the learn to program web tutor
http://www.freenetpages.co.uk/hp/alan.gauld

_______________________________________________
Tutor maillist - Tutor@python.org
http://mail.python.org/mailman/listinfo/tutor
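[Editor's note: a minimal sketch of the single-regex approach described above. The keyword list, function name, and use of re.escape are illustrative assumptions, not from the original post.]

```python
import re
from glob import glob

# Illustrative keyword list; substitute your own.
KEYWORDS = ["failed", "denied", "error"]

# Build one alternation so each file is scanned in a single pass.
# re.escape guards against regex metacharacters in the keywords.
PATTERN = re.compile("|".join(re.escape(k) for k in KEYWORDS))

def files_with_keywords(file_glob):
    """Return the files matching file_glob whose contents contain
    any of the keywords."""
    matches = []
    for name in glob(file_glob):
        # errors="replace" tolerates stray non-text bytes in log files.
        with open(name, errors="replace") as f:
            if PATTERN.search(f.read()):
                matches.append(name)
    return matches
```

This only reports *which* files matched; as noted above, if you need the matching lines themselves, grep is likely to be at least as fast.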