New submission from William Hart <whart...@gmail.com>:

I recently started using logging extensively in the Coopr software, and I ran 
into some performance issues when logging was used within frequently used 
kernels in the code.  After profiling, it became clear that the performance of 
the logging package could be improved by simply caching the value of the 
Logger.isEnabledFor() method.

I've created a draft version of this caching mechanism based on a snapshot of 
logging that I took from Python 2.7.1.  This is currently hosted in 
pyutilib.logging, though I'd love to see this migrate into the Python library 
(see 
https://software.sandia.gov/trac/pyutilib/browser/pyutilib.logging/trunk/pyutilib/logging).

Basically, I did the following:

1. Added a counter to the Manager class (status) that is incremented whenever 
the manager object has its level set

2. Added a counter to the Logger class (level_status) that represents the value 
of the manager status when the Logger's cache was last updated

3. Reworked the isEnabledFor() method to update the cache if the logger status 
is older than the manager status.  I moved the existing isEnabledFor() logic 
into a _isEnabledFor() method for simplicity.
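To illustrate, here is a minimal, self-contained sketch of the scheme the three 
steps describe.  The names (status, level_status, _isEnabledFor) follow the 
description above, but the simplified Manager and Logger classes below are 
stand-ins, not the actual diff against Lib/logging:

```python
import logging  # only for the standard level constants


class Manager:
    """Stand-in for logging.Manager: tracks a generation counter."""

    def __init__(self):
        self.status = 0   # incremented whenever any level changes
        self.disable = 0  # module-wide disable threshold


class Logger:
    """Stand-in for logging.Logger with a cached isEnabledFor()."""

    def __init__(self, manager, level=logging.WARNING):
        self.manager = manager
        self.level = level
        self.level_status = -1  # manager.status at last cache refresh
        self._cache = {}        # maps level -> bool

    def setLevel(self, level):
        self.level = level
        self.manager.status += 1  # invalidates every logger's cache

    def _isEnabledFor(self, level):
        # The original, uncached isEnabledFor() logic.
        if self.manager.disable >= level:
            return False
        return level >= self.level

    def isEnabledFor(self, level):
        # If this logger's cache predates the last level change,
        # flush it and record the current manager generation.
        if self.level_status != self.manager.status:
            self._cache = {}
            self.level_status = self.manager.status
        try:
            return self._cache[level]
        except KeyError:
            enabled = self._cache[level] = self._isEnabledFor(level)
            return enabled
```

The point of the generation counter is that setLevel() stays cheap (one 
increment) no matter how many loggers exist, while hot-path isEnabledFor() 
calls reduce to a dict lookup until the next level change.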

The attached file shows the diffs.  Note that there were a few other diffs due 
to an effort to make pyutilib.logging work on Python 2.5-2.7.

--Bill

----------
components: Library (Lib)
files: logging__init__diffs.txt
messages: 129851
nosy: William.Hart
priority: normal
severity: normal
status: open
title: Add caching for the isEnabledFor() computation
versions: Python 2.7, Python 3.1, Python 3.2, Python 3.3
Added file: http://bugs.python.org/file20967/logging__init__diffs.txt

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue11369>
_______________________________________