On Apr 8, 2:01 pm, "Deryck Hodge" <[EMAIL PROTECTED]> wrote:
> On 4/8/07, Vinay Sajip <[EMAIL PROTECTED]> wrote:
>
> I have a logger setup for use at work, and I control it via a
> LOG_LEVEL setting in settings.py.  I don't have logging statements
> throughout my code, only in things like the signal receiver that sends
> an email alert when a new object is created, so I don't notice any
> performance hit.  This is a subjective feeling, though, and not
> benchmarked.  I mainly use logging just to track that certain code
> paths are doing what I expect when run normally.
>
> I would think you would want it configurable, however, not automatic
> on debug.  We run development in DEBUG, which I assume is common, and
> that could generate noisy logs.  And what if I just wanted critical
> notices?  Changing a log level setting seems the right way to do this.

You can do this in settings.py - configure the root logger with a
level of logging.DEBUG (in a dev/test environment) and e.g.
logging.CRITICAL (in a production environment). If you don't do any
other configuration, all loggers will use the root logger's level by
default. If you want to turn on verbosity on specific loggers only,
even after going into production, you can change the settings to e.g.

logging.getLogger("my.app.module").setLevel(logging.DEBUG)

while leaving the root logger's level at logging.CRITICAL. That way,
you turn up verbosity for that part of the app only, without
affecting other parts you're not interested in at the moment. And
because of the hierarchical setup, if you turn up the verbosity on
"my.app.module", you also do so on "my.app.module.submodule1",
"my.app.module.submodule2" etc.

> However, I know from talking to Adrian that he's not warm to the idea
> of logging in Django itself. :-)  And for understandable reasons.  It
> would be interesting to actually test the impact of a logger, though,
> just to know for sure.

Well, fair enough. I'm not sure what Adrian's reasons are, but I
suspect performance. Logging will obviously involve *some* overhead,
so there's always a trade-off to be made. As an initial exercise, I
added some very simple logging for SQL in django/db/backends/util.py,
which just logs the SQL and parameters using logger.debug(). In
settings.py, I configured logging. I ran an application from the shell
in three specific scenarios:

(a) Without the logging code in util.py
(b) With the logging code in util.py, and logging configured to use a
level of logging.DEBUG (so SQL would actually be logged)
(c) With the logging code in util.py, and logging configured to use a
level of logging.INFO (so SQL would not be logged)

For each run, the application (migrating some legacy HR data into a
test sqlite database) saved around 4500 employee, 650 org unit and
5700 position model instances, twice over in each run. Now the
following is hardly a rigorous study (!), but it gives a rough idea:

No logging code in util.py
==========================
Save employees #1: 79.3 secs
Save org units #1: 11.1 secs
Save positions #1: 104.4 secs
Save employees #2: 82.0 secs
Save org units #2: 12.1 secs
Save positions #2: 108.9 secs

Logging code in util.py, logging configuration set to INFO (no actual
logging output produced)
=====================================================================
Save employees #1: 78.0 secs
Save org units #1: 11.1 secs
Save positions #1: 101.6 secs
Save employees #2: 82.5 secs
Save org units #2: 11.5 secs
Save positions #2: 112.3 secs

Logging code in util.py, logging configuration set to DEBUG (logging
output produced)
====================================================================
Save employees #1: 88.6 secs
Save org units #1: 11.5 secs
Save positions #1: 106.6 secs
Save employees #2: 82.1 secs
Save org units #2: 11.6 secs
Save positions #2: 120.6 secs

This is really only anecdotal, of course - your mileage will no doubt
vary. Here's the code from settings.py:

import logging
logging.basicConfig(level=logging.DEBUG, filename='/tmp/mysite.log',
filemode='w', format="%(levelname)-8s %(name)-20s %(message)s")

and here's the logging code in util.py:

=== modified file 'django/db/backends/util.py'
--- django/db/backends/util.py
+++ django/db/backends/util.py
@@ -1,5 +1,8 @@
 import datetime
 from time import time
+import logging
+
+logger = logging.getLogger('django.db.sql')

 class CursorDebugWrapper(object):
     def __init__(self, cursor, db):
@@ -9,6 +12,8 @@
     def execute(self, sql, params=()):
         start = time()
         try:
+            if logger.isEnabledFor(logging.DEBUG):
+                logger.debug('%s %% %s', sql, params)
             return self.cursor.execute(sql, params)
         finally:
             stop = time()

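One design note on the patch: the isEnabledFor() guard isn't strictly
needed for correctness, since logger.debug() checks the level itself
and defers the %-interpolation until emission. The guard pays off when
building the *arguments* is itself expensive. A small sketch (the
format_params name is a made-up helper for illustration):

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger('django.db.sql')

def format_params(params):
    # Hypothetical expensive argument preparation.
    return ', '.join(repr(p) for p in params)

params = ('John', 42)

# Without the guard, format_params() runs even though the record
# is ultimately discarded at INFO level:
logger.debug('INSERT INTO ... %% %s', format_params(params))

# With the guard, the argument is never built at all:
if logger.isEnabledFor(logging.DEBUG):
    logger.debug('INSERT INTO ... %% %s', format_params(params))
```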
Best regards,


Vinay Sajip

