On 07/04/2012 16:47, Vinay Sajip wrote:
Thibaut <merwin.irc<at>gmail.com> writes:
Ok, I understand what happened. In fact, configuring logging before
forking works fine: the subprocess inherits the configuration, as I thought.
The problem was that I didn't pass any handlers to the QueueListener
constructor, so when the listener received a message, it wasn't handled.
I'm not sure how the logging module works internally, but what handlers
should I pass to the QueueListener constructor? I mean, I might want
some messages (depending on the logger) to be logged to a file, while
other messages would just be printed to stdout.
This doesn't seem to be doable with a QueueListener. Maybe I should
implement my own system and pass a little more information with the
record sent in the queue: the logger name, for example.
Then, in the main process, I would do a logging.getLogger(loggername) and
log the record using that logger (however it was configured).
What do you think ?
You probably need different logging configurations in different processes. In
your multiprocessing application, nominate one of the processes as a logging
listener. It should initialize a QueueListener subclass which you write. All
other processes should just configure a QueueHandler, which uses the same queue
as the QueueListener.
All the processes with QueueHandlers just send their records to the queue. The
process with the QueueListener picks these up and handles them by calling the
QueueListener's handle() method.
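A minimal sketch of that arrangement (Python 3.2+, where QueueHandler and
QueueListener live in logging.handlers; the function and logger names here
are invented for the example):

```python
import logging
import logging.handlers
import multiprocessing

def worker(queue):
    # In each non-listener process: install a single QueueHandler on the
    # root logger, so every record travels over the shared queue.
    root = logging.getLogger()
    root.handlers = [logging.handlers.QueueHandler(queue)]
    root.setLevel(logging.DEBUG)
    logging.getLogger('app.worker').info(
        'hello from pid %s', multiprocessing.current_process().pid)

def run():
    queue = multiprocessing.Queue()
    # The listener runs in the main process here for brevity; as described
    # above, it could equally be a dedicated nominated process.
    listener = logging.handlers.QueueListener(queue, logging.StreamHandler())
    listener.start()
    procs = [multiprocessing.Process(target=worker, args=(queue,))
             for _ in range(2)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    listener.stop()

if __name__ == '__main__':
    run()
```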
The default implementation of QueueListener.handle() is:
    def handle(self, record):
        record = self.prepare(record)
        for handler in self.handlers:
            handler.handle(record)
where self.handlers is just the handlers you passed to the QueueListener
constructor. However, if you want a very flexible configuration where different
loggers have different handlers, this is easy to arrange. Just configure logging
in the listener process however you want, and then, in your QueueListener
subclass, do something like this:
class MyQueueListener(logging.handlers.QueueListener):
    def handle(self, record):
        record = self.prepare(record)
        logger = logging.getLogger(record.name)
        logger.handle(record)
This will pass the events to whatever handlers are configured for a particular
logger.
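To make that concrete, here is a runnable sketch of the subclass together
with a per-logger configuration in the listener process (the logger names
'app.file' and 'app.console' and the filename are invented for the example):

```python
import logging
import logging.handlers

class MyQueueListener(logging.handlers.QueueListener):
    def handle(self, record):
        # Dispatch each record to whatever handlers are configured, in
        # this listener process, for the logger named in the record.
        record = self.prepare(record)
        logging.getLogger(record.name).handle(record)

# In the listener process, configure each logger however you want:
logging.getLogger('app.file').addHandler(
    logging.FileHandler('app.log', delay=True))
logging.getLogger('app.console').addHandler(logging.StreamHandler())
```

Records sent from any worker by a logger named 'app.file' then end up in
app.log, while those from 'app.console' go to the console, with no handlers
passed to the QueueListener constructor at all.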
I will try to update the Cookbook in the logging docs with this approach, and a
working script.
Background information is available here: [1][2]
Regards,
Vinay Sajip
[1] http://plumberjack.blogspot.co.uk/2010/09/using-logging-with-multiprocessing.html
[2] http://plumberjack.blogspot.co.uk/2010/09/improved-queuehandler-queuelistener.html
This is exactly what I wanted; it seems perfect. However, I still have a
question. From what I understood, I have to configure logging AFTER
creating the processes, to avoid the child processes inheriting the
logging config. Unless there is a way to "clean" the logging configuration
in the child processes, so that they only have one handler: the QueueHandler.
I looked at the logging code and there doesn't seem to be an easy way to
do this. The problem with configuring logging after process creation is
that... I can't log during process creation. But if it's too complicated,
I will just do that.
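Something like the following is what I was imagining: removing the
inherited handlers at the start of each child. A sketch with an invented
function name, assuming a fork start method:

```python
import logging
import logging.handlers

def reset_child_logging(queue):
    # Run first thing in each child process: remove every handler the
    # child inherited over fork, then install a single QueueHandler.
    root = logging.getLogger()
    for handler in list(root.handlers):
        root.removeHandler(handler)
    root.addHandler(logging.handlers.QueueHandler(queue))
```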
Thanks again for your help, Vinay,
--
http://mail.python.org/mailman/listinfo/python-list