Re: Python logging question

2009-01-02 Thread koranthala
On Jan 2, 6:21 pm, Vinay Sajip wrote:
> On Jan 2, 11:31 am, koranth...@gmail.com wrote:
>
> >     I am confused reading both together. I will try to explain my
> > confusion with an example:
>
> > basicLogger = logging.getLogger("basic")
>
> > class A:
> >     def __init__(self):
> >         self.logger = logging.getLogger("basic.class_a")
>
> > Now, I make, say, 10 instances of A and then delete them one by one.
>
> > My understanding was that since the same name is used, a single
> > basic.class_a logger object is created inside the logging system, and
> > any calls to getLogger("basic.class_a") would return the same object
> > every time.
>
> That is correct. The logger instances stay around for the life of the
> process, and are not garbage collected.
>
> > So, my confusion is based on the second tutorial item I mentioned -
> > why is it not a good idea to create logger instances on a per-instance
> > basis? We are not creating new logger instances, right? And, if I
> > create an instance of A, it will be garbage collected later, right?
>
> It's not a problem to create loggers per *class*, as in your example.
> It can be a bad idea to create a different logger per class *instance*.
> The second example in the docs talks about creating loggers on a per-
> connection basis in a networked app. This is not per connection class,
> mind you, but per connection instance. You would typically have only a
> few dozen classes, but you might have hundreds of thousands of
> connection instances created in a long-lived server app. If you
> created a unique logger for each connection - e.g. one named after the
> time the connection was instantiated, such as
> "connection.20090102123456543" - this would create hundreds of
> thousands of unique logger instances and have a potentially adverse
> impact on process memory. That's when you use LoggerAdapters.
>
> I hope that's clearer.
>
> Regards,
>
> Vinay Sajip

Thank you very much, Vinay.
I was confused by the way it was worded in the tutorial.
Again, thank you.
Regards
Koranthala
--
http://mail.python.org/mailman/listinfo/python-list


Re: Python logging question

2009-01-02 Thread Vinay Sajip
On Jan 2, 11:31 am, koranth...@gmail.com wrote:
> I am confused reading both together. I will try to explain my
> confusion with an example:
>
> basicLogger = logging.getLogger("basic")
>
> class A:
>     def __init__(self):
>         self.logger = logging.getLogger("basic.class_a")
>
> Now, I make, say, 10 instances of A and then delete them one by one.
>
> My understanding was that since the same name is used, a single
> basic.class_a logger object is created inside the logging system, and
> any calls to getLogger("basic.class_a") would return the same object
> every time.

That is correct. The logger instances stay around for the life of the
process, and are not garbage collected.

> So, my confusion is based on the second tutorial item I mentioned -
> why is it not a good idea to create logger instances on a per-instance
> basis? We are not creating new logger instances, right? And, if I
> create an instance of A, it will be garbage collected later, right?
>

It's not a problem to create loggers per *class*, as in your example.
It can be a bad idea to create a different logger per class *instance*.
The second example in the docs talks about creating loggers on a per-
connection basis in a networked app. This is not per connection class,
mind you, but per connection instance. You would typically have only a
few dozen classes, but you might have hundreds of thousands of
connection instances created in a long-lived server app. If you
created a unique logger for each connection - e.g. one named after the
time the connection was instantiated, such as
"connection.20090102123456543" - this would create hundreds of
thousands of unique logger instances and have a potentially adverse
impact on process memory. That's when you use LoggerAdapters.
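
For example, something along these lines (just a sketch - the
Connection class and the names used here are made up for illustration,
they are not from the docs):

import logging

# One module-level logger shared by every connection.
logger = logging.getLogger("server.connection")

class Connection(object):
    def __init__(self, conn_id):
        # Attach per-connection context via a LoggerAdapter instead of
        # creating a brand-new logger for each connection.
        self.log = logging.LoggerAdapter(logger, {"conn_id": conn_id})

    def handle(self):
        # The adapter injects conn_id into each record, so the format
        # string configured below can refer to %(conn_id)s.
        self.log.info("handling request")

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(conn_id)s %(message)s")
Connection("20090102123456543").handle()

The point is that every connection shares the single
"server.connection" logger; only the lightweight adapter is created per
connection instance.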

I hope that's clearer.

Regards,

Vinay Sajip
--
http://mail.python.org/mailman/listinfo/python-list


Python logging question

2009-01-02 Thread koranthala
Hi,
I was reading through the Python logging tutorial, and I found one
scenario which I couldn't properly understand.
The tutorial (http://docs.python.org/library/logging.html)
mentions at first that -- "Multiple calls to getLogger() with the same
name will return a reference to the same logger object".

In an example for LoggerAdapter, the tutorial then
mentions that -
"While it might be tempting to create Logger instances on a per-
connection basis, this is not a good idea because these instances are
not garbage collected".

I am confused reading both together. I will try to explain my
confusion with an example:

basicLogger = logging.getLogger("basic")

class A:
    def __init__(self):
        self.logger = logging.getLogger("basic.class_a")

Now, I make, say, 10 instances of A and then delete them one by one.

My understanding was that since the same name is used, a single
basic.class_a logger object is created inside the logging system, and
any calls to getLogger("basic.class_a") would return the same object
every time.
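
For example, I would expect a quick check like this to print True (just
a sketch of what I mean):

import logging

a = logging.getLogger("basic.class_a")
b = logging.getLogger("basic.class_a")
# Same name, so I expect both calls to return the identical object.
print(a is b)    # expected: True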

So, my confusion is based on the second tutorial item I mentioned -
why is it not a good idea to create logger instances on a per-instance
basis? We are not creating new logger instances, right? And, if I
create an instance of A, it will be garbage collected later, right?

Could somebody help me out?
--
http://mail.python.org/mailman/listinfo/python-list


Python Logging question

2008-04-18 Thread tpatch
I am new to Python and I am trying to understand how to use the
RotatingFileHandler to roll over when the file gets to a certain size.
I followed some examples that I found for setting the size and the
number of files.  However, I am finding that when the log file gets
close to the threshold, I start getting errors from "handlers.py"
saying the file is closed.

I am using Python 2.5 on Windows.

Is this a problem others have seen?
Is this handler widely used, or is there a better one that people
generally use?

The error that I am receiving is shown below.

Traceback (most recent call last):
  File "C:\Python25\Lib\logging\handlers.py", line 73, in emit
if self.shouldRollover(record):
  File "C:\Python25\Lib\logging\handlers.py", line 147, in shouldRollover
self.stream.seek(0, 2)  #due to non-posix-compliant Windows feature
ValueError: I/O operation on closed file


My configuration file is set up as follows:

[handler_file_detailed]
class:handlers.RotatingFileHandler
level:DEBUG
formatter:detailed
mode=a
maxsize=400
backcount=5
args:('python.log','a',400,5)
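
For reference, this is roughly what I believe the programmatic
equivalent of the above would look like (just a sketch to show what I
am after; the logger name and format are my own, not from my actual
code):

import logging
import logging.handlers

logger = logging.getLogger("myapp")
logger.setLevel(logging.DEBUG)

# Roll python.log over once it reaches ~400 bytes, keeping 5 backups
# (python.log.1 through python.log.5).
handler = logging.handlers.RotatingFileHandler(
    "python.log", mode="a", maxBytes=400, backupCount=5)
handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
logger.addHandler(handler)

for i in range(100):
    logger.debug("test message %d", i)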

I would appreciate any feedback on this subject.

Thanks,

Todd

-- 
http://mail.python.org/mailman/listinfo/python-list