I have a class with a method meant to verify internal program logic (not data supplied by the caller). Because the check is time-consuming but optional, I treat it as a complex assertion statement and optimize it away when __debug__ is false:
class Parrot:
    def __init__(self, *args):
        print "Initialising instance..."
        if __debug__:
            self.verify()  # check internal program state, not args

    if __debug__:
        def verify(self):
            print "Verifying..."

If I run Python normally, I can do this:

>>> p = Parrot()
Initialising instance...
Verifying...
>>> p.verify()
Verifying...

and if I run Python with the -O flag, I get this:

>>> p = Parrot()
Initialising instance...
>>> p.verify()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: Parrot instance has no attribute 'verify'

This is the behaviour I want, but I haven't seen it before in other code. What do people think -- is it a good idea or a bad one? If you think it is a bad idea for verify to disappear under optimization, would you change your mind if it were called __verify instead?

One possible alternative is to do something like this:

import warnings

class Parrot:
    def __init__(self, *args):
        print "Initialising instance..."
        if __debug__:
            self.verify()

    def verify(self):
        if __debug__:
            print "Verifying..."
            return None  # this is optional
        else:
            warnings.warn("verify() is a null op")

which means that Parrot instances always have a verify method, even if it does nothing. I'm not sure I like that. What do others think? Which do you consider the better design?

-- 
Steven
-- 
http://mail.python.org/mailman/listinfo/python-list
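[A third variation, not from the original post, for comparison: since the check is described above as "a complex assertion statement", one could keep verify() as an ordinary, always-present method and guard the call with a plain assert. Under -O, __debug__ is False and the entire assert statement -- including the call -- is compiled out, so the expensive check costs nothing in optimized runs, yet the method never disappears. The class below is a sketch of that idea, written with print() calls so it runs under both Python 2 and 3:]

```python
class Parrot(object):
    def __init__(self, *args):
        print("Initialising instance...")
        # Under -O the whole assert statement, including the call to
        # verify(), is stripped by the compiler -- but verify() itself
        # remains a normal attribute of every instance.
        assert self.verify()

    def verify(self):
        # Check internal program state here; must return a true value,
        # otherwise the assert in __init__ raises AssertionError.
        print("Verifying...")
        return True
```

[With this sketch, hasattr(p, 'verify') is true in both normal and optimized runs, so callers and introspection tools see a stable interface; only the automatic check in __init__ is elided.]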