On Sun, 03 Jul 2005 19:19:05 +0000, Bengt Richter wrote:

> On Sun, 03 Jul 2005 11:47:07 +1000, Steven D'Aprano
> <[EMAIL PROTECTED]> wrote:
>
>> On Fri, 01 Jul 2005 12:59:20 -0400, François Pinard wrote:
>>
>>> [Peter Hansen]
>>>> Mike Meyer wrote:
>>>> > Yes. I once grabbed an old program that did assignments to None.
>>>> > But that's always been a bad idea.
>>>
>>>> What was the use case!?
>>>
>>> People used to assign None to itself as a keyword argument in
>>> function headers. The goal was to make a local copy of the
>>> reference, which was then accessed faster than the global thing.
>>
>> Can you say "premature optimization is the root of all evil"?
>>
>> I'd like to see the profiling that demonstrated that this made a
>> significant -- or even measurable -- speed-up in anything but the
>> most unusual cases.
>
> The difference between local and global access is definitely
> measurable, though there's no reason to use None as the local name if
> you want to do that kind of optimization (not possible in 2.4+)
[snip]
> about 25% longer to get a global (AND bind it locally, which the two
> timings share) than to do the same for a local, it seems.
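For anyone who hasn't seen the idiom, it looked something like this (a
sketch only; the original code isn't quoted here, and the function name
is made up):

    # The pre-2.4 trick: bind a global/builtin to a default argument so
    # that lookups inside the function compile to the fast LOAD_FAST
    # opcode instead of LOAD_GLOBAL. Naming the parameter None is a
    # SyntaxError in Python 2.4 and later, hence the commented-out
    # first form.
    #
    # def scrub(items, None=None):    # illegal in 2.4+
    #     return [x for x in items if x is not None]

    def scrub(items, _none=None):
        # _none is a local, bound to the builtin None at call time
        return [x for x in items if x is not _none]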
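And Bengt's numbers are easy enough to reproduce with timeit; something
like this (an untested sketch in 2.x syntax; the absolute figures will
vary from machine to machine):

    import timeit

    # Both statements bind a local x one million times; the only
    # difference is whether the right-hand name is resolved as a local
    # (LOAD_FAST) or has to be fetched from globals/builtins.
    # Note: names assigned in timeit's setup string become locals of
    # the timing loop, which is what makes the comparison work.
    t_global = timeit.Timer("x = len").timeit(1000000)
    t_local = timeit.Timer("x = f", setup="f = len").timeit(1000000)

    print "global lookup: %.4f sec" % t_global
    print "local lookup:  %.4f sec" % t_local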
Sure. And if you are actually looping over one million bindings to your
local variable, and doing NOTHING else, you may approach a 25% time
saving. That is, one of the unusual cases I mentioned.

But in real-world usage, the 25% saving in fetching the variable once
or twice is almost certainly lost in the noise of the rest of your
code. Saving 25% of 0.0000001 second of running time in a function that
takes 0.0001 second in total to run is pointless.

That's why I asked about the profiling. I'd like to see what sort of
real-world function got enough real benefit from setting None=None to
make up for the misuse of the language.

--
Steven.

--
http://mail.python.org/mailman/listinfo/python-list