> Now, instead of keeping that special status, it was decided to make
> it a reserved word since there's a new use case in Python 2.6 for
> it as well - catching exceptions:
>
> >>> try:
> ...     1/0
> ... except Exception as exc_object:
> ...     print exc_object
> ...
> integer division or modulo by zero
>
> The Python developers always try very hard not to introduce new
> keywords to the language, but every now and then, it's better to
> go with the addition and cause some breakage.
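For reference, the other Python 2.6 construct that binds a name with "as" is the "with" statement mentioned in the reply below; a minimal sketch (the file name is made up for illustration):

    with open("example.txt") as f:
        print f.read()      # the file object is bound via "as", just like exc_object above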
Thank you for finally answering my original question. If the above use (or that of "with"), for implementation reasons, *requires* "as" to be a keyword, then I can understand their decision to break Python. But what I can't understand is the decision to break 2.6 instead of 3.0. 2.x was supposed to remain backwards compatible, with the understanding that 2.x would be maintained in parallel for quite some time; 3.x was supposed to be the compatibility break. Not so, apparently.

> In this case, it's easy enough to find the files that break.
> Just run compileall.py on all your files and Python 2.6 will tell
> you which ones need fixing:
>
> python2.6 -c 'import compileall;compileall.compile_dir(".")'

But that assumes the source code is a closed set. Unfortunately, in our case, our code writes new Python code itself, in the form of document-like objects intended for future re-execution. In that sense, our code base is an open set relying upon Python's backward compatibility (albeit in very limited ways, but no source code can be immune to the introduction of new language keywords).

> This idea of CPython not being threads-capable is FUD. CPython
> works perfectly well in multi-threaded environments.

With all due respect, calling CPython out on the fact that it delivers the throughput of only a single CPU core even when N Python threads are running and N-1 idle cores are available on the system is not misleading FUD. It is the truth (a short sketch illustrating this follows at the end of this message).

With all due respect, calling CPython out on the fact that its developer-users cannot even work around this problem elegantly by instantiating multiple independent Python interpreters running concurrently within a single process (with limited but well-defined channels of communication between them) is not misleading FUD. It is also the truth.

Given that the next round of high-end workstations will likely have 12-16 cores, N CPython native threads will be more than an order of magnitude (>10-fold) less efficient than N Python-like threads on a true threads-capable VM such as the CLR. Three to four years later it will be 100-fold less efficient, and so on: a (1/2)^T geometric decline in relative performance, where T is the number of times core counts double. That is near-term reality, not misleading FUD.

That the powers-that-be consider the above situation working "perfectly well in multi-threaded environments" is rather telling. That the CPython 3.0 compatibility break was finalized without resolving CPython's thread performance issues should absolutely give rise to well-founded Fear, Uncertainty, and Doubt about the utility of the CPython 3.0 VM in the minds of current users who must deliver performant software for a living.

> There are, of course, situations where using a multi-threaded approach
> is not necessarily the right way to approach a problem.

Yes, we have been lectured about this endlessly. We are told that threads are bad for various reasons, that they are never the right solution, that we should be using shared memory, separate processes, or native code, or that real multithreading would break CPython library compatibility (!), and so on, all despite the fact that threads work perfectly well in other VMs, including some VMs that run Python dialects.

Threads are indeed the optimal solution to certain problems, and those problems are still not solvable with CPython 3.0. Is it too much to hold out hope for a native Pythonic solution to the multithreading performance issues inside the CPython VM itself? Only time will tell... but time is rapidly running out.
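For concreteness, here is a minimal sketch of the single-core behaviour described above. The function name, thread count, and workload are made up for illustration; the point is that for CPU-bound, pure-Python work, the threaded run takes roughly as long as the sequential one on CPython, because the GIL serializes bytecode execution regardless of how many cores sit idle:

    import threading
    import time

    def burn(n):
        # CPU-bound, pure-Python loop; the GIL serializes its execution
        # across threads, so extra cores cannot be used.
        total = 0
        for i in xrange(n):
            total += i * i
        return total

    N_THREADS = 4
    WORK = 2 * 10 ** 6

    start = time.time()
    for _ in range(N_THREADS):
        burn(WORK)
    print "sequential: %.2f seconds" % (time.time() - start)

    start = time.time()
    threads = [threading.Thread(target=burn, args=(WORK,))
               for _ in range(N_THREADS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print "threaded:   %.2f seconds" % (time.time() - start)

On a free-threaded VM one would expect the threaded run to approach 1/N of the sequential time on an N-core machine; under CPython it does not, which is precisely the complaint.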
Warren