Thomas Bellman wrote:

Torsten Bronger <[EMAIL PROTECTED]> wrote:


Just to amplify Thomas' statements...

...

And inflexibility will always make some situations horribly
daunting to get out of.

Powerful constructs like these can, in some cases, enable a
skilled package writer to design an API that reduces the amount
of boilerplate code needed for using the package.


...
The following is from the draft of my paper for my upcoming PyCon presentation.


We see similar patterns throughout the class-type redesign: the introduction of the __new__ method, which allowed for “hooking” the creation (as distinct from the initialization) of an object, and the __getattribute__ method, which allowed for hooking all attribute access. In each case, the operations were previously “part of the interpreter” and unavailable for customization by the Python programmer; the new protocols generalized the behavior, making objects themselves responsible for what was previously implemented by the interpreter, and thus was always exactly the same.
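Both hooks can be sketched in a few lines (modern Python 3 spelling; the class name and the counting behavior are invented purely for illustration):

```python
class Tracked:
    _instances = 0

    def __new__(cls, *args, **kwargs):
        # Hook creation (this runs *before* __init__) -- here we just
        # count how many instances have ever been created.
        cls._instances += 1
        return super().__new__(cls)

    def __getattribute__(self, name):
        # Hook *all* attribute access on instances; here we simply
        # delegate to the default machinery, but any policy could go here.
        return object.__getattribute__(self, name)

t1 = Tracked()
t2 = Tracked()
print(Tracked._instances)  # 2
```

Note that __getattribute__ must delegate to object.__getattribute__ (or super()) rather than read the attribute directly, or it would recurse forever, which is exactly the kind of sharp edge that used to live safely inside the interpreter.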

That sameness was arguably a strong advantage for Python in the early days of its life-cycle. When you were handed an object (particularly, any object defined by a class), it had simple, predictable behavior. The (one) class implementation was straightforward enough that most new users didn't particularly get lost in it. Of course, the flip side was the presence of “types”, which could behave very differently, and the need for people who wanted to customize the core behavior of an object (for instance to change how attribute traversal was handled) to write C extensions. And then there was the strange exception for attribute access for functions, but that was so natural and practical it seldom bit anyone who wasn't doing meta-programming.

Effectively, there were a few bits of “magic” that the Python programmer needed to understand, but the variety and complexity of the magical patterns were limited. The magic incantations could be learned readily by the acolyte, but there was a hard cap on what could be done without becoming a “wizard” and programming in C. The limits on functionality made Python a smaller and less complex language, but they also tended to frustrate the “wizards”, who, after all, might know C, but would rather program in Python, or spend their time looking at really deep C rather than creating yet another customization to support something just beyond the reach of the acolytes.

So, with Python 2.2, the wizards of Python made it possible to override and customize the behavior of objects in ways that were previously impossible. They brought things that were previously “magic” (for example, the order of attribute lookup that somehow created bound instance methods) into the realm of the understandable by rationalizing the process.
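The bound-method “magic” mentioned above is now just the descriptor protocol at work: a function is an object with a __get__ method, and class attribute lookup through an instance calls it. A rough demonstration (the class and method names are invented for illustration):

```python
class Greeter:
    def hello(self):
        return "hello"

g = Greeter()

# Attribute lookup finds the plain function in the class dict, then calls
# its __get__ method, producing a bound method -- no interpreter-private
# magic required.
func = Greeter.__dict__['hello']
bound = func.__get__(g, Greeter)

print(bound())  # hello
```

That is, `g.hello` is simply sugar for the `__get__` call above, which is the “rationalizing” referred to: the lookup order became an ordinary, overridable protocol.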

Metaclasses and descriptors are primarily of interest to metaprogrammers: people whose target audience is other programmers. Most programmers can get along perfectly well ignoring them, save for knowing what the particular ones they *use* do. Metaprogramming is not *necessarily* a *dark* art, but it can certainly lead the way to darkness if done improperly or wantonly. When done properly, you can build systems that seem to map a problem domain perfectly into the familiar embrace of everyday Python, with just the right "lived in" touches to make it seem as though the language was designed from the start to support that domain [leaving aside any questions of syntactic variants, which is its own involved debate].
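As one small, invented illustration of that kind of API design: a metaclass can register every subclass a user writes, so the package consumer writes plain, everyday-looking classes and the registration plumbing disappears entirely:

```python
class Registry(type):
    plugins = {}

    def __init__(cls, name, bases, namespace):
        super().__init__(name, bases, namespace)
        # Every class created with this metaclass registers itself;
        # the user never writes any registration boilerplate.
        if bases:  # skip the abstract base class itself
            Registry.plugins[name] = cls

class Plugin(metaclass=Registry):
    pass

# Users of the (hypothetical) package just subclass Plugin...
class CSVLoader(Plugin):
    pass

class JSONLoader(Plugin):
    pass

# ...and the package finds them automatically.
print(sorted(Registry.plugins))  # ['CSVLoader', 'JSONLoader']
```

The consumer of such a package never sees the metaclass at all; the language simply appears to "know about" their loaders, which is exactly the lived-in effect described above.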

When you have metaprogramming to do, it is *such* a relief when you can get it done in Python without resorting to C.

Have fun all,
Mike

________________________________________________
 Mike C. Fletcher
 Designer, VR Plumber, Coder
 http://www.vrplumber.com
 http://blog.vrplumber.com
                             PyCon is coming...

--
http://mail.python.org/mailman/listinfo/python-list
