Steven D'Aprano wrote:
> But in this specific instance, I don't see any advantage to explicitly
> testing the length of a list. Antoon might think that is sufficiently
> polymorphic, but it isn't. He cares whether the object has zero
> _length_, but for true polymorphism, he should be caring about whether
> the object is _empty_. Not all empty objects have zero length, or even
> a length at all. (E.g. binary trees.)
Conversely, not all objects that have a length consider zero length to be
false. Whether you test with "if a:" or "if len(a) > 0", some objects are
going to be denied.

Thing is, objects that don't have a length have almost no overlapping uses
with lists (i.e., you'd hardly ever write a function that could take either
an int or a list, unless you type-check the argument, use only
object-protocol stuff like id() and getattr(), or pass it to another
function that does the same). Iterators do have overlapping uses with
lists, but "if a:" doesn't work for them, so that point is moot. OTOH,
objects that have a length but don't consider zero length to be false
(numpy arrays) do have overlapping uses with lists.

So, as a practical matter, I'd be inclined to side with Antoon on this
issue, even if it only increases polymorphism for certain people. "if a:"
almost never increases polymorphism, because almost no length-less objects
would work in such a function anyway. Even if you don't use numpy arrays,
there's little practical benefit to "if a:" except saved typing; if you do
use numpy, it limits polymorphism. "if len(a) > 0" does increase
polymorphism, because it allows for objects that have a length but don't
equate empty with false.

P.S. Binary trees do have a length: it's the number of nodes, just as the
number of keys is the length of a dict. I can't think of any objects that
support indexing but don't have a length, except for poorly implemented
proxy objects. It's possible to define such types, but would that be a
Pythonic use of indexing?

Carl Banks
--
http://mail.python.org/mailman/listinfo/python-list
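(A small sketch of the point about numpy-style objects, without requiring
numpy itself: `Chunk` here is a hypothetical stand-in for an array type that
has a length but raises on ambiguous truth testing, the way `bool()` does on
a multi-element numpy array. Names are illustrative, not from any library.)

```python
class Chunk:
    """Hypothetical numpy-like container: it has a length, but an
    empty one is not simply false -- truth testing raises instead."""
    def __init__(self, items):
        self._items = list(items)

    def __len__(self):
        return len(self._items)

    def __getitem__(self, i):
        return self._items[i]

    def __bool__(self):
        # Mimics numpy's "truth value of an array is ambiguous" error.
        raise ValueError("truth value of a Chunk is ambiguous; use len()")


def head_or_default(seq, default=None):
    # "if seq:" would raise ValueError for a Chunk; the explicit
    # length test works for lists and Chunks alike.
    if len(seq) > 0:
        return seq[0]
    return default


print(head_or_default([1, 2, 3]))   # 1
print(head_or_default(Chunk([])))   # None
```

So the `len()` version of the test is the one that stays polymorphic across
both kinds of sized containers, which is the practical argument above.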