Alex Martelli wrote:

> for x in whatever_other_iterable: yield x
>
> into (say)
>
> yield from whatever_other_iterable
>
> is minute and not worth changing the syntax (even though something like
> 'yield from' would mean no keywords would need to be added).

I agree that the improvement is minor, but I'll take what I can get.
I can also see a potential efficiency improvement: right now, each
result has to crawl its way up the stack of yields. For an 8 x 8
board, each final result gets yielded 8 times. A 'yield from' might be
able to splice the results directly into the output of the generator
using some kind of iterator-chain logic. I'm not sure how feasible
that is, however.
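
To make the cost concrete, here is a minimal sketch in the spirit of
my earlier example (not the code I posted; the names are invented): a
solution completed at row 8 gets re-yielded by every one of the 8
enclosing frames on its way out to the consumer.

    def place_queens(cols, size=8):
        # cols[i] is the column of the queen already placed on row i
        row = len(cols)
        if row == size:
            yield cols
            return
        for col in range(size):
            # keep col if it attacks no earlier queen (column or diagonal)
            if all(col != c and abs(col - c) != row - r
                   for r, c in enumerate(cols)):
                for solution in place_queens(cols + [col], size):
                    yield solution   # re-yielded once per enclosing frame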

A more compelling benefit would be some means of yielding a value
across multiple stack frames. Currently, generator semantics are
limited: they only allow co-routine behavior between a caller and the
generator it invokes directly.
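
For instance (an illustrative sketch, nothing more): the consumer
below co-routines with outer() directly, but anything inner() produces
reaches the consumer only because outer() relays it by hand; there is
no way for inner() to yield across the intervening frame.

    def inner():
        yield "computed deep in the call stack"

    def outer():
        # the intermediate frame must relay every value manually
        for value in inner():
            yield value

    for value in outer():
        print(value)   # the consumer only ever talks to outer()'s frame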

What I am more interested in, however, is the use of backtracking
search in Python and its application to more complex search problems.
The main benefit of the generator approach is the elimination of
callbacks, allowing the consumer of the search results to retain its
local state in a convenient way. (Anyone who's ever struggled with the
SAX API for XML parsing knows what I am talking about.)
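
As a contrived illustration (invented names, not the SAX API itself):
with callbacks, the consumer's state gets smeared across instance
attributes and reconstructed on every call, while with a generator it
stays in ordinary local variables.

    # Callback style: state must live on the handler object.
    class ResultHandler:
        def __init__(self):
            self.count = 0
        def on_result(self, result):
            self.count += 1        # state picked up from attributes each call

    # Generator style: state is just local variables in a plain loop.
    def consume(results):
        count = 0
        for result in results:
            count += 1
            if count >= 10:        # the consumer decides when to stop
                break
        return count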

One difference between these more complex problems and the simple
example I posted is that it isn't just simple recursion at work. Each
level of search may invoke a variety of different search strategies for
each sub-part of the problem, each of which will do the same with its
own portion.
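
A speculative, self-contained sketch of that shape (every name here is
invented): each level picks a strategy generator per sub-problem and
splices the sub-results into its own stream.

    def strategy_a(n):
        yield n + 1

    def strategy_b(n):
        yield n * 2
        yield n * 3

    def search(n, depth=0):
        if depth == 3:
            yield [n]
            return
        # a different strategy may apply to each sub-problem
        strategy = strategy_a if n % 2 else strategy_b
        for candidate in strategy(n):
            for path in search(candidate, depth + 1):
                yield [n] + path   # and again each result climbs the stack

    for path in search(1):
        print(path)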

Here's a thought experiment: What would it take to make Python the
premier language for AI research? (Not that I am proposing this as a
goal for the Python community; this is more in the way of a
brainstorming session.)

-- 
http://mail.python.org/mailman/listinfo/python-list
