On Sat, 12 Mar 2005, Steven Bethard wrote:
The goals behind this seem a lot like the goals of PEP 288[1].  I
remember discussions suggesting code like:

def gen():
   # hypothetical syntax from that discussion: c defaults to 3 when
   # next() supplies only two values
   a, b, c=3 = yield 1
   yield a + b*c

g = gen()
print g.next() # prints 1
print g.next(1, 2) # prints 7

But as you can see, this creates a weird asymmetry because the last
yield throws away its arguments, and depending on how the generator is
written, different calls to next may require a different number of
arguments.  This means that, unless the code is extremely well
documented, you have to read the source code for the generator to know
how to call it.
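
For concreteness, here is a rough simulation of that calling pattern in current Python (the Channel class is only an illustrative stand-in, not part of any proposal): an explicit mutable object plays the role of the proposed multi-argument next(), which makes the hidden calling protocol visible, i.e. exactly the documentation burden described above.

class Channel:
   """Mutable slot the caller fills in before resuming the generator."""
   def __init__(self):
      self.args = ()

def gen(channel):
   yield 1                  # caller sees 1, then fills in channel.args
   a, b = channel.args      # whatever the caller would have passed to next
   c = 3                    # the default from the example above
   yield a + b*c

channel = Channel()
g = gen(channel)
print g.next()              # prints 1
channel.args = (1, 2)       # stands in for the proposed g.next(1, 2)
print g.next()              # prints 7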

The intention of my proposal was for using generators with 'for' loops. In this case, the generator runs to completion, so the arguments to the last yield are never thrown away. If 'next' were not able to take any arguments, that would be compatible with my proposal.


There was also the issue of an asymmetry because the first call to 'next' does not take any arguments. This asymmetry does not exist, however, when using the generator in a 'for' loop, because there is no "first" call to 'continue' in such a case.
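
To make the "runs to completion" point concrete: today a 'for' loop is roughly the while loop below, so unless the body breaks out early, the generator is always driven all the way to StopIteration.

# roughly what "for item in g: <body>" does today
g = iter([1, 2, 3])         # any iterator or generator object
while True:
   try:
      item = g.next()
   except StopIteration:
      break
   # <body> runs here with item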

Because of these and other complications, I believe the PEP is now
lobbying for a way to get the generator instance object and a way to
cause an exception to be thrown from outside the generator.  Take a
look and see if the PEP might meet your needs -- I haven't seen much
action on it recently, but it seems much less invasive than your
proposal...

This PEP solves similar problems, yes. And I would agree that my proposal is much more invasive on Python's implementation. From the users' point of view, however, I think it is much less invasive. For example, no doubt there will be many users who write a generator that is meant to be used in a 'for' loop and are baffled when they receive a syntax error on trying to write some try/finally cleanup code. With the PEP, they would have to figure out that they have to use the 'throw' method of generators to trigger the cleanup code (and then remember to call it each time they are done with the generator). With this proposal, try/finally would just work as they expect and they would be none the wiser.
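
To illustrate the syntax error in question: in current Python (2.4), yield is not allowed in the try part of a try/finally, so the cleanup one would naturally write for a cut-down variant of the pickled_file example below is rejected at compile time.

import pickle

def pickled_file(name):
   f = open(name, 'r')
   try:
      yield pickle.load(f)   # SyntaxError in 2.4: yield is not allowed
   finally:                  # in the try of a try/finally
      f.close()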


import pickle

def pickled_file(name):
   # hypothetical PEP 288-style hook: get the running generator's
   # instance object from inside the generator
   self = mygen.get_instance()
   f = open(name, 'r')
   yield pickle.load(f)
   f.close()
   f = open(name, 'w')
   pickle.dump(self.l, f)   # self.l must have been set by the caller
   f.close()


And this would be written something like:

gen = pickled_file('greetings.pickle')
for l in gen:
   l.append('hello')
   l.append('howdy')
   # store the result on the generator object itself; the generator reads
   # it back as self.l when the loop resumes it for the last time
   gen.l = l

Personally, I find this use of a generator thoroughly confusing, and I
don't see what you gain from it.  The PEP 288 examples are perhaps
somewhat more convincing though...

The disadvantage of doing it this way (or with a class wrapping the generator) is that it is implicit. If I were reading the pickled_file code, I would have no idea where self.l comes from. If it is coming from the 'for' loop, why not be able to say that explicitly?
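
For comparison, here is a sketch of the class-wrapping alternative (the names are illustrative, not from either proposal); the reader still has to discover, by reading the generator body, that the attribute set inside the loop gets picked up again afterwards:

import pickle

class PickledFile:
   def __init__(self, name):
      self.name = name
      self.l = None

   def __iter__(self):
      f = open(self.name, 'r')
      yield pickle.load(f)
      f.close()
      f = open(self.name, 'w')
      pickle.dump(self.l, f)   # relies on the caller having set self.l
      f.close()

p = PickledFile('greetings.pickle')
for l in p:
   l.append('hello')
   l.append('howdy')
   p.l = l                     # just as implicit as gen.l / self.l above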


I agree that this is a confusing way to use generators. But it is the expected way to use "code blocks" as found in other languages. It would take some getting used to that 'for' can be used this way, but I think it would be worth it.

-Brian