I'm trying to create an object that works like a generator and delegates to a 
generator for its implementation, but can also participate in yield from (using 
the 3.3 beta).  I want the wrapper object to cache some additional information - 
such as whether or not the underlying generator has completed - and to provide 
some extra methods for interacting with the state of the generator.  But this 
currently doesn't seem possible, because when a user-defined iterator raises 
StopIteration with a value, that value is ignored as far as yield from is 
concerned.

class C:
    def __iter__(self):
        return self
    def __next__(self):
        raise StopIteration(100)


def g():
    if False:
        yield 100   # never executed; only here to make g a generator function
    return 100

def f(val):
    x = yield from val
    print('x', x)

print(list(f(C())))
print(list(f(g())))

Which outputs:
x None
[]
x 100
[]

So you can see that in the C case the value I attach to the StopIteration is 
ignored, while the value returned from the generator is propagated out.  From my 
reading of PEP 380, the behavior is incorrect for the user-defined iterator 
case: next(iter(C())) raises a StopIteration carrying a value, and according to 
the formal semantics that value should become the result of the yield from 
expression.
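To be concrete, the value really is there on the exception if I drive the 
iterator by hand (just checking at the prompt):

it = iter(C())
try:
    next(it)
except StopIteration as e:
    print(e.value)    # prints 100

so per the expansion in the PEP I'd expect yield from to produce 100 here as 
well.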

OK, looking at the implementation, this seems to be because PyIter_Next clears 
the exception, which prevents it from ever being seen in the yield from code 
path.  So should yield from just call "(*iter->ob_type->tp_iternext)(iter);" 
directly and bypass the error-checking code?  Or am I wrong and this is the 
intended behavior?
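
For what it's worth, the only workaround I've found is to re-drive the iterator 
from a real generator so the value goes through the generator path - a sketch 
(_unwrap is just my name for it):

def _unwrap(it):
    # Yield everything the iterator produces; when it stops, return the
    # StopIteration value so it becomes the result of the enclosing yield from.
    while True:
        try:
            v = next(it)
        except StopIteration as e:
            return e.value
        yield v

print(list(f(_unwrap(C()))))    # now prints "x 100" then "[]"

but having to wrap every iterator like that seems to defeat the point.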
