One small clarification:

On Tue, May 5, 2015 at 12:40 PM, Jim J. Jewett <jimjjew...@gmail.com> wrote:

> [...] but I don't understand how this limitation works with things like a
> per-line file iterator that might need to wait for the file to
> be initially opened.
>

 Note that PEP 492 makes it syntactically impossible to use a coroutine
function to implement an iterator using yield; this is because the
generator machinery is needed to implement the coroutine machinery.
However, the PEP allows the creation of asynchronous iterators using
classes that implement __aiter__ and __anext__. Any blocking you need to do
(such as waiting for the underlying resource to be opened) can happen in
either of those methods. You just use `async for` to iterate over such an
"asynchronous stream".

(There's an issue with actually implementing an asynchronous stream mapped
to a disk file, because I/O multiplexing primitives like select() don't
actually support waiting for disk files -- but this is an unrelated
problem, and asynchronous streams are useful to handle I/O to/from network
connections, subprocesses (pipes), or local RPC connections. Check out the
streams <https://docs.python.org/3/library/asyncio-stream.html> and
subprocess <https://docs.python.org/3/library/asyncio-subprocess.html>
submodules of the asyncio package. These streams would be great candidates
for adding __aiter__/__anext__ to support async for-loops, so the idiom for
iterating over them can once again closely resemble the idiom for iterating
over regular (synchronous) streams using for-loops.)
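
To illustrate what that could look like, here is a rough sketch of a wrapper
that bolts __aiter__/__anext__ onto an asyncio StreamReader so a network
connection can be read line by line with `async for`. (LineStream and
print_lines are invented names; a native implementation on the stream classes
themselves would make such a wrapper unnecessary.)

import asyncio

class LineStream:
    # Hypothetical wrapper exposing an asyncio StreamReader as an
    # asynchronous iterator over lines.
    def __init__(self, reader):
        self._reader = reader

    def __aiter__(self):
        return self

    async def __anext__(self):
        line = await self._reader.readline()
        if not line:
            # readline() returns b'' at EOF.
            raise StopAsyncIteration
        return line

async def print_lines(host, port):
    # Reads lines from a network connection, mirroring the familiar
    # 'for line in f:' idiom for synchronous files.
    reader, writer = await asyncio.open_connection(host, port)
    try:
        async for line in LineStream(reader):
            print(line.decode().rstrip())
    finally:
        writer.close()
        await writer.wait_closed()

# e.g. asyncio.run(print_lines("localhost", 8888))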

-- 
--Guido van Rossum (python.org/~guido)