[issue42838] Wait for cleanup coroutines before event loop is closed.

2022-03-18 Thread xloem


Change by xloem <0xl...@gmail.com>:


--
resolution: wont fix -> out of date

___
Python tracker 

___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue42838] Wait for cleanup coroutines before event loop is closed.

2022-03-18 Thread xloem


Change by xloem <0xl...@gmail.com>:


--
resolution: out of date -> wont fix




[issue42838] Wait for cleanup coroutines before event loop is closed.

2022-03-18 Thread xloem


xloem <0xl...@gmail.com> added the comment:

Hey, I don't have the capacity to stay on this, but thanks for the attention, 
time, and clear response.

There are of course other situations, such as returning a resource to library 
code or manual loop management, but I no longer have the use case at hand.

Maybe an aatexit library could patch this in if needed.

--
status: open -> closed




[issue42838] Wait for cleanup coroutines before event loop is closed.

2022-03-17 Thread Andrew Svetlov


Andrew Svetlov  added the comment:

asyncio.run() has been present since Python 3.7.
After the main coroutine finishes, it cancels all background tasks and waits 
for their termination.

Background tasks can use good old `try/finally` for resource cleanup.
An additional API is not required IMHO.
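A minimal runnable sketch of the behaviour Andrew describes (the names and timings here are illustrative, not from the thread):

```python
import asyncio

cleaned_up = []

async def background():
    try:
        await asyncio.sleep(3600)  # pretend to run "forever"
    finally:
        # asyncio.run() cancels this task after main() returns and waits
        # for it, so the finally block is a reliable place for cleanup
        cleaned_up.append("resource released")

async def main():
    asyncio.create_task(background())
    await asyncio.sleep(0.1)  # give the background task a chance to start

asyncio.run(main())  # cancels leftover tasks, then waits for their termination
print(cleaned_up)
```

Note the short sleep in main(): a task cancelled before it ever ran would never enter its try block, so the cleanup only fires for tasks that actually started.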

--




[issue42838] Wait for cleanup coroutines before event loop is closed.

2022-03-17 Thread xloem


xloem <0xl...@gmail.com> added the comment:

I'm sorry, is this closure an error? Could you explain more about how to use 
asyncio.run() to clean up resources when an unhandled exception is thrown? Is 
this new with a recent Python improvement?

--
status: closed -> open




[issue42838] Wait for cleanup coroutines before event loop is closed.

2022-03-17 Thread Andrew Svetlov


Andrew Svetlov  added the comment:

asyncio.run() does the job

--
resolution:  -> out of date
stage:  -> resolved
status: open -> closed




[issue46341] duplicate paragraphs - asyncio Coroutines and Tasks file

2022-03-07 Thread Serhiy Storchaka


Change by Serhiy Storchaka :


--
resolution:  -> duplicate
stage: needs patch -> resolved
status: open -> closed
superseder:  -> Fix incorrect use of directives in asyncio documentation




[issue46341] duplicate paragraphs - asyncio Coroutines and Tasks file

2022-03-07 Thread Kumar Aditya


Change by Kumar Aditya :


--
versions: +Python 3.11 -Python 3.10




[issue46341] duplicate paragraphs - asyncio Coroutines and Tasks file

2022-03-07 Thread Kumar Aditya


New submission from Kumar Aditya :

This has been fixed in GH-31388. This can be closed now.

--
nosy: +AlexWaygood, kumaraditya303




[issue46622] Add an async variant of lru_cache for coroutines.

2022-02-24 Thread Raymond Hettinger


Raymond Hettinger  added the comment:

[Andrew Svetlov]
> A third-party library should either copy all these 
> implementation details or import a private function from stdlib 

OrderedDict provides just about everything needed to roll lru cache variants.  
It simply isn't true that this can only be done efficiently in the standard library.
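To illustrate Raymond's point, a bare-bones LRU cache can be rolled from OrderedDict in a few lines. This sketch is illustrative, not from the thread, and omits the typed/locking/kwargs handling of functools.lru_cache:

```python
from collections import OrderedDict
from functools import wraps

def simple_lru(maxsize=128):
    """Bare-bones LRU cache built on OrderedDict (positional args only)."""
    def decorating_function(func):
        cache = OrderedDict()
        @wraps(func)
        def wrapper(*args):
            if args in cache:
                cache.move_to_end(args)       # mark as most recently used
                return cache[args]
            result = func(*args)
            cache[args] = result
            if len(cache) > maxsize:
                cache.popitem(last=False)     # evict the least recently used
            return result
        return wrapper
    return decorating_function

@simple_lru(maxsize=2)
def square(x):
    return x * x
```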


[Serhiy]
> it would be simpler to add a decorator which wraps the result 
> of an asynchronous function into an object which can be awaited
> more than once:

This is much more sensible.

> It can be combined with lru_cache, cached_property, or any third-party
> caching decorator. No access to internals of the cache is needed.

Right.  The premise that this can only be done in the standard library was 
false.

> async_lru_cache() and async_cached_property() can be written 
> using that decorator. 

The task becomes trivially easy :-)  


[Andrew Svetlov]
> Pull Request is welcome!

ISTM it was premature to ask for a PR before an idea has been thought through.  
We risk wasting a user's time or committing too early before simpler, better 
designed alternatives emerge.

--




[issue46622] Add an async variant of lru_cache for coroutines.

2022-02-24 Thread Joongi Kim


Change by Joongi Kim :


--
nosy: +achimnol




[issue46622] Add an async variant of lru_cache for coroutines.

2022-02-23 Thread Serhiy Storchaka


Serhiy Storchaka  added the comment:

async_lru_cache() and async_cached_property() can be written using that 
decorator. The implementation of async_lru_cache() is complicated because the 
interface of lru_cache() is complicated. But it is simpler than using 
_lru_cache_wrapper().

def async_lru_cache(maxsize=128, typed=False):
    if callable(maxsize) and isinstance(typed, bool):
        user_function, maxsize = maxsize, 128
        return lru_cache(maxsize, typed)(reawaitable(user_function))

    def decorating_function(user_function):
        return lru_cache(maxsize, typed)(reawaitable(user_function))

    return decorating_function

def async_cached_property(user_function):
    return cached_property(reawaitable(user_function))

--




[issue46622] Add an async variant of lru_cache for coroutines.

2022-02-23 Thread Serhiy Storchaka


Serhiy Storchaka  added the comment:

I think that it would be simpler to add a decorator which wraps the result of 
an asynchronous function into an object which can be awaited more than once:

def reawaitable(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        return CachedAwaitable(func(*args, **kwargs))
    return wrapper

It can be combined with lru_cache, cached_property, or any third-party caching 
decorator. No access to the internals of the cache is needed.

@lru_cache()
@reawaitable
async def coro(...):
    ...

@cached_property
@reawaitable
async def prop(self):
    ...
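`CachedAwaitable` is not defined in the message; the following is my assumption of the intent, as a runnable sketch that handles only sequential awaiters (a real implementation would also need to cope with concurrent awaits of a still-running coroutine):

```python
import asyncio
from functools import lru_cache, wraps

class CachedAwaitable:
    """Run the wrapped coroutine once and replay its result on later awaits."""
    _missing = object()  # sentinel, since None is a legitimate result

    def __init__(self, coro):
        self._coro = coro
        self._result = self._missing

    def __await__(self):
        if self._result is self._missing:
            self._result = yield from self._coro.__await__()
        return self._result

def reawaitable(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        return CachedAwaitable(func(*args, **kwargs))
    return wrapper

calls = []

@lru_cache()
@reawaitable
async def fetch(key):
    calls.append(key)   # record real executions
    return key.upper()

async def main():
    first = await fetch("spam")
    second = await fetch("spam")  # same cached awaitable, awaited again
    return first, second

result = asyncio.run(main())
print(result, calls)
```

With lru_cache returning the same CachedAwaitable for identical arguments, the coroutine body runs once even though it is awaited twice.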

--




[issue46622] Add an async variant of lru_cache for coroutines.

2022-02-23 Thread Tzu-ping Chung


Tzu-ping Chung  added the comment:

Another thing to point out is that the existing third-party solutions (both 
alru_cache and cached_property) only work with asyncio, so the stdlib version 
(as implemented now) would be an improvement. And if the position is that 
improvements should only be submitted to third-party solutions, I would have to 
disagree: both lru_cache and cached_property had third-party solutions 
predating their stdlib implementations, and it is a double standard IMO if an 
async solution is kept out while the sync version is kept in.

--




[issue46622] Add an async variant of lru_cache for coroutines.

2022-02-23 Thread Andrew Svetlov


Andrew Svetlov  added the comment:

Thanks, Raymond.

I agree that caching of iterators and generators is out of the issue scope.

Also, I agree that a separate async cache decorator should be added. I prefer 
the `async_lru_cache` (and maybe `async_cache` for the API symmetry). We have 
`contextmanager` and `asynccontextmanager` in contextlib already along with 
`closing` / `aclosing`, `ExitStack` / `AsyncExitStack` etc.

`async_lru_cache` should have the same arguments as accepted by `lru_cache` but 
work with async functions.

I think this function should be a part of the stdlib because the implementation 
shares the *internal* `_lru_cache_wrapper` that does all the dirty work (and 
has a C accelerator). A third-party library would have to either copy all these 
implementation details or import a private function from the stdlib and keep 
its fingers crossed in the hope that the private API keeps backward 
compatibility in future Python versions.

Similar reasons were applied to contextlib async APIs.

Third parties can have different features (time-to-live, expiration events, 
etc., etc.) and can be async-framework specific (work with asyncio or trio 
only) -- I don't care about these extensions here.

My point is: the stdlib has built-in LRU cache support, and I love it. Let's 
add exactly what we already have for sync functions, but for async ones.

--




[issue46622] Add an async variant of lru_cache for coroutines.

2022-02-22 Thread Raymond Hettinger


Change by Raymond Hettinger :


--
title: Add a async variant of lru_cache for coroutines. -> Add an async variant 
of lru_cache for coroutines.




[issue46622] Add a async variant of lru_cache for coroutines.

2022-02-22 Thread Raymond Hettinger


Raymond Hettinger  added the comment:

If this goes forward, my strong preference is to have a separate async_lru() 
function  just like the referenced external project.

For non-async uses, overloading the current lru_cache makes it confusing to 
reason about. It becomes harder to describe clearly what the caches do or to 
show a pure Python equivalent. People are already challenged to digest the 
current capabilities and find it difficult to reason about when references are 
held. I don't want to add complexity, expand the docs to be more voluminous, 
cover the new corner cases, and add examples that don't make sense to non-async 
users (i.e. the vast majority of Python users). Nor do I want to update the 
recipes for lru_cache variants to all be async aware. So please keep this 
separate (it is okay to share some of the underlying implementation, but the 
APIs should be distinct).

Also as a matter of fair and equitable policy, I am concerned about taking the 
core of a popular external project and putting in the standard library.  That 
would tend to kill the external project, either stealing all its users or 
leaving it as something that only offers a few incremental features above those 
in the standard library.  That is profoundly unfair to the people who created, 
maintained, and promoted the project.

Various SC members periodically voice a desire to move functionality *out* of 
the standard library and into PyPI rather than the reverse.  If a popular 
external package is meeting needs, perhaps it should be left alone.

As noted by the other respondents, caching sync and async iterators/generators 
is venturing out on thin ice. Even if it could be done reliably (which is 
dubious), it is likely to be bug bait for users. Remember, we already get 
people who try to cache time(), random(), or other impure functions. They cache 
methods and cannot understand why a reference is held for the instance. 
Assuredly, coroutines and futures will encounter even more misunderstandings.

Also, automatic reiterability is a can of worms and would likely require a PEP. 
Every time the subject has come up before, we've decided not to go there. Let's 
not make a tool that makes users' lives worse. Not everything should be cached. 
Even if we can, it doesn't mean we should.

--
title: Support  decorating a coroutine with functools.lru_cache -> Add a async 
variant of lru_cache for coroutines.




[issue44834] contextvars.Context.run w/ coroutines gives inconsistent behavior

2022-02-17 Thread Sebastián Ramírez

Change by Sebastián Ramírez :


--
nosy: +tiangolo




[issue46609] Generator-based coroutines in Python 3.10, 3.9 docs

2022-02-04 Thread Terry J. Reedy


Change by Terry J. Reedy :


--
nosy:  -miss-islington
resolution:  -> fixed
stage: patch review -> resolved
status: open -> closed
title: Generator-based coroutines in Python 3.10 docs -> Generator-based 
coroutines in Python 3.10, 3.9 docs




[issue46609] Generator-based coroutines in Python 3.10 docs

2022-02-04 Thread Terry J. Reedy


Terry J. Reedy  added the comment:


New changeset 459e26f0987a12a19238baba422e13a8f7fcfca3 by Miss Islington (bot) 
in branch '3.9':
[3.9] bpo-46609: Update asyncio-task coroutine doc (GH-31132) 
https://github.com/python/cpython/commit/459e26f0987a12a19238baba422e13a8f7fcfca3


--




[issue46609] Generator-based coroutines in Python 3.10 docs

2022-02-04 Thread Terry J. Reedy


Terry J. Reedy  added the comment:


New changeset 5603db43ba7ba5568b7516d0e28730a2bc1e1f26 by Terry Jan Reedy in 
branch '3.10':
[3.10] bpo-46609: Update asyncio-task coroutine doc (GH-31132)
https://github.com/python/cpython/commit/5603db43ba7ba5568b7516d0e28730a2bc1e1f26


--




[issue46609] Generator-based coroutines in Python 3.10 docs

2022-02-04 Thread miss-islington


Change by miss-islington :


--
nosy: +miss-islington
nosy_count: 5.0 -> 6.0
pull_requests: +29312
pull_request: https://github.com/python/cpython/pull/31133




[issue46609] Generator-based coroutines in Python 3.10 docs

2022-02-04 Thread Terry J. Reedy


Change by Terry J. Reedy :


--
keywords: +patch
pull_requests: +29311
stage: needs patch -> patch review
pull_request: https://github.com/python/cpython/pull/31132




[issue46609] Generator-based coroutines in Python 3.10 docs

2022-02-04 Thread Terry J. Reedy


Terry J. Reedy  added the comment:

Looking at the doc answered the question.  Further down in the 3.10 version, 
'3.10' was revised to '3.11', and indeed, the decorator and the entire section 
are gone in 3.11.  There was no change in the 3.9 doc, but there should be.  I 
will submit a PR for 3.10 and try to remember to make an additional change in 
the 3.9 backport.

The docs for iscoroutine and ...function are gone because they have no use in 
new 3.11+ code.  The functions remain, though, as they will still work, even 
though redundant.

--
stage:  -> needs patch
type:  -> behavior
versions: +Python 3.9




[issue46609] Generator-based coroutines in Python 3.10 docs

2022-02-04 Thread Terry J. Reedy


Terry J. Reedy  added the comment:

Yuri or Andrew: either of you know the fix for "Support for generator-based 
coroutines is deprecated and is scheduled for removal in Python 3.10."?

--
nosy: +asvetlov, terry.reedy, yselivanov




[issue46609] Generator-based coroutines in Python 3.10 docs

2022-02-02 Thread Sebastian Rittau


New submission from Sebastian Rittau :

Currently, the Python 3.10.2 documentation at 
https://docs.python.org/3/library/asyncio-task.html?highlight=coroutine#asyncio.coroutine
 says:

"Note: Support for generator-based coroutines is deprecated and is scheduled 
for removal in Python 3.10."

Python 3.10 still has support for those (although it emits a warning), so the 
note should be updated.

--
assignee: docs@python
components: Documentation
messages: 412352
nosy: docs@python, srittau
priority: normal
severity: normal
status: open
title: Generator-based coroutines in Python 3.10 docs
versions: Python 3.10




[issue46568] non awaited coroutines on a IsolatedAsyncioTestCase results on a RuntimeWarning

2022-01-29 Thread Andrew Svetlov


Andrew Svetlov  added the comment:

You are welcome!

--




[issue46568] non awaited coroutines on a IsolatedAsyncioTestCase results on a RuntimeWarning

2022-01-29 Thread Andrew Svetlov


Change by Andrew Svetlov :


--
resolution:  -> not a bug
stage:  -> resolved
status: open -> closed




[issue46568] non awaited coroutines on a IsolatedAsyncioTestCase results on a RuntimeWarning

2022-01-29 Thread bluecarrot


bluecarrot  added the comment:

You are absolutely correct. Thank you very much!

--




[issue46568] non awaited coroutines on a IsolatedAsyncioTestCase results on a RuntimeWarning

2022-01-29 Thread Andrew Svetlov


Andrew Svetlov  added the comment:

Your version works but can be simplified.

Just use

await writer.drain()  
writer.write(data)

without grabbing the drainer early.
The purpose of the .drain() method is to pause writing if the write buffer 
size is greater than the high watermark. 
The 'await writer.drain()' waits until the buffer size becomes less than the 
low watermark. It prevents uncontrollable write buffer growth if a peer cannot 
accept TCP messages as fast as `writer.write()` sends them.
The .drain() call has no hidden process under the hood, so there is no need to 
grab the writer_drain reference as early as possible. It is just 'waiting for a 
flag'.
Also, there is no need for `await writer_drain` at the end: the asyncio 
transport sends all data from the internal write buffer before closing (though 
it doesn't on 'transport.abort()').
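The backpressure behaviour described above can be exercised against a throwaway loopback echo server. The sketch below is mine, not from the thread, and uses the write-then-drain ordering from the asyncio docs, which applies the same backpressure:

```python
import asyncio

async def handle(reader, writer):
    # Minimal echo server: copy bytes back until the client sends EOF.
    while data := await reader.read(1024):
        writer.write(data)
        await writer.drain()  # pause here if the peer reads slowly
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]
    reader, writer = await asyncio.open_connection("127.0.0.1", port)

    writer.write(b"hello")
    await writer.drain()  # waits only while the buffer is above the high watermark
    writer.write_eof()

    reply = await reader.read()  # read until the server closes
    writer.close()
    await writer.wait_closed()
    server.close()
    await server.wait_closed()
    return reply

reply = asyncio.run(main())
print(reply)
```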

--




[issue46568] non awaited coroutines on a IsolatedAsyncioTestCase results on a RuntimeWarning

2022-01-29 Thread bluecarrot


bluecarrot  added the comment:

Hi Andrew, thank you for your answer. I am experimenting with coroutines, as I 
am pretty new to them. My idea was to let the writer drain while other packets 
were read, and thus I am waiting for writer_drain right before starting 
writer.write again. Isn't that the correct way to overlap the readings and the 
writings?

If I modify my initial code to look like:

async def forward_stream(reader: StreamReader, writer: StreamWriter, event: asyncio.Event, source: str):
    writer_drain = writer.drain()  # <--- awaitable is created here
    while not event.is_set():
        try:
            data = await asyncio.wait_for(reader.read(1024), 1)  # <-- CancelledError can be caught here, stack unwinds and writer_drain is never awaited, sure.
        except asyncio.TimeoutError:
            continue
        except asyncio.CancelledError:
            event.set()
            break
        ...  # the rest is not important for this case

    await writer_drain

so that in case the task is cancelled, writer_drain will be awaited outside of 
the loop. This works, at the cost of having to introduce code specific for 
testing purposes (which feels wrong). In "production", the workflow of this 
code will be to lose the connection, break out of the loop, and wait for the 
writer stream to finish... but I am not introducing any method allowing me to 
cancel the streams once the script is running.

In the same way leaked tasks are "swallowed", which I have tested and works, 
shouldn't these cases also be handled by the tearDownClass method of 
IsolatedAsyncioTestCase?
--




[issue46568] non awaited coroutines on a IsolatedAsyncioTestCase results on a RuntimeWarning

2022-01-29 Thread Andrew Svetlov


Andrew Svetlov  added the comment:

Your code has at least one concurrency problem. Let's look back at 
forward_stream() function:

async def forward_stream(reader: StreamReader, writer: StreamWriter, event: asyncio.Event, source: str):
    writer_drain = writer.drain()  # <--- awaitable is created here
    while not event.is_set():
        try:
            data = await asyncio.wait_for(reader.read(1024), 1)  # <-- CancelledError can be caught here, stack unwinds and writer_drain is never awaited, sure.
        except asyncio.TimeoutError:
            continue
        ...  # the rest is not important for this case

To solve the problem, you should create writer_drain *right before awaiting 
it*, not before another 'await' call.

--




[issue46568] non awaited coroutines on a IsolatedAsyncioTestCase results on a RuntimeWarning

2022-01-29 Thread bluecarrot


bluecarrot  added the comment:

Seems that adding an "await asyncio.sleep(1)" in asyncTearDown, giving

class TestConnections(IsolatedAsyncioTestCase):
    async def asyncSetUp(self) -> None:
        self.proxy = asyncio.create_task(EnergyAgentProxy(self.proxy_port, self.server_port, self.upstream_port))

    async def asyncTearDown(self) -> None:
        await asyncio.sleep(1)

is enough to "hide the problem under the carpet"... but it sounds weird...

--




[issue46568] non awaited coroutines on a IsolatedAsyncioTestCase results on a RuntimeWarning

2022-01-29 Thread bluecarrot


New submission from bluecarrot :

I am unittesting a tcp proxy module using coroutines. This is the coroutine I 
am using to do the forwarding, allowing the writer stream to drain while the 
rest of the coroutines are proceeding:

async def forward_stream(reader: StreamReader, writer: StreamWriter, event: asyncio.Event, source: str):
    writer_drain = writer.drain()
    while not event.is_set():
        try:
            data = await asyncio.wait_for(reader.read(1024), 1)
        except asyncio.TimeoutError:
            continue

        if not data:
            event.set()
            break

        # parse the data
        if reading := parse(data):
            # wait for the previous write to finish, and forward the data
            # to the other end, process the data in between
            await writer_drain
            writer.write(data)
            writer_drain = writer.drain()

    # wait for any outstanding write buffer to be flushed
    await writer_drain
    logger.info("{} reader forwarder finished.".format(source))

In my unit tests, I have the following (EnergyAgentProxy is the wrapper calling 
the coroutine in the module that creates the proxy)

class TestConnections(IsolatedAsyncioTestCase):
    async def asyncSetUp(self) -> None:
        self.proxy = asyncio.create_task(EnergyAgentProxy(self.proxy_port, self.server_port, self.upstream_port))

The problem is: when running these tests, I am getting the following error:

 /usr/lib/python3.10/unittest/async_case.py:159: RuntimeWarning: coroutine 'StreamWriter.drain' was never awaited
 Coroutine created at (most recent call last)
   File "/usr/lib/python3.10/unittest/case.py", line 650, in __call__
     return self.run(*args, **kwds)
   [...]
   File "/home/frubio/Documents/powermonitor_raspberrypi/EnergyAgent.py", line 48, in forward_stream
     writer_drain = writer.drain()
   self._tearDownAsyncioLoop()

So... to me, it looks like when the tasks are being cancelled I am getting this 
warning because the last "await writer_drain" in forward_stream is not 
executed, but I cannot be sure. Am I doing something wrong? Is there any way I 
can prevent this warning from showing up in my tests?

--
components: Tests, asyncio
messages: 412060
nosy: asvetlov, bluecarrot, yselivanov
priority: normal
severity: normal
status: open
title: non awaited coroutines on a IsolatedAsyncioTestCase results on a 
RuntimeWarning
type: behavior
versions: Python 3.10




[issue46341] duplicate paragraphs - asyncio Coroutines and Tasks file

2022-01-12 Thread Alex Waygood


Change by Alex Waygood :


--
nosy: +asvetlov, yselivanov
stage:  -> needs patch
type:  -> behavior




[issue46341] duplicate paragraphs - asyncio Coroutines and Tasks file

2022-01-11 Thread David


Change by David :


--
assignee: docs@python
components: Documentation
nosy: davem, docs@python
priority: normal
pull_requests: 28731
severity: normal
status: open
title: duplicate paragraphs - asyncio Coroutines and Tasks file
versions: Python 3.10




[issue45088] Coroutines & async generators disagree on the iteration protocol semantics

2021-09-02 Thread Yury Selivanov


Change by Yury Selivanov :


--
nosy: +gvanrossum




[issue45088] Coroutines & async generators disagree on the iteration protocol semantics

2021-09-02 Thread Yury Selivanov


New submission from Yury Selivanov :

See this script:

  https://gist.github.com/1st1/eccc32991dc2798f3fa0b4050ae2461d

Somehow an identity async function alters the behavior of manual iteration 
through the wrapped nested generator.

This is a very subtle bug and I'm not even sure if this is a bug or not. 
Opening the issue so that I don't forget about this and debug sometime later.

--
components: Interpreter Core
messages: 400951
nosy: lukasz.langa, pablogsal, yselivanov
priority: normal
severity: normal
stage: needs patch
status: open
title: Coroutines & async generators disagree on the iteration protocol 
semantics
type: behavior
versions: Python 3.10, Python 3.11




[issue44834] contextvars.Context.run w/ coroutines gives inconsistent behavior

2021-08-04 Thread Adrian Garcia Badaracco


Change by Adrian Garcia Badaracco :


--
nosy: +yselivanov




[issue44834] contextvars.Context.run w/ coroutines gives inconsistent behavior

2021-08-04 Thread Adrian Garcia Badaracco


New submission from Adrian Garcia Badaracco :

I recently tried to use `contextvars.Context.run` w/ coroutines, expecting the 
same behavior as with regular functions, but it seems that 
`contextvars.Context.run` does not work w/ coroutines.

I'm sorry if this is something obvious to do with how coroutines work under the 
hood, if so I'd appreciate some help in understanding why this is the expected 
behavior.

```python
import asyncio
import contextvars


ctxvar = contextvars.ContextVar("ctxvar", default="spam")


def func():
    assert ctxvar.get() == "spam"


async def coro():
    func()


async def main():
    ctx = contextvars.copy_context()
    ctxvar.set("ham")
    ctx.run(func)  # works
    await ctx.run(coro)  # breaks

asyncio.run(main())
```

Thanks!

--
components: Library (Lib)
messages: 398924
nosy: adriangb
priority: normal
severity: normal
status: open
title: contextvars.Context.run w/ coroutines gives inconsistent behavior
type: behavior
versions: Python 3.9




[issue44407] A "Coroutines and Tasks" code example needs "asyncio.run(main())"

2021-06-15 Thread Andrei Kulakov


Andrei Kulakov  added the comment:

I think it's clear that the modification of the previous example is limited to 
`main()` because `say_after()` is also omitted. This is very common in Python 
code examples; it seems to me this change is not needed.

--
nosy: +andrei.avk




[issue44407] A "Coroutines and Tasks" code example needs "asyncio.run(main())"

2021-06-12 Thread Atsushi Sakai

New submission from Atsushi Sakai :

This is very small documentation improvement proposal.

In the "Coroutines and Tasks" doc, the code example after "Let’s modify the 
above example and run two say_after coroutines concurrently:" is missing 
"asyncio.run(main())" at the end of the code example:

async def main():
    task1 = asyncio.create_task(
        say_after(1, 'hello'))

    task2 = asyncio.create_task(
        say_after(2, 'world'))

    print(f"started at {time.strftime('%X')}")

    # Wait until both tasks are completed (should take
    # around 2 seconds.)
    await task1
    await task2

    print(f"finished at {time.strftime('%X')}")
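For completeness, a self-contained version with the proposed call added at the end. `say_after` is reconstructed here from the surrounding documentation example, so treat its body as an assumption:

```python
import asyncio
import time

async def say_after(delay, what):
    await asyncio.sleep(delay)
    print(what)
    return what

async def main():
    task1 = asyncio.create_task(say_after(1, 'hello'))
    task2 = asyncio.create_task(say_after(2, 'world'))

    print(f"started at {time.strftime('%X')}")

    # Wait until both tasks are completed (should take around 2 seconds.)
    results = [await task1, await task2]

    print(f"finished at {time.strftime('%X')}")
    return results

results = asyncio.run(main())  # the line the report says is missing
```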

--
assignee: docs@python
components: Documentation
messages: 395728
nosy: AtsushiSakai, docs@python
priority: normal
severity: normal
status: open
title: A "Coroutines and Tasks" code example needs "asyncio.run(main())"
type: enhancement
versions: Python 3.9

___
Python tracker 
<https://bugs.python.org/issue44407>
___



[issue42838] Wait for cleanup coroutines before event loop is closed.

2021-01-06 Thread xloem


New submission from xloem <0xl...@gmail.com>:

To handle destruction of resources especially during exceptions, it would be 
nice if there were some way to provide coroutines/tasks that run at the 
termination or closure of an event loop.  There are a lot of api options here.  
Maybe a simple one would be to have close() call a run_until_closed() function 
that starts all of these and steps the loop to wait for them, before closing, 
and provide run_close() functions to queue them for delay.
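A minimal sketch of the behavior the replies above point to: asyncio.run() cancels still-pending tasks after the main coroutine returns and waits for them, so an ordinary try/finally already works as a cleanup hook (names here are illustrative):

```python
import asyncio

cleaned_up = []

async def background_worker():
    try:
        await asyncio.sleep(3600)  # pretend to hold a resource forever
    finally:
        # asyncio.run() cancels tasks still running after main() returns
        # and waits for them, so this cleanup does execute.
        cleaned_up.append(True)

async def main():
    asyncio.create_task(background_worker())
    await asyncio.sleep(0.01)  # main finishes while the worker still runs

asyncio.run(main())
```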

--
components: asyncio
messages: 384503
nosy: asvetlov, xloem, yselivanov
priority: normal
severity: normal
status: open
title: Wait for cleanup coroutines before event loop is closed.
type: enhancement
versions: Python 3.10, Python 3.8

___
Python tracker 
<https://bugs.python.org/issue42838>
___



Re: [Python-ideas] asyncio: return from multiple coroutines

2020-06-25 Thread Kyle Stanley
(Resending this email since it didn't originally go through to
python-list, sorry for the duplicate Pablo)

> Yes, I want to have multiple results: the connections listening forever, 
> returning a result for each message received.

> I forgot to mention that I did try to use asyncio.wait with `FIRST_COMPLETED`; 
> however, the problem is that it seems to evict the not-completed coroutines, 
> so the messenger that arrives second does not send the message. To check it, 
> I have run that script without the random sleep. just msgr1 waits 1s and 
> msgr2 waits 2s, so msgr1 always ends first. I expect a result like this 
> (which I am currently getting with queues):

FYI, the other coroutines are not evicted or cancelled, they are
simply in the "pending" set of the "done, pending" tuple returned by
`asyncio.wait()` and were not returned. I misunderstood what you were
actually looking for, but I think that I understand now.

Since it seems like you want to be able to receive the results out of
order from both "messengers" and have them continuously listen to
their respective socket (without affecting the other), a queue is
likely going to be the best approach. I think you had the right
general idea with your example, but here's a way to make it
significantly less cumbersome and easy to expand upon (such as
expanding the number of websockets to listen to):

```
import asyncio

class MessengerQueue:
    def __init__(self):
        self._queue = asyncio.Queue()

    async def get(self):
        return await self._queue.get()

    async def start_messengers(self):
        # Could be easily modified to create any number of "messengers" set
        # to listen on specific websockets, by passing a list and creating
        # a task for each one.
        asyncio.create_task(self._messenger("Messenger 1", 1))
        asyncio.create_task(self._messenger("Messenger 2", 2))

    async def _messenger(self, message: str, sleep_time: int):
        while True:
            await asyncio.sleep(sleep_time)
            await self._queue.put(f'{message} awaited for {sleep_time:.2f}s')


async def main():
    mqueue = MessengerQueue()
    asyncio.create_task(mqueue.start_messengers())
    while True:
        result = await mqueue.get()
        print(result)

asyncio.run(main())
```

This results in your desired output:
Messenger 1 awaited for 1.00s
Messenger 2 awaited for 2.00s
Messenger 1 awaited for 1.00s
Messenger 1 awaited for 1.00s
Messenger 2 awaited for 2.00s
Messenger 1 awaited for 1.00s
Messenger 1 awaited for 1.00s
Messenger 2 awaited for 2.00s

Note: it would probably be more idiomatic to call these "consumers" or
"listeners" rather than "messengers"/"messagers" (the websocket docs
refer to them as "consumer handlers"), but I used "messengers" to make
it a bit more easily comparable to the original queue example from the
OP: https://pastebin.com/BzaxRbtF.

I hope the above example is of some use. :-)

Regards,
Kyle Stanley


Re: [Python-ideas] asyncio: return from multiple coroutines

2020-06-23 Thread Pablo Alcain
Thank you very much Kyle for your answer, I am moving this conversation to
the more proper python-list for whoever wants to chime in. I summarize here
the key points of my original question (full question on the quoted email):

I have an application that listens on two websockets through the async
library https://websockets.readthedocs.io/ and I have to perform the same
function on the result, no matter where the message came from. I have
implemented a rather cumbersome solution with async Queues:
https://pastebin.com/BzaxRbtF, but i think there has to be a more
async-friendly option I am missing.

Now I move on to the comments that Kyle made

On Tue, Jun 23, 2020 at 12:32 AM Kyle Stanley  wrote:

> I believe asyncio.wait() with "return_when=FIRST_COMPLETED" would
> perform the functionality you're looking for with the
> "asyncio.on_first_return()". For details on the functionality of
> asyncio.wait(), see
> https://docs.python.org/3/library/asyncio-task.html#asyncio.wait.
>
> I understand that I can create two coroutines that call the same
> function, but it would be much cleaner (because of implementation issues)
> if I can simply create a coroutine that yields the result of whichever
> connection arrives first.
>
> You can use an asynchronous generator that will continuously yield the
> result of the first recv() that finishes (I'm assuming you mean
> "yields" literally and want multiple results from a generator, but I
> might be misinterpreting that part).
>

Yes, I want to have multiple results: the connections listening forever,
returning a result for each message received.


>
> Here's a brief example, using the recv() coroutine function from the
> pastebin linked:
>
> ```
> import asyncio
> import random
>
> async def recv(message: str, max_sleep: int):
>     sleep_time = max_sleep * random.random()
>     await asyncio.sleep(sleep_time)
>     return f'{message} awaited for {sleep_time:.2f}s'
>
> async def _start():
>     while True:
>         msgs = [
>             asyncio.create_task(recv("Messenger 1", max_sleep=1)),
>             asyncio.create_task(recv("Messenger 2", max_sleep=1))
>         ]
>         done, _ = await asyncio.wait(msgs,
>                                      return_when=asyncio.FIRST_COMPLETED)
>         result = done.pop()
>         yield await result
>
> async def main():
>     async for result in _start():
>         print(result)
>
> asyncio.run(main())
> ```


I forgot to mention that I did try to use asyncio.wait with
`FIRST_COMPLETED`; however, the problem is that it seems to evict the
not-completed coroutines, so the messenger that arrives second does not
send the message. To check it, I have run that script without the random
sleep. just msgr1 waits 1s and msgr2 waits 2s, so msgr1 always ends first.
I expect a result like this (which I am currently getting with queues):

Messenger 1 waits for 1.0s
Messenger 1 waits for 1.0s
Messenger 2 waits for 2.0s
Messenger 1 waits for 1.0s
Messenger 1 waits for 1.0s
Messenger 2 waits for 2.0s
Messenger 1 waits for 1.0s
...

but instead I got this:

Messenger 1 waits for 1.0s
Messenger 1 waits for 1.0s
Messenger 1 waits for 1.0s
Messenger 1 waits for 1.0s
Messenger 1 waits for 1.0s
...
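For completeness, the eviction described above comes from recreating both tasks on every loop iteration; `asyncio.wait` itself leaves the unfinished task in its `pending` set, and carrying that set over to the next `wait` call produces the interleaved output. A sketch of that pattern (illustrative names, shortened delays; not code from this thread):

```python
import asyncio

async def recv(name, delay):
    await asyncio.sleep(delay)
    return f'{name} waits for {delay}s'

async def main(wanted=6):
    results = []
    delays = {"Messenger 1": 0.01, "Messenger 2": 0.02}
    pending = {asyncio.create_task(recv(n, d)) for n, d in delays.items()}
    while len(results) < wanted:
        # Keep awaiting the tasks that did not finish, instead of
        # recreating them (recreating is what loses Messenger 2's turn).
        done, pending = await asyncio.wait(
            pending, return_when=asyncio.FIRST_COMPLETED)
        for task in done:
            msg = task.result()
            results.append(msg)
            name = msg.split(' waits')[0]
            # Restart a fresh listener in place of the finished one.
            pending.add(asyncio.create_task(recv(name, delays[name])))
    for task in pending:  # stop the leftover listeners
        task.cancel()
    return results

results = asyncio.run(main())
```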





> Note that in the above example, in "msgs", you can technically pass
> the coroutine objects directly to asyncio.wait(), as they will be
> implicitly converted to tasks. However, we decided to deprecate that
> functionality in Python 3.8 since it can be rather confusing. So
> creating and passing the tasks is a better practice.
>

Thanks for that info, I am still trying to grasp the best practices
surrounding mostly the explicitness in async.


> > Again, it's quite likely I am not seeing something obvious, but I didn't
> know where else to ask.
>
> If you're not mostly certain or relatively inexperienced with the
> specific area that the question pertains to, I'd recommend asking on
> python-list first (or another Python user community). python-ideas is
> primarily intended for new feature proposals/suggestions. Although if
> you've tried other resources and haven't found an answer, it's
> perfectly fine to ask a question as part of the suggestion post.
>
>
>
Original question, as posted in python-ideas:


> On Mon, Jun 22, 2020 at 6:24 PM Pablo Alcain 
> wrote:
> >
> > Hey everyone. I have been looking into asyncio lately, and even though I
> have had my fair share of work, I still have some of it very shaky, so
> first of all forgive me if what I am saying here is already implemented and
> I totally missed it (so far, it looks *really* likely).
> >
> > Basically this is the situation: I have an application that listens on
> two websockets through the async library

[issue40844] Alternate ways of running coroutines

2020-06-02 Thread Yury Selivanov


Change by Yury Selivanov :


--
resolution:  -> rejected
stage:  -> resolved
status: open -> closed

___
Python tracker 

___



[issue40844] Alternate ways of running coroutines

2020-06-02 Thread Yury Selivanov


Yury Selivanov  added the comment:

> I'm suggesting a method on coroutines that runs them without blocking, and 
> will run a callback when it's complete.

And how would that method be implemented? Presumably the event loop would 
execute the coroutine, but that API is already there, and it's called 
create_task.  We will not be adding a builtin method for this.

You can use Task.add_done_callback() to add a callback.
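A minimal sketch of the create_task() + add_done_callback() pattern Yury refers to:

```python
import asyncio

results = []

async def work(x):
    await asyncio.sleep(0)  # stand-in for real async work
    return x * 2

async def main():
    # create_task() schedules the coroutine without blocking here.
    task = asyncio.create_task(work(21))
    # The callback fires once the task completes.
    task.add_done_callback(lambda t: results.append(t.result()))
    await task              # keep main alive until the task finishes
    await asyncio.sleep(0)  # one loop pass, in case the callback is queued

asyncio.run(main())
print(results)  # [42]
```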

--

___
Python tracker 
<https://bugs.python.org/issue40844>
___



[issue40844] Alternate ways of running coroutines

2020-06-02 Thread Matthew Francis


New submission from Matthew Francis <4576fran...@gmail.com>:

Currently, using await inside a coroutine will block inside the coroutine.  
This behavior would usually be fine, but for some usecases a way to 
nonblockingly run coroutines without creating a Task could be useful, because 
tasks don't allow for a callback.  I'm suggesting a method on coroutines that 
runs them without blocking, and will run a callback when it's complete.

--
components: asyncio
messages: 370614
nosy: asvetlov, matthewfrancis, yselivanov
priority: normal
severity: normal
status: open
title: Alternate ways of running coroutines
type: enhancement
versions: Python 3.10

___
Python tracker 
<https://bugs.python.org/issue40844>
___



[issue40438] Python 3.9 eval on list comprehension sometimes returns coroutines

2020-04-29 Thread Batuhan Taskaya


Batuhan Taskaya  added the comment:

> This can be closed, but for completeness, the test you ran didn't verify that 
> the bug was fixed. This is because the hard coded compile flags I gave in my 
> example seem to have changed in Python 3.9 (is this documented?). 

Yes, this is documented on What's New.

> Which does produce the correct output as expected. So, the issue can remain 
> closed. I am curious what the bug in 3.9.0a5 was though if you have any 
> speculations.

See issue 39562

--

___
Python tracker 

___



[issue40438] Python 3.9 eval on list comprehension sometimes returns coroutines

2020-04-29 Thread Jonathan Crall


Jonathan Crall  added the comment:

This can be closed, but for completeness, the test you ran didn't verify that 
the bug was fixed. This is because the hard coded compile flags I gave in my 
example seem to have changed in Python 3.9 (is this documented?). 

In python3.8 the compile flags I specified correspond to division, 
print_function, unicode_literals, and absolute_import. 

python3.8 -c "import __future__; print(__future__.print_function.compiler_flag 
| __future__.division.compiler_flag | __future__.unicode_literals.compiler_flag 
| __future__.absolute_import.compiler_flag)"


Results in: 221184


In Python 3.9 the same code results in: 3538944


I can modify the MWE to accommodate these changes: 

./python -c "import __future__; print(eval(compile('[i for i in range(3)]', 
mode='eval', filename='fo', flags=__future__.print_function.compiler_flag | 
__future__.division.compiler_flag | __future__.unicode_literals.compiler_flag | 
__future__.absolute_import.compiler_flag)))"


Which does produce the correct output as expected. So, the issue can remain 
closed. I am curious what the bug in 3.9.0a5 was though if you have any 
speculations.

--

___
Python tracker 

___



[issue40438] Python 3.9 eval on list comprehension sometimes returns coroutines

2020-04-29 Thread STINNER Victor


STINNER Victor  added the comment:

> I first noticed this when testing xdoctest on Python 3.9, and then again when 
> using IPython.

What is your Python 3.9 exact version number?

I cannot reproduce your issue with Python 3.9.0a6:

vstinner@apu$ ./python -c "print(eval(compile('[i for i in range(3)]', 
mode='eval', filename='foo', flags=221184)))"
Traceback (most recent call last):
  File "<string>", line 1, in <module>
ValueError: compile(): unrecognised flags

vstinner@apu$ ./python -VV
Python 3.9.0a6+ (heads/master:9a8c1315c3, Apr 29 2020, 17:03:41) 
[GCC 9.3.1 20200408 (Red Hat 9.3.1-2)]

--
nosy: +vstinner

___
Python tracker 

___



[issue40438] Python 3.9 eval on list comprehension sometimes returns coroutines

2020-04-29 Thread STINNER Victor


STINNER Victor  added the comment:

I close the issue: it's already fixed in 3.9.0a6. If it's not the case, feel 
free to reopen the issue ;-)

--
resolution:  -> fixed
stage:  -> resolved
status: open -> closed

___
Python tracker 

___



[issue40438] Python 3.9 eval on list comprehension sometimes returns coroutines

2020-04-29 Thread Jonathan Crall


Jonathan Crall  added the comment:

Ah, sorry. I neglected all the important information. 

I tested this using: 

Python 3.9.0a5 (default, Apr 23 2020, 14:11:34) 
[GCC 8.3.0]


Specifically, I ran in a docker container: 

DOCKER_IMAGE=circleci/python:3.9-rc
docker pull $DOCKER_IMAGE
docker run --rm -it $DOCKER_IMAGE bash

And then in the bash shell in the docker image I ran:

python -c "print(eval(compile('[i for i in range(3)]', mode='eval', 
filename='foo', flags=221184)))"

--

___
Python tracker 

___



[issue40438] Python 3.9 eval on list comprehension sometimes returns coroutines

2020-04-29 Thread Batuhan Taskaya


Change by Batuhan Taskaya :


--
nosy: +BTaskaya

___
Python tracker 

___



[issue40438] Python 3.9 eval on list comprehension sometimes returns coroutines

2020-04-29 Thread Jonathan Crall


New submission from Jonathan Crall :

I first noticed this when testing xdoctest on Python 3.9, and then again when 
using IPython.

I was finally able to generate a minimal working example in Python itself. The 
following code:

python -c "print(eval(compile('[i for i in range(3)]', mode='eval', 
filename='foo', flags=221184)))"

produces 

[0, 1, 2]
 
in Python <= 3.8, but in 3.9 it produces: 

<coroutine object <listcomp> at 0x7fa336d40ec0>
<string>:1: RuntimeWarning: coroutine '<listcomp>' was never awaited
RuntimeWarning: Enable tracemalloc to get the object allocation traceback


Is this an intended change? I can't find any notes in the CHANGELOG that seem 
to correspond to it.

--
components: Interpreter Core
messages: 367651
nosy: Jonathan Crall
priority: normal
severity: normal
status: open
title: Python 3.9 eval on list comprehension sometimes returns coroutines
versions: Python 3.9

___
Python tracker 
<https://bugs.python.org/issue40438>
___



[issue39034] Documentation: Coroutines

2019-12-13 Thread Andrew Svetlov


Andrew Svetlov  added the comment:

I suggest thinking of gather() as implicitly creating tasks for the coroutines 
it runs.
The documentation is technically correct; enumerating the potentially endless 
list of things that call `create_task()` to wrap a passed coroutine is not very 
helpful, especially in an introductory document.

--

___
Python tracker 
<https://bugs.python.org/issue39034>
___



[issue39034] Documentation: Coroutines

2019-12-12 Thread Karthikeyan Singaravelan


Change by Karthikeyan Singaravelan :


--
components: +asyncio
nosy: +asvetlov, yselivanov

___
Python tracker 

___



[issue39034] Documentation: Coroutines

2019-12-12 Thread Rustam Agakishiev

New submission from Rustam Agakishiev :

Here: https://docs.python.org/3/library/asyncio-task.html  
it says: "To actually run a coroutine, asyncio provides three main mechanisms:"

and a few pages down it gives you a fourth mechanism:
"awaitable asyncio.gather(*aws, loop=None, return_exceptions=False)
Run awaitable objects in the aws sequence concurrently."

And it really runs awaitables:
future = asyncio.gather(*awslist) # aws are run...
...   # some other heavy tasks
result = await future # collect results

Shouldn't it be added to docs?
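Rustam's fragment above, made runnable, does behave as described: gather() wraps its arguments in Tasks, so they start running as soon as control returns to the event loop, before the final await (a sketch with illustrative coroutines):

```python
import asyncio

started = []

async def job(x):
    started.append(x)        # proves the job ran before we awaited the future
    await asyncio.sleep(0.01)
    return x * x

async def main():
    future = asyncio.gather(job(2), job(3))  # jobs are scheduled here
    await asyncio.sleep(0.05)                # some other heavy task
    assert started == [2, 3]                 # both were already running
    return await future                      # collect results

results = asyncio.run(main())
print(results)  # [4, 9]
```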

--
assignee: docs@python
components: Documentation
messages: 358320
nosy: agarus, docs@python
priority: normal
severity: normal
status: open
title: Documentation: Coroutines
type: enhancement
versions: Python 3.7

___
Python tracker 
<https://bugs.python.org/issue39034>
___



Awaiting coroutines inside the debugger (PDB)

2019-11-24 Thread darthdeus
Hi everyone,

this question is sort of in response to this issue
https://github.com/gotcha/ipdb/issues/174. I've noticed that there was
recently added support for an asyncio repl (run via `python -m
asyncio`), which allows the use of `await` directly on coroutine objects
in the REPL.

But this does not seem to work when in PDB (and consequently in iPDB)
when running `import pdb; pdb.set_trace()`.

Is there any workaround to get `await` working in PDB, or any simple way
to synchronously wait while in the debugger?
-- 
https://mail.python.org/mailman/listinfo/python-list


[issue30773] async generator receives wrong value when shared between coroutines

2019-09-30 Thread Yury Selivanov


Change by Yury Selivanov :


--
resolution:  -> fixed
stage: patch review -> resolved
status: open -> closed

___
Python tracker 

___



[issue30773] async generator receives wrong value when shared between coroutines

2019-09-30 Thread Yury Selivanov


Yury Selivanov  added the comment:


New changeset 2f87a7dc5a1ad7f37787f0adee242c931643f878 by Yury Selivanov (Miss 
Islington (bot)) in branch '3.8':
bpo-30773: Fix ag_running; prohibit running athrow/asend/aclose in parallel 
(GH-7468) (#16486)
https://github.com/python/cpython/commit/2f87a7dc5a1ad7f37787f0adee242c931643f878


--

___
Python tracker 

___



[issue30773] async generator receives wrong value when shared between coroutines

2019-09-30 Thread Yury Selivanov


Yury Selivanov  added the comment:


New changeset fc4a044a3c54ce21e9ed150f7d769fb479d34c49 by Yury Selivanov in 
branch 'master':
bpo-30773: Fix ag_running; prohibit running athrow/asend/aclose in parallel 
(#7468)
https://github.com/python/cpython/commit/fc4a044a3c54ce21e9ed150f7d769fb479d34c49


--

___
Python tracker 

___



[issue30773] async generator receives wrong value when shared between coroutines

2019-09-30 Thread miss-islington


Change by miss-islington :


--
pull_requests: +16072
pull_request: https://github.com/python/cpython/pull/16486

___
Python tracker 

___



[issue33918] Hooking into pause/resume of iterators/coroutines

2019-05-29 Thread Andrew Svetlov


Change by Andrew Svetlov :


--
resolution:  -> rejected
stage:  -> resolved
status: open -> closed

___
Python tracker 

___



[issue33918] Hooking into pause/resume of iterators/coroutines

2019-05-29 Thread Andrew Svetlov


Andrew Svetlov  added the comment:

decimal was changed from threading.local to contextvar usage.
The module is "safe" not only for asyncio but for threading, trio etc.

unittest.mock doesn't use an explicit context at all for patching.
It changes global interpreter-wide objects instead.

So, mock.patch fails not only if two async tasks are executed in parallel but 
also if two threads are.

I doubt if thread-local (or contextvar) can be applied to mock because it 
changes the current behavior -- but this is a different story.

*Any* library that needs to modify a global state, e.g. your MyLogger.enabled 
can use contextvars for handling it.

Say again, contextvars is not for asyncio-only but a generic instrument for 
handling context-aware variables.
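A minimal sketch of that contextvars pattern (a hypothetical MyLogger.enabled flag made context-aware; names are illustrative):

```python
import contextvars

# Context-aware replacement for a global MyLogger.enabled flag.
enabled = contextvars.ContextVar('enabled', default=True)

def log(msg):
    # Reads the value belonging to the *current* context
    # (thread, asyncio task, trio task, ...).
    return msg if enabled.get() else None

ctx = contextvars.copy_context()
ctx.run(lambda: enabled.set(False))  # "patch" only inside this context

assert log("hello") == "hello"        # outer context: still enabled
assert ctx.run(log, "hello") is None  # patched context: disabled
```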

I'm going to close the issue.

--

___
Python tracker 

___



[issue35040] functools.lru_cache does not work with coroutines

2018-10-22 Thread Andrew Svetlov


Andrew Svetlov  added the comment:

A coroutine detection is a relatively slow check.
I don't think we need to do it in `functools.lru_cache`.

There is a specialized asyncio compatible version: 
https://github.com/aio-libs/async_lru
Please use it.

--

___
Python tracker 

___



[issue35040] functools.lru_cache does not work with coroutines

2018-10-21 Thread Karthikeyan Singaravelan


Change by Karthikeyan Singaravelan :


--
nosy: +xtreak

___
Python tracker 

___



[issue35040] functools.lru_cache does not work with coroutines

2018-10-21 Thread Liran Nuna


New submission from Liran Nuna :

lru_cache is a very useful method but it does not work well with coroutines 
since they can only be executed once.

Take for example, the attached code (test-case.py) - It will throw a 
RuntimeError because you cannot reuse an already awaited coroutine.

A solution would be to call `asyncio.ensure_future` on the result of the 
coroutine if detected.
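A minimal sketch of the reported failure (the attached test-case.py is not reproduced here): lru_cache caches the coroutine *object* returned by the first call, and a coroutine object can only be awaited once.

```python
import asyncio
import functools

@functools.lru_cache(maxsize=None)
async def add_one(x):
    return x + 1

async def main():
    first = await add_one(1)   # fresh coroutine object: fine
    try:
        # The cache returns the *same*, already-awaited coroutine object.
        await add_one(1)
        return first, None
    except RuntimeError as exc:
        return first, str(exc)

value, error = asyncio.run(main())
print(value, error)  # 2 cannot reuse already awaited coroutine
```

The fix suggested in the thread is to cache an awaitable that can be awaited repeatedly (a Task/Future via `asyncio.ensure_future`) rather than the bare coroutine, which is the approach the async_lru package takes.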

--
components: asyncio
files: test-case.py
messages: 328228
nosy: Liran Nuna, asvetlov, yselivanov
priority: normal
severity: normal
status: open
title: functools.lru_cache does not work with coroutines
versions: Python 3.5, Python 3.6, Python 3.7, Python 3.8
Added file: https://bugs.python.org/file47887/test-case.py

___
Python tracker 
<https://bugs.python.org/issue35040>
___



Re: asyncio await different coroutines on the same socket?

2018-10-05 Thread Russell Owen
On Oct 3, 2018, Ian Kelly wrote
(in 
article):

> On Wed, Oct 3, 2018 at 7:47 AM Russell Owen  wrote:
> > Using asyncio I am looking for a simple way to await multiple events where
> > notification comes over the same socket (or other serial stream) in
> > arbitrary
> > order. For example, suppose I am communicating with a remote device that can
> > run different commands simultaneously and I don't know which command will
> > finish first. I want to do this:
> >
> > coro1 = start(command1)
> > coro2 = start(command2)
> > asyncio.gather(coro1, coro2)
> >
> > where either command may finish first. I’m hoping for a simple and
> > idiomatic way to read the socket and tell each coroutine it is done. So far
> > everything I have come up with is ugly, using multiple layers of "async
> > def”, keeping a record of Tasks that are waiting and calling "set_result"
> > on those Tasks when finished. Also Task isn’t even documented to have the
> > set_result method (though "future" is)
>
> Because Tasks are used to wrap coroutines, and the result of the Task
> should be determined by the coroutine, not externally.
>
> Instead of tracking tasks (that's what the event loop is for) I would
> suggest tracking futures instead. Have start(command1) return a future
> (or create a future that it will await on itself) that is not a task.
> Whenever a response from the socket is parsed, that code would then
> look up the corresponding future and call set_result on it. It might
> look something like this:
>
> class Client:
>     async def open(self, host, port):
>         self.reader, self.writer = await asyncio.open_connection(host, port)
>         asyncio.create_task(self.read_loop())
>
>     async def read_loop(self):
>         while not self.reader.at_eof():
>             response = await self.reader.read()
>             id = get_response_id(response)
>             self._futures.pop(id).set_result(response)
>
>     def start(self, command):
>         future = asyncio.Future()
>         self._futures[get_command_id(command)] = future
>         self.writer.write(command)
>         return future
>
> In this case start() is not a coroutine but its result is a future and
> can be awaited.

That is exactly what I was looking for. Thank you very much!

-- Russell

(My apologies for double posting -- I asked this question again today because 
I did not think my original question -- this one -- had gone through).




Re: asyncio await different coroutines on the same socket?

2018-10-03 Thread Ian Kelly
On Wed, Oct 3, 2018 at 7:47 AM Russell Owen  wrote:
> Using asyncio I am looking for a simple way to await multiple events where
> notification comes over the same socket (or other serial stream) in arbitrary
> order. For example, suppose I am communicating with a remote device that can
> run different commands simultaneously and I don't know which command will
> finish first. I want to do this:
>
> coro1 = start(command1)
> coro2 = start(command2)
> asyncio.gather(coro1, coro2)
>
> where either command may finish first. I’m hoping for a simple and
> idiomatic way to read the socket and tell each coroutine it is done. So far
> everything I have come up with is ugly, using multiple layers of "async
> def”, keeping a record of Tasks that are waiting and calling "set_result"
> on those Tasks when finished. Also Task isn’t even documented to have the
> set_result method (though "future" is)

Because Tasks are used to wrap coroutines, and the result of the Task
should be determined by the coroutine, not externally.

Instead of tracking tasks (that's what the event loop is for) I would
suggest tracking futures instead. Have start(command1) return a future
(or create a future that it will await on itself) that is not a task.
Whenever a response from the socket is parsed, that code would then
look up the corresponding future and call set_result on it. It might
look something like this:

class Client:
    async def open(self, host, port):
        self.reader, self.writer = await asyncio.open_connection(host, port)
        asyncio.create_task(self.read_loop())

    async def read_loop(self):
        while not self.reader.at_eof():
            response = await self.reader.read()
            id = get_response_id(response)
            self._futures.pop(id).set_result(response)

    def start(self, command):
        future = asyncio.Future()
        self._futures[get_command_id(command)] = future
        self.writer.write(command)
        return future

In this case start() is not a coroutine but its result is a future and
can be awaited.


Re: asyncio await different coroutines on the same socket?

2018-10-03 Thread Léo El Amri via Python-list
Hello Russell,

On 03/10/2018 15:44, Russell Owen wrote:
> Using asyncio I am looking for a simple way to await multiple events where 
> notification comes over the same socket (or other serial stream) in arbitrary 
> order. For example, suppose I am communicating with a remote device that can 
> run different commands simultaneously and I don't know which command will 
> finish first. I want to do this:
> 
> coro1 = start(command1)
> coro2 = start(command2)
> asyncio.gather(coro1, coro2)
> 
> where either command may finish first. I’m hoping for a simple and 
> idiomatic way to read the socket and tell each coroutine it is done. So far 
> everything I have come up with is ugly, using multiple layers of "async 
> def”, keeping a record of Tasks that are waiting and calling "set_result" 
> on those Tasks when finished. Also Task isn’t even documented to have the 
> set_result method (though "future" is)
I don't really get what you want to achieve. Do you want to signal the other
coroutines that one of them finished?

From what I understand, you want to have several coroutines reading on
the same socket "simultaneously", and you want to stop all of them once
one of them is finished. Am I getting it right ?

-- 
Léo


asyncio await different coroutines on the same socket?

2018-10-03 Thread Russell Owen

Using asyncio I am looking for a simple way to await multiple events where 
notification comes over the same socket (or other serial stream) in arbitrary 
order. For example, suppose I am communicating with a remote device that can 
run different commands simultaneously and I don't know which command will 
finish first. I want to do this:

coro1 = start(command1)
coro2 = start(command2)
asyncio.gather(coro1, coro2)

where either command may finish first. I’m hoping for a simple and 
idiomatic way to read the socket and tell each coroutine it is done. So far 
everything I have come up with is ugly, using multiple layers of "async 
def”, keeping a record of Tasks that are waiting and calling "set_result" 
on those Tasks when finished. Also Task isn’t even documented to have the 
set_result method (though "future" is)

Is there a simple, idiomatic way to do this?

-- Russell




[issue30773] async generator receives wrong value when shared between coroutines

2018-09-22 Thread Karthikeyan Singaravelan


Change by Karthikeyan Singaravelan :


--
nosy: +xtreak

___
Python tracker 

___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue29471] AST: add an attribute to FunctionDef to distinguish functions from generators and coroutines

2018-09-19 Thread STINNER Victor


STINNER Victor  added the comment:

> I'm not sure we need this feature TBH.

Ok, I close the issue.

--
resolution:  -> wont fix
stage:  -> resolved
status: open -> closed




[issue34701] Asyncio documentation for recursive coroutines is lacking

2018-09-15 Thread Azaria Zornberg


Azaria Zornberg  added the comment:

Ah, thanks for the clarification!

I first encountered this when having some issues with converting large objects 
to json. json.dumps happens synchronously, and when executed on an object that 
was dozens of MB in size, it held up everything for a fair bit of time.
I tried to solve it by recursively running json.dumps on smaller pieces of the 
thing being converted to json. And that was when I realized that this still 
wasn't letting other things get scheduled.
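One common workaround for a blocking json.dumps is to push it onto a thread with run_in_executor so the loop keeps scheduling other coroutines. A minimal sketch, assuming Python 3.7+ (the helper name is invented; note also that CPython's C JSON encoder holds the GIL, so a thread only helps so much and a process pool is the heavier alternative):

```python
import asyncio
import json

async def dumps_offloaded(obj):
    # Run json.dumps in the default thread pool so the event loop thread stays free.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, json.dumps, obj)

async def main():
    big = {"rows": list(range(100_000))}
    # Other coroutines (here just a sleep) get scheduled while the dump runs.
    encoded, _ = await asyncio.gather(dumps_offloaded(big), asyncio.sleep(0))
    return json.loads(encoded) == big

print(asyncio.run(main()))  # True
```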

When I looked for examples online, I didn't see any of a recursive asyncio 
coroutine, which is why I assumed the recursion was the issue.


Any advice on better ways to phrase the documentation are greatly appreciated! 
Alternatively, it sounds like you have a much better understanding of this than 
I do, so I'm happy to defer to whatever you believe is the correct way to 
document this. Thanks for the help!

--




[issue34701] Asyncio documentation for recursive coroutines is lacking

2018-09-15 Thread Yury Selivanov


Yury Selivanov  added the comment:

The issue here is not the recursion,  but rather about the fact that coroutines 
should actually await on IO or other activity in order for the event loop to 
run them cooperatively.  E.g.

   async def foo():
       await foo()

doesn't really do anything except calling itself, whereas

   async def foo():
       await asyncio.sleep(0)
       await foo()

asks the event loop to sleep for a moment and then recurses into itself.

I'm OK with better clarifying this in the asyncio-dev.rst file.
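A small runnable contrast of the two shapes (a sketch, assuming Python 3.7+): recurse() yields to the loop before each recursive call, so ticker() gets interleaved with it; drop the asyncio.sleep(0) and the whole recursion would run before ticker() ever starts.

```python
import asyncio

order = []

async def recurse(n):
    if n == 0:
        return
    order.append(f"recurse {n}")
    await asyncio.sleep(0)   # yield to the event loop before recursing
    await recurse(n - 1)

async def ticker():
    for _ in range(3):
        order.append("ticker")
        await asyncio.sleep(0)

async def main():
    order.clear()
    await asyncio.gather(recurse(3), ticker())
    return order

print(asyncio.run(main()))
# ['recurse 3', 'ticker', 'recurse 2', 'ticker', 'recurse 1', 'ticker']
```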

--




[issue34701] Asyncio documentation for recursive coroutines is lacking

2018-09-15 Thread Azaria Zornberg


Change by Azaria Zornberg :


--
keywords: +patch
pull_requests: +8762
stage:  -> patch review




[issue34701] Asyncio documentation for recursive coroutines is lacking

2018-09-15 Thread Azaria Zornberg


New submission from Azaria Zornberg :

When an asynchronous coroutine in asyncio awaits or yields from itself, any 
call to the function is executed somewhat synchronously.

Once the recursive coroutine begins, if it never awaits any other coroutines 
besides itself, nothing else will be scheduled to run until it has completely 
finished recursively calling itself and returning.
However, if it ever awaits a different coroutine (even something as small as 
asyncio.sleep(0)) then other coroutines will be scheduled to run.

It seems, from other documentation, that this is intentional. Other 
documentation sort of dances around the specifics of how coroutines work with 
recursion, and only examples of coroutines yielding from each other recursively 
are provided.

However, this behavior is never explicitly called out. This is confusing for 
people who write a recursive asyncio coroutine and are perplexed by why it 
seems to execute synchronously, assuming they ever notice.

I've attached a short script that can be run to exhibit the behavior.
A PR is going to be filed shortly against the python 3.7 branch (as the 
documentation page for asyncio in 3.8 does not fully exist right now).

--
assignee: docs@python
components: Documentation, asyncio
files: asyncio_await_from_self_example.py
messages: 325468
nosy: asvetlov, azaria.zornberg, docs@python, yselivanov
priority: normal
severity: normal
status: open
title: Asyncio documentation for recursive coroutines is lacking
type: enhancement
versions: Python 3.4, Python 3.5, Python 3.6, Python 3.7
Added file: https://bugs.python.org/file47805/asyncio_await_from_self_example.py




[issue33918] Hooking into pause/resume of iterators/coroutines

2018-06-27 Thread Liran Nuna


Liran Nuna  added the comment:

> That's still doable with contextvars. You just need a custom mock-like object 
> (or library) that stores its settings/state in a context variable.

contextvars only work with asyncio, what about the iterator case?

In addition, you can't possibly expect authors to re-implement a library just 
because it may or may not be used with asyncio. In my example, re-implementing 
mock/patch is quite a task just to get such basic functionality.

In other words, contextvars don't solve this issue, it just creates new issues 
to solve and causes code duplication.

--




[issue33918] Hooking into pause/resume of iterators/coroutines

2018-06-20 Thread Yury Selivanov


Yury Selivanov  added the comment:

> Imagine the context manager is mock.patch used in testing and you want to run 
> two tests in "parallel", each with a different mocked method. mock.patch 
> isn't aware of `await` so patching will be incorrect.

That's still doable with contextvars. You just need a custom mock-like object 
(or library) that stores its settings/state in a context variable.

Now, the "mock" module doesn't provide this functionality out of the box, but I 
hope that somebody will come up with a new mock library that will work that way 
(or with a new mock primitive) after 3.7.0 is released.

Adding __pause__ and __resume__ was considered in PEP 521, and it was decided 
that the actual implementation will be too slow and complex.  It's very 
unlikely that PEP 521 is ever accepted.
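As a sketch of what such a context-variable-backed patch could look like (all names here are invented; this is not the mock API): each Task started by gather() runs in its own copy of the context, so a patch applied in one task never leaks into another.

```python
import asyncio
import contextvars

# The patchable behaviour lives in a ContextVar instead of a module global.
_fetch_impl = contextvars.ContextVar("fetch_impl", default=lambda: "real")

def fetch():
    return _fetch_impl.get()()

class ctx_patch:
    """Async-friendly patch: only the current context sees the replacement."""

    def __init__(self, replacement):
        self._replacement = replacement

    def __enter__(self):
        self._token = _fetch_impl.set(self._replacement)
        return self

    def __exit__(self, *exc):
        _fetch_impl.reset(self._token)

async def patched_test():
    with ctx_patch(lambda: "mock"):
        await asyncio.sleep(0)  # other tasks run here, unaffected by our patch
        return fetch()

async def unpatched_test():
    return fetch()

async def main():
    # Each task created by gather() gets its own copy of the current context.
    return await asyncio.gather(patched_test(), unpatched_test())

print(asyncio.run(main()))  # ['mock', 'real']
```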

--




[issue33918] Hooking into pause/resume of iterators/coroutines

2018-06-20 Thread Liran Nuna


Liran Nuna  added the comment:

> You should try to use the contextvars module that was specifically created to 
> handle local context state (for tasks & coroutines).

Yury, from my original report:

> I'm aware that this particular problem could be solved with the new context 
> variables introduced with python3.7, however it is just a simplification of 
> our actual issue.

Not everything can use context managers. Imagine the context manager is 
mock.patch used in testing and you want to run two tests in "parallel", each 
with a different mocked method. mock.patch isn't aware of `await` so patching 
will be incorrect.

Those are just some behaviors where context variables don't solve the issue I'm 
describing.

--




[issue33918] Hooking into pause/resume of iterators/coroutines

2018-06-20 Thread Yury Selivanov


Yury Selivanov  added the comment:

You should try to use the contextvars module that was specifically created to 
handle local context state (for tasks & coroutines).

--




[issue33918] Hooking into pause/resume of iterators/coroutines

2018-06-20 Thread Liran Nuna


Liran Nuna  added the comment:

I would like to stress that this issue happens with iterators as well; it is 
not unique to asyncio. 

I would like to propose four new magic methods for context managers to solve 
this: __pause__, __resume__, __apause__ and __aresume__ which will be called 
before/after a pause/resume happen before the coroutine/iterator continues.

I'm not sure however if this is the correct venue for such discussion.

--




[issue33918] Hooking into pause/resume of iterators/coroutines

2018-06-20 Thread Ned Deily


Change by Ned Deily :


--
components: +asyncio -Interpreter Core
nosy: +asvetlov, yselivanov
versions:  -Python 3.4, Python 3.5




[issue33918] Hooking into pause/resume of iterators/coroutines

2018-06-20 Thread Liran Nuna


New submission from Liran Nuna :

An interesting property of async programming is that execution order is 
nondeterministic and async functions "pause" and "resume" execution as events 
come in.

This can play havoc with context managers, especially ones that wrap a global 
state change. I can best explain it with code - see attached file.

If you were to run this, you'd notice that "Should be logged" does not get 
logged - this is because the execution order runs the context manager 
immediately and that affects the entire batch (created by asyncio.gather).

Is there a way to hook into a pause/resume handling of coroutines so this kind 
of thing could be done correctly? I'm aware that this particular problem could 
be solved with the new context variables introduced with python3.7, however it 
is just a simplification of our actual issue.

Iterators also suffer from this issue, as `yield` pauses and resumes execution.
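A minimal reproduction of the leak described above (a sketch; the names are invented): a context manager that flips a module-level global is visible to every coroutine that happens to run while the owning coroutine is paused inside the `with` block.

```python
import asyncio

VERBOSE = False  # module-level state the context manager flips
seen = []

class set_verbose:
    def __enter__(self):
        global VERBOSE
        self._saved, VERBOSE = VERBOSE, True

    def __exit__(self, *exc):
        global VERBOSE
        VERBOSE = self._saved

async def noisy_task():
    with set_verbose():
        await asyncio.sleep(0)  # pausing here leaks VERBOSE=True to other tasks
        seen.append(("noisy", VERBOSE))

async def quiet_task():
    # Runs while noisy_task is paused inside its "with": sees the leaked True.
    seen.append(("quiet", VERBOSE))

async def main():
    seen.clear()
    await asyncio.gather(noisy_task(), quiet_task())
    return seen

print(asyncio.run(main()))  # [('quiet', True), ('noisy', True)]
```

The quiet task never entered the context manager, yet it observes the patched state; this is the scheduling hazard the report describes.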

--
components: Interpreter Core
files: async_context_managers.py
messages: 320101
nosy: Liran Nuna
priority: normal
severity: normal
status: open
title: Hooking into pause/resume of iterators/coroutines
type: behavior
versions: Python 3.4, Python 3.5, Python 3.6, Python 3.7, Python 3.8
Added file: https://bugs.python.org/file47646/async_context_managers.py




[issue30773] async generator receives wrong value when shared between coroutines

2018-06-06 Thread Yury Selivanov


Change by Yury Selivanov :


--
keywords: +patch
pull_requests: +7091
stage:  -> patch review




[issue33672] Fix Task.__repr__ crash when trying to format Cython's bogus coroutines

2018-05-29 Thread Yury Selivanov


Yury Selivanov  added the comment:

Yes, thanks Ned

--
resolution:  -> fixed
stage: patch review -> resolved
status: open -> closed




[issue33672] Fix Task.__repr__ crash when trying to format Cython's bogus coroutines

2018-05-29 Thread Ned Deily


Ned Deily  added the comment:

Can we close this now?

--




[issue33672] Fix Task.__repr__ crash when trying to format Cython's bogus coroutines

2018-05-28 Thread Yury Selivanov


Yury Selivanov  added the comment:


New changeset e151f83deab9819fb8d9dfc59f9baa4a7273226c by Yury Selivanov in 
branch '3.6':
bpo-33672: Fix Task.__repr__ crash with Cython's bogus coroutines (GH-7180)
https://github.com/python/cpython/commit/e151f83deab9819fb8d9dfc59f9baa4a7273226c


--




[issue33672] Fix Task.__repr__ crash when trying to format Cython's bogus coroutines

2018-05-28 Thread Yury Selivanov


Yury Selivanov  added the comment:

Ned, this one would be nice to have in 3.7.0.

--
nosy: +ned.deily
priority: normal -> release blocker




[issue33672] Fix Task.__repr__ crash when trying to format Cython's bogus coroutines

2018-05-28 Thread Yury Selivanov


Change by Yury Selivanov :


--
pull_requests: +6815




[issue33672] Fix Task.__repr__ crash when trying to format Cython's bogus coroutines

2018-05-28 Thread Yury Selivanov


Yury Selivanov  added the comment:


New changeset 075c662086859f864aa1179f57367aa470ee6335 by Yury Selivanov (Miss 
Islington (bot)) in branch '3.7':
bpo-33672: Fix Task.__repr__ crash with Cython's bogus coroutines (GH-7161) 
(GH-7173)
https://github.com/python/cpython/commit/075c662086859f864aa1179f57367aa470ee6335


--




[issue33672] Fix Task.__repr__ crash when trying to format Cython's bogus coroutines

2018-05-28 Thread Ned Deily


Ned Deily  added the comment:

bug fix, go for it.

--




[issue33672] Fix Task.__repr__ crash when trying to format Cython's bogus coroutines

2018-05-28 Thread miss-islington


Change by miss-islington :


--
pull_requests: +6808




[issue33672] Fix Task.__repr__ crash when trying to format Cython's bogus coroutines

2018-05-28 Thread Yury Selivanov


New submission from Yury Selivanov :


New changeset 989b9e0e6d7dd2fa911f9bfd4744e7f3a82d6006 by Yury Selivanov in 
branch 'master':
bpo-33672: Fix Task.__repr__ crash with Cython's bogus coroutines (GH-7161)
https://github.com/python/cpython/commit/989b9e0e6d7dd2fa911f9bfd4744e7f3a82d6006


--




[issue33672] Fix Task.__repr__ crash when trying to format Cython's bogus coroutines

2018-05-28 Thread Yury Selivanov

Change by Yury Selivanov :


--
keywords: +patch
pull_requests: +6796
stage:  -> patch review




[issue33672] Fix Task.__repr__ crash when trying to format Cython's bogus coroutines

2018-05-28 Thread Yury Selivanov

Change by Yury Selivanov <yseliva...@gmail.com>:


--
components: asyncio
nosy: asvetlov, yselivanov
priority: normal
severity: normal
status: open
title: Fix Task.__repr__ crash when trying to format Cython's bogus coroutines
type: behavior
versions: Python 3.6, Python 3.7, Python 3.8




[issue30773] async generator receives wrong value when shared between coroutines

2018-05-24 Thread Yury Selivanov

Yury Selivanov  added the comment:

Thanks, I'll look into adding ag_running properly.

--




[issue30773] async generator receives wrong value when shared between coroutines

2018-05-24 Thread Nathaniel Smith

Nathaniel Smith  added the comment:

My thoughts: https://bugs.python.org/issue32526#msg309783

--




[issue30773] async generator receives wrong value when shared between coroutines

2018-05-24 Thread Yury Selivanov

Change by Yury Selivanov :


--
assignee:  -> yselivanov
components: +Interpreter Core -asyncio
priority: normal -> high
versions: +Python 3.8



