[Python-ideas] Re: Entrypoint function for modules (AKA if __name__ == '__main__' ) with built-in argument parsing

2019-07-30 Thread Andrew Barnert via Python-ideas
On Jul 30, 2019, at 21:18, Guido van Rossum  wrote:
> 
> The problem with recommending click or fire is that it doesn't look to me 
> like any of those are category killers the way requests became years ago.

Yeah, that’s true.

And honestly, I can’t imagine how any of them _could_ become a category killer, 
because the category is just so broad; there are too many different things 
people want to be simpler or more automated than argparse, and those wants 
sometimes even conflict with each other. (For example, fire calling 
literal_eval on args is great for a lot of quick&dirty scripts, but would be 
disastrous for many other programs.)
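
To make that concrete, here is a minimal sketch — not fire’s actual code, just 
the stdlib’s ast.literal_eval, which is roughly the behaviour in question:

import ast

def parse_arg(value):
    # Try to interpret the CLI argument as a Python literal,
    # falling back to the raw string -- handy for quick scripts.
    try:
        return ast.literal_eval(value)
    except (ValueError, SyntaxError):
        return value

print(repr(parse_arg('42')))    # 42 -- an int, convenient
print(repr(parse_arg('None')))  # None -- surprising if you wanted the string 'None'
print(repr(parse_arg('1e3')))   # 1000.0 -- surprising if '1e3' was meant literally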

So… never mind.
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/SGHV3KSUMPC6YLP2JHJAK3JR7QXY52Y2/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: asyncio: futures and tasks with synchronous callbacks

2019-07-30 Thread Guido van Rossum
I wonder if Nathaniel has something to add? Trio has a different approach to
cancellation. Of course it is also an entirely new library...

On Tue, Jul 30, 2019 at 9:51 PM  wrote:

> Oh, only now does it appear in the list! I thought the post hadn't gone through, so
> I posted again :/.
>
> I've fixed my "library" (
> https://github.com/aure-olli/aiokafka/blob/3acb88d6ece4502a78e230b234f47b90b9d30fd5/syncio.py),
> and the `wrapped_consumer2` function. There is no longer a double await, so no
> risk of cancellation afterward.
>
> stop_future = asyncio.Future()
> async def wrapped_consumer2():
>     task = asyncio.ensure_future(consume_data())
>     try:
>         await asyncio.wait([task, stop_future])
>     finally:
>         task.cancel()
>         if not task.cancelled():
>             return task.result()
>         else:
>             raise RuntimeError('stopped')
>
> Or
>
> import syncio
>
> async def wrapped_consumer():
>     task = syncio.ensure_sync_future(consume_data())
>     return await task
>
> stop_future = asyncio.Future()
> async def wrapped_consumer2():
>     task = syncio.ensure_sync_future(consume_data())
>     try:
>         await syncio.sync_wait([task, stop_future])
>     finally:
>         task.cancel()
>         if not task.cancelled():
>             return task.result()
>         else:
>             raise RuntimeError('stopped')
>
> My only concern is that consistent cancellation state is currently almost
> impossible with futures and tasks. The only two possibilities are either to
> ignore cancellation (highly counterintuitive to use), or to directly
> manipulate the coroutine as a generator (basically rewriting asyncio).
>
> It could be another library for sure, but the current state of asyncio makes
> it really hard to reuse. So it would mean copy-pasting the whole library
> while changing a few lines here and there.
> ___
> Python-ideas mailing list -- python-ideas@python.org
> To unsubscribe send an email to python-ideas-le...@python.org
> https://mail.python.org/mailman3/lists/python-ideas.python.org/
> Message archived at
> https://mail.python.org/archives/list/python-ideas@python.org/message/4GROXASMFVVRG3UDB4LVMDXOQPU3KH5V/
> Code of Conduct: http://python.org/psf/codeofconduct/
>
-- 
--Guido (mobile)
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/VMRGZQT6TX5TPEB2OKLLRMREW3SZCXNF/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: asyncio: futures and tasks with synchronous callbacks

2019-07-30 Thread aurelien . lambert . 89
Oh, only now does it appear in the list! I thought the post hadn't gone through, so I 
posted again :/.

I've fixed my "library" 
(https://github.com/aure-olli/aiokafka/blob/3acb88d6ece4502a78e230b234f47b90b9d30fd5/syncio.py),
 and the `wrapped_consumer2` function. There is no longer a double await, so no 
risk of cancellation afterward.

stop_future = asyncio.Future()
async def wrapped_consumer2():
    task = asyncio.ensure_future(consume_data())
    try:
        await asyncio.wait([task, stop_future])
    finally:
        task.cancel()
        if not task.cancelled():
            return task.result()
        else:
            raise RuntimeError('stopped')

Or

import syncio

async def wrapped_consumer():
    task = syncio.ensure_sync_future(consume_data())
    return await task

stop_future = asyncio.Future()
async def wrapped_consumer2():
    task = syncio.ensure_sync_future(consume_data())
    try:
        await syncio.sync_wait([task, stop_future])
    finally:
        task.cancel()
        if not task.cancelled():
            return task.result()
        else:
            raise RuntimeError('stopped')

My only concern is that consistent cancellation state is currently almost 
impossible with futures and tasks. The only two possibilities are either to 
ignore cancellation (highly counterintuitive to use), or to directly 
manipulate the coroutine as a generator (basically rewriting asyncio).

It could be another library for sure, but the current state of asyncio makes it 
really hard to reuse. So it would mean copy-pasting the whole library while 
changing a few lines here and there.
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/4GROXASMFVVRG3UDB4LVMDXOQPU3KH5V/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] asyncio futures and tasks with synchronous callbacks

2019-07-30 Thread Aurélien Lambert
In asyncio, when a task awaits another task (or future), it can be
cancelled right after the awaited task finishes. Thus, if the awaited task
has consumed data, the data is lost.

For instance, with the following code:

import asyncio

available_data = []
data_ready = asyncio.Future()

def feed_data(data):
    global data_ready
    available_data.append(data)
    data_ready.set_result(None)
    data_ready = asyncio.Future()

async def consume_data():
    while not available_data:
        await asyncio.shield(data_ready)
    return available_data.pop()

async def wrapped_consumer():
    task = asyncio.ensure_future(consume_data())
    return await task

If I perform those exact steps:

async def test():
    task = asyncio.ensure_future(wrapped_consumer())
    await asyncio.sleep(0)
    feed_data('data')
    await asyncio.sleep(0)
    task.cancel()
    await asyncio.sleep(0)
    print('task', task)
    print('available_data', available_data)

loop = asyncio.get_event_loop()
loop.run_until_complete(test())

Then I can see that the task has been cancelled despite the data being
consumed. Since the result of `wrapped_consumer` cannot be retrieved, the
data is forever lost.

task :17>>
available_data []

This side effect does not happen when awaiting a coroutine, but coroutines
are not as flexible as tasks (unless manipulated as a generator). It
happens when awaiting a `Future`, a `Task`, or any function like
`asyncio.wait`, `asyncio.wait_for` or `asyncio.gather` (which all inherit
from or use `Future`). There is then no way to do anything equivalent to:

stop_future = asyncio.Future()
async def wrapped_consumer2():
    task = asyncio.ensure_future(consume_data())
    try:
        await asyncio.wait([task, stop_future])
    finally:
        task.cancel()
        if not task.cancelled():
            return task.result()
        else:
            raise RuntimeError('stopped')
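
For contrast, the coroutine-only case mentioned above — a minimal sketch, with a
hypothetical name `wrapped_consumer_coro` — does not lose the data, because the
pop in `consume_data` and the return happen in the same task:

async def wrapped_consumer_coro():
    # Await the coroutine directly instead of wrapping it in a Task:
    # cancelling the caller throws CancelledError into consume_data()
    # itself, before it has popped anything.
    return await consume_data()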

This is due to the Future calling the callback asynchronously:
https://github.com/python/cpython/blob/3.6/Lib/asyncio/futures.py#L214

for callback in callbacks:
    self._loop.call_soon(callback, self)
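
As a rough sketch of the idea only — it assumes the pure-Python `Future`
implementation (the C-accelerated one does not route completion through an
overridable method), and the real `syncio` linked below also has to patch
`Task._step` and the wait/gather helpers:

import asyncio
from asyncio import futures

# Assumption: use the pure-Python Future if available, since _asyncio.Future
# does not call an overridden _schedule_callbacks.
_BaseFuture = getattr(futures, '_PyFuture', asyncio.Future)

class SyncFuture(_BaseFuture):
    def _schedule_callbacks(self):
        # Run the done-callbacks immediately instead of deferring them with
        # loop.call_soon(), closing the window in which a waiter can be
        # cancelled after this future has already completed.
        callbacks = self._callbacks[:]
        if not callbacks:
            return
        self._callbacks[:] = []
        for callback in callbacks:
            callback(self)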

I propose to create synchronous versions of those, or a
`synchronous_callback` parameter, that makes the callbacks of `Future`
synchronous. I've experimented with a simple library `syncio` on CPython 3.6
to do this (it is harder to patch later versions due to the massive use of
private methods).
Basically, one needs to:
1) replace the `Future._schedule_callbacks` method with a synchronous version
2) fix `Task._step` so it does not fail when cleaning `_current_tasks` (
https://github.com/python/cpython/blob/3.6/Lib/asyncio/tasks.py#L245)
3) rewrite all the functions to use synchronous futures instead of normal
ones

With that library, the previous functions become possible and intuitive:

import syncio

async def wrapped_consumer():
    task = syncio.ensure_sync_future(consume_data())
    return await task

stop_future = asyncio.Future()
async def wrapped_consumer2():
    task = syncio.ensure_sync_future(consume_data())
    try:
        await syncio.sync_wait([task, stop_future])
    finally:
        task.cancel()
        if not task.cancelled():
            return task.result()
        else:
            raise RuntimeError('stopped')

No need to use `syncio` anywhere else in the code, which makes it totally
transparent for the end user. `wrapped_consumer` and `wrapped_consumer2`
are now cancelled if and only if the data hasn't been consumed, whatever the
order of the steps (and regardless of the presence of `asyncio.sleep`).

This "library" can be found here:
https://github.com/aure-olli/aiokafka/blob/3acb88d6ece4502a78e230b234f47b90b9d30fd5/syncio.py
It implements `SyncFuture`, `SyncTask`, `ensure_sync_future`, `sync_wait`,
`sync_wait_for`, `sync_gather` and `sync_shield`. It works with CPython 3.6.

To conclude:
- asynchronous callbacks are preferable in most cases, but do not provide a
coherent cancelled status in specific cases
- implementing a version with synchronous callbacks (or a
`synchronous_callback` parameter) is rather easy (however, step 2 needs to be
clarified; there is probably a cleaner way to fix it)
- it is totally transparent for the end user, as synchronous callbacks are
totally compatible with asynchronous ones
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/NQ2BO4INGNSFYN2LAKWI2ISGUPEREI5L/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Entrypoint function for modules (AKA if __name__ == '__main__' ) with built-in argument parsing

2019-07-30 Thread Guido van Rossum
Ouch, the tutorial definitely should no longer recommend getopt (or
optparse) and just stick to argparse (or sys.argv[1:] if you want simple
:-). I filed newcomer-friendly issue https://bugs.python.org/issue37726.

The problem with recommending click or fire is that it doesn't look to me
like any of those are category killers the way requests became years ago.

On Tue, Jul 30, 2019 at 9:07 PM Andrew Barnert via Python-ideas <
python-ideas@python.org> wrote:

> On Jul 30, 2019, at 20:49, Andrew Barnert via Python-ideas <
> python-ideas@python.org> wrote:
> >
> > There are lots of great third-party libraries that make turning a
> function into a command-line utility a lot easier than using argparse. I
> think whenever you want anything more than argv and don’t want argparse,
> you should probably just use one of those libraries.
>
> That being said, maybe the docs should make that recommendation somewhere?
>
> For example, just as urllib.request points you to requests if you want a
> higher-level HTTP client, argparse could point you to click if you want
> something a little more automated and/or fire if you want something dead
> simple for a quick&dirty script. (Not necessarily those two; those are just
> the ones I’m most familiar with…)
>
> Or maybe there could be a mention in the tutorial, in the section on
> command-line arguments, which currently mentions getopt as the simple
> choice and argparse as the fancy one, but maybe it could instead mention
> fire, click, and argparse. (The getopt library is only really good when
> you’re familiar with C getopt and happen to be thinking in exactly those
> terms—as its own docs say up-top. So I’m not sure it’s a good idea for the
> tutorial to even mention it…)
> ___
> Python-ideas mailing list -- python-ideas@python.org
> To unsubscribe send an email to python-ideas-le...@python.org
> https://mail.python.org/mailman3/lists/python-ideas.python.org/
> Message archived at
> https://mail.python.org/archives/list/python-ideas@python.org/message/OEXGSBYWG5JGR4G2ELFVJUGUVWWIN52Q/
> Code of Conduct: http://python.org/psf/codeofconduct/
>


-- 
--Guido van Rossum (python.org/~guido)
*Pronouns: he/him/his **(why is my pronoun here?)*

___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/NFUMB4RA3AKQAPHRMKVNMDDDNP6T2GMT/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Entrypoint function for modules (AKA if __name__ == '__main__' ) with built-in argument parsing

2019-07-30 Thread Andrew Barnert via Python-ideas
On Jul 30, 2019, at 20:49, Andrew Barnert via Python-ideas 
 wrote:
> 
> There are lots of great third-party libraries that make turning a function 
> into a command-line utility a lot easier than using argparse. I think 
> whenever you want anything more than argv and don’t want argparse, you should 
> probably just use one of those libraries.

That being said, maybe the docs should make that recommendation somewhere?

For example, just as urllib.request points you to requests if you want a 
higher-level HTTP client, argparse could point you to click if you want 
something a little more automated and/or fire if you want something dead simple 
for a quick&dirty script. (Not necessarily those two; those are just the ones 
I’m most familiar with…)

Or maybe there could be a mention in the tutorial, in the section on 
command-line arguments, which currently mentions getopt as the simple choice 
and argparse as the fancy one, but maybe it could instead mention fire, click, 
and argparse. (The getopt library is only really good when you’re familiar with 
C getopt and happen to be thinking in exactly those terms—as its own docs say 
up-top. So I’m not sure it’s a good idea for the tutorial to even mention it…)
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/OEXGSBYWG5JGR4G2ELFVJUGUVWWIN52Q/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Entrypoint function for modules (AKA if __name__ == '__main__' ) with built-in argument parsing

2019-07-30 Thread Andrew Barnert via Python-ideas
On Jul 30, 2019, at 15:21, agustinscaramu...@gmail.com wrote:
> 
> Maybe the def __main__() argument is already a dead horse, given the number 
> of discussions it has created that have ended nowhere, but I think one 
> argument in favour of its implementation would be including argument parsing 
> in it, for example:
> 
> # main.py
> def __run__(first_num, second_num, print_operation=False):

Couldn’t you get nearly the same effect in Python today just by adding one line:

__run__(*sys.argv[1:])

That doesn’t do any fancy parsing of the arguments, but, except for the -h 
magic, neither does your proposal.
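
Spelled out, a minimal sketch of what I mean (same `__run__` body as your
example, just invoked explicitly):

# main.py -- no new syntax, just one extra line at the bottom
import sys

def __run__(first_num, second_num, print_operation=False):
    result = int(first_num) + int(second_num)
    if print_operation:
        print(f'{first_num} + {second_num} = {result}')
    else:
        print(result)

if __name__ == '__main__':
    # Any third argument -- '--verbose', '--quiet', anything -- is a
    # truthy string, which is exactly the hack discussed below.
    __run__(*sys.argv[1:])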

Also:

> $ python main.py -h

Does --help also work? (It’s pretty weird that you use a long arg for verbose 
but a short arg for help.)

> $ python main.py 1 2 --verbose
> 1 + 2 = 3

But if I run the same thing in the more traditional way:

$ python main.py --verbose 1 2

… I’ll get an error, because the first argument isn’t an int and can’t be added 
to 1.

Also:

$ python main.py 1 2 --quiet

… does the same thing as --verbose, which is pretty misleading.

Not that I haven’t done the quick hacky “this script isn’t going to be used by 
anyone but me, and only for the next two days, and it only needs one flag, so 
just assume any third arg is that flag” thing. But that’s hardly something you 
want a language feature to actively encourage.

There are lots of great third-party libraries that make turning a function into 
a command-line utility a lot easier than using argparse. I think whenever you 
want anything more than argv and don’t want argparse, you should probably just 
use one of those libraries.

___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/IE7OLODPNLBMCZO3DBACOF3GCGLAD3V7/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Entrypoint function for modules (AKA if __name__ == '__main__' ) with built-in argument parsing

2019-07-30 Thread Ryan Gonzalez
I think you might find plac[1] and fire[2] rather interesting. I do feel an
explicit __run__ would go against the spirit of "explicit is better than
implicit" a bit...

[1] https://micheles.github.io/plac/
[2] https://github.com/google/python-fire


On Tue, Jul 30, 2019, 8:03 PM  wrote:

> Maybe the def __main__() argument is already a dead horse, given the
> number of discussions it has created that have ended nowhere, but I think
> one argument in favour of its implementation would be including argument
> parsing in it, for example:
>
> # main.py
> def __run__(first_num, second_num, print_operation=False):
>     """
>     Adds two numbers.
>
>     positional arguments:
>     - first_num: the first number.
>     - second_num: the second number.
>
>     optional arguments:
>     - print_operation: prints the sum operation to stdout.
>     """
>     result = int(first_num) + int(second_num)
>     if print_operation:
>         print(f'{first_num} + {second_num} = {result}')
>     else:
>         print(result)
>
> $ python main.py -h
> Adds two numbers.
>
> positional parameters:
> - first_num: the first number.
> - second_num: the second number.
>
> optional arguments:
> - print_operation: prints the sum operation to stdout.
>
> $ python main.py 1 2 --verbose
> 1 + 2 = 3
>
> $ python main.py 1 2
> 3
>
> The -h flag would print the function docstring. We could also add type
> hints for casting (I'm not entirely sure how feasible this is):
>
> def __run__(first_num: int, second_num: int, print_operation=False):
>     ...
>     result = first_num + second_num  # no need for casting here
>     ...
>
> Since __main__ is already a built-in function, I abstained from using it
> as the designated function name (I picked __run__, but any other
> suggestions would be welcome, also thought of __entrypoint__, __exec__).
> Thoughts?
> ___
> Python-ideas mailing list -- python-ideas@python.org
> To unsubscribe send an email to python-ideas-le...@python.org
> https://mail.python.org/mailman3/lists/python-ideas.python.org/
> Message archived at
> https://mail.python.org/archives/list/python-ideas@python.org/message/IKTXZMTKGXWJB2IUZCWZNJEQARTAMWYB/
> Code of Conduct: http://python.org/psf/codeofconduct/
>
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/I6JQ2JL2SKH7BFCSLGT235FEIQMX2VTK/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Entrypoint function for modules (AKA if __name__ == '__main__' ) with built-in argument parsing

2019-07-30 Thread agustinscaramuzza
Maybe the def __main__() argument is already a dead horse, given the number of 
discussions it has created that have ended nowhere, but I think one argument in 
favour of its implementation would be including argument parsing in it, for 
example:

# main.py
def __run__(first_num, second_num, print_operation=False):
    """
    Adds two numbers.

    positional arguments:
    - first_num: the first number.
    - second_num: the second number.

    optional arguments:
    - print_operation: prints the sum operation to stdout.
    """
    result = int(first_num) + int(second_num)
    if print_operation:
        print(f'{first_num} + {second_num} = {result}')
    else:
        print(result)

$ python main.py -h
Adds two numbers.

positional parameters:
- first_num: the first number.
- second_num: the second number.

optional arguments:
- print_operation: prints the sum operation to stdout.

$ python main.py 1 2 --verbose
1 + 2 = 3

$ python main.py 1 2
3

The -h flag would print the function docstring. We could also add type hints 
for casting (I'm not entirely sure how feasible this is):

def __run__(first_num: int, second_num: int, print_operation=False):
    ...
    result = first_num + second_num  # no need for casting here
    ...
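
For what it's worth, a rough sketch of how that casting could already be
emulated with the stdlib today (the helper name call_with_cast is made up, and
it assumes simple positional arguments with constructor-style annotations like
int):

import inspect
import sys

def call_with_cast(func, argv):
    # Pass each command-line string through the matching parameter's
    # annotation (int, float, ...) before calling the function.
    params = list(inspect.signature(func).parameters.values())
    args = []
    for param, value in zip(params, argv):
        if param.annotation is not inspect.Parameter.empty:
            value = param.annotation(value)
        args.append(value)
    return func(*args)

# call_with_cast(__run__, sys.argv[1:]) would hand real ints to __run__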

Since __main__ is already a built-in function, I abstained from using it as the 
designated function name (I picked __run__, but any other suggestions would be 
welcome, also thought of __entrypoint__, __exec__). Thoughts?
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/IKTXZMTKGXWJB2IUZCWZNJEQARTAMWYB/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: for ... except, with ... except

2019-07-30 Thread Eric V. Smith


> On Jul 30, 2019, at 11:38 AM, Guido van Rossum  wrote:
> 
...

> 
>  with connect() as stream:  # connect() or __enter__() can fail.
>  for data in stream:  # __next__() can fail
>  write(data)  # write() can fail
>  
> This very much looks like toy networking code to me -- real networking code 
> is always written in a completely different way, using different abstractions 
> that raise different exceptions, and I don't think it would be written in 
> this style even if `with` and `for` had optional `except` clauses. 
> (Alternatively, you can write little wrappers that turn OSError into 
> different exceptions so you can use a single try/except/except/except 
> statement to handle them all.)

This is what I do: wrappers with custom exceptions. I think it gives the most 
readable code. But I definitely do not need this pattern very often. 
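
Roughly, as a sketch (names made up; connect() and write() are the functions
from the quoted example):

class ConnectError(Exception): pass
class WriteError(Exception): pass

def checked_connect():
    try:
        return connect()          # connect() from the example above
    except OSError as e:
        raise ConnectError(str(e)) from e

def checked_write(data):
    try:
        write(data)               # write() from the example above
    except OSError as e:
        raise WriteError(str(e)) from e

try:
    with checked_connect() as stream:
        for data in stream:
            checked_write(data)
except ConnectError:
    ...  # connection / setup failed
except WriteError:
    ...  # writing the data failed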

Eric

___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/GEWDX7R54SS7KUXUWJVNLKPA6VFZHMMY/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: for ... except, with ... except

2019-07-30 Thread Greg Ewing

Guido van Rossum wrote:
I'm not sure I understand the desire to catch every possible exception 
right where it occurs. I've never felt this need somehow.


This is my feeling too. I can't remember having a need for such
needle-sharp targeting of exception catching, and if I ever did,
it would be such a rare thing that I wouldn't mind writing things
in a different way to achieve it.

--
Greg
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/UDQ7MBX3VAZDWWSWWSE2HBUOQMDG3SLK/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: for ... except, with ... except

2019-07-30 Thread Greg Ewing

Something I don't like about these kinds of proposals is that
the except clause is far removed from the code that it covers,
hurting readability. By the time you get to the end of a big
for-loop or with-statement and see an "except", it's easy to
forget that it isn't attached to an ordinary try-statement.

I would be happier if there were some way to get the except
clause at the top instead of the bottom, but it's hard to
think of a nice way to do that.

--
Greg
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/E24GNVZXY3MH4FNIYTVBWEAOOTKERMUX/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: for ... except, with ... except

2019-07-30 Thread Guido van Rossum
On Tue, Jul 30, 2019 at 1:13 AM Paul Moore  wrote:

> On Tue, 30 Jul 2019 at 04:00, Ethan Furman  wrote:
> > If this "with...except" leads to more robust code then I think many
> would like to use it, but not if it's confusing as that would lead to less
> robust code.
>
> For me, "it's for robust code" is sufficient hint that I now remember
> what it does (much the same way that "it's for search loops" is the
> hint I need for for...else). And when I'm trying to write robust code,
> I *do* worry about exceptions in the with expression, and the fact
> that I can't easily catch them. So for me, this is definitely a real
> (albeit one I can usually ignore) problem.
>

I'm not sure I understand the desire to catch every possible exception
right where it occurs. I've never felt this need somehow. *Apart from
open()* can someone give me an example from real code where a context
manager is likely to raise an exception that can meaningfully be handled?

I've re-read Serhiy's original example, involving a context manager for
connecting to a socket and a for-loop for reading the data from the socket
and writing it to another one (all three operations that may fail), but I
find it unconvincing. For reference I'm copying it here again:

 with connect() as stream:  # connect() or __enter__() can fail.
 for data in stream:  # __next__() can fail
 write(data)  # write() can fail

This very much looks like toy networking code to me -- real networking code
is always written in a completely different way, using different
abstractions that raise different exceptions, and I don't think it would be
written in this style even if `with` and `for` had optional `except`
clauses. (Alternatively, you can write little wrappers that turn OSError
into different exceptions so you can use a single try/except/except/except
statement to handle them all.)

-- 
--Guido van Rossum (python.org/~guido)
*Pronouns: he/him/his **(why is my pronoun here?)*

___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/LPIW6CQGVCLV2WJC2ECLRMUALSW6ZTL2/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: asyncio: futures and tasks with synchronous callbacks

2019-07-30 Thread Guido van Rossum
It seems inevitable that if you use await twice you risk being cancelled in
between. The solution is to only use a single await to do all the work,
like asyncio.Queue does (and why not use that for your use case?). I don't
think inventing a parallel API of synchronous callbacks is a good idea --
as you say there's a good reason why callbacks are asynchronous, and having
two subtly different ways of doing things seems more confusing than
helpful. Async I/O is already complicated -- let's not make it more so.

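Concretely, for the example quoted below, a sketch of the Queue-based version --
assuming an asyncio.Queue simply replaces the available_data list and the
data_ready future:

import asyncio

queue = asyncio.Queue()

def feed_data(data):
    queue.put_nowait(data)

async def consume_data():
    # A single await does all the work; an item stays in the queue
    # until get() actually retrieves and returns it.
    return await queue.get()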

On Tue, Jul 30, 2019 at 2:28 AM  wrote:

> In asyncio, when a task awaits another task (or future), it can be
> cancelled right after the awaited task finishes (before the callbacks have
> been processed). Thus, if the awaited task has consumed data, the data is
> lost.
>
> For instance, with the following code:
>
> import asyncio
>
> available_data = []
> data_ready = asyncio.Future()
>
> def feed_data(data):
>     global data_ready
>     available_data.append(data)
>     data_ready.set_result(None)
>     data_ready = asyncio.Future()
>
> async def consume_data():
>     while not available_data:
>         await asyncio.shield(data_ready)
>     return available_data.pop()
>
> async def wrapped_consumer():
>     task = asyncio.ensure_future(consume_data())
>     return await task
>
> If I perform those exact steps:
>
> async def test():
>     task = asyncio.ensure_future(wrapped_consumer())
>     await asyncio.sleep(0)
>     feed_data('data')
>     await asyncio.sleep(0)
>     task.cancel()
>     await asyncio.sleep(0)
>     print('task', task)
>     print('available_data', available_data)
>
> loop = asyncio.get_event_loop()
> loop.run_until_complete(test())
>
> Then I can see that the task has been cancelled despite the data being
> consumed. Since the result of `wrapped_consumer` cannot be retrieved, the
> data is forever lost.
>
> task  :17>>
> available_data []
>
> This side effect does not happen when awaiting a coroutine, but coroutines
> are not as flexible as tasks. It happens when awaiting a `Future`, a
> `Task`, or any function like `asyncio.wait`, `asyncio.wait_for` or
> `asyncio.gather` (which all inherit from or use `Future`). There is then no
> way to do anything equivalent to:
>
> stop_future = asyncio.Future()
> async def wrapped_consumer2():
>     task = asyncio.ensure_future(consume_data())
>     try:
>         await asyncio.wait([task, stop_future])
>     finally:
>         task.cancel()
>         await asyncio.wait([task])
>         if not task.cancelled():
>             return task.result()
>         else:
>             raise RuntimeError('stopped')
>
> This is due to the Future calling the callback asynchronously:
> https://github.com/python/cpython/blob/3.6/Lib/asyncio/futures.py#L214
>
> for callback in callbacks:
>     self._loop.call_soon(callback, self)
>
> I propose to create synchronous versions of those, or a
> `synchronous_callback` parameter, that makes the callbacks of `Future`
> synchronous. I've experimented with a simple library `syncio` on CPython 3.6
> to do this (it is harder to patch later versions due to the massive use of
> private methods).
> Basically, one needs to:
> 1) replace the `Future._schedule_callbacks` method with a synchronous version
> 2) fix `Task._step` so it does not fail when cleaning `_current_tasks` (
> https://github.com/python/cpython/blob/3.6/Lib/asyncio/tasks.py#L245)
> 3) rewrite all the functions to use synchronous futures instead of normal
> ones
>
> With that library, the previous functions become possible and intuitive:
>
> import syncio
>
> async def wrapped_consumer():
>     task = syncio.ensure_sync_future(consume_data())
>     return await task
>
> stop_future = asyncio.Future()
> async def wrapped_consumer2():
>     task = syncio.ensure_sync_future(consume_data())
>     try:
>         await syncio.sync_wait([task, stop_future])
>     finally:
>         task.cancel()
>         await asyncio.wait([task])
>         if not task.cancelled():
>             return task.result()
>         else:
>             raise RuntimeError('stopped')
>
> No need to use `syncio` anywhere else in the code, which makes it totally
> transparent for the end user.
>
> This "library" can be found here:
> https://github.com/aure-olli/aiokafka/blob/216b4ad3b4115bc9fa463e44fe23636bd63c5da4/syncio.py
> It implements `SyncFuture`, `SyncTask`, `ensure_sync_future`, `sync_wait`,
> `sync_wait_for`, `sync_gather` and `sync_shield`. It works with CPython 3.6
> only.
>
> To conclude:
> - asynchronous callbacks are preferable in most cases, but do not provide
> a coherent cancelled status in specific cases
> - implementing a version with synchronous callbacks (or a
> `synchronous_callback` parameter) is rather easy (however, step 2 needs to be
> clarified; there is probably a cleaner way to fix it)

[Python-ideas] Re: for ... except, with ... except

2019-07-30 Thread Jonathan Goble
On Tue, Jul 30, 2019 at 9:32 AM Rhodri James  wrote:
> I've been trying to come up
> with something more like this:
>
> with something_exceptionable() as f:
>     do_something_with(f)
> except with SomeException as e:
>     handle_exception_in_setup_or_teardown(e)
> except SomeOtherException as e:
>     handle_exception_in_body(e)
>     # Because I like this syntactic sugar too

I had to read that three times before I noticed the "with" after the "except".

Perhaps PEP 463 [1] could be resurrected on a limited basis, allowing
something like this:

with something_exceptionable() as f except SomeException ->
        handle_setup_exception():
    do_something_with(f)

and similar for the for statement. This can result in lengthy lines,
but it's going to be a bit awkward no matter what, and I feel a long
line is the simplest and most readable way to get past that
awkwardness. Plus, line continuations are a thing, which can make it
more readable.

[1] https://www.python.org/dev/peps/pep-0463/
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/CG77KVVOEQUT2MNN5STAIA3G76MFM4IZ/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: for ... except, with ... except

2019-07-30 Thread Rhodri James

On 30/07/2019 02:35, Steven D'Aprano wrote:

On Mon, Jul 29, 2019 at 03:12:21PM +0100, Rhodri James wrote:


I'm afraid I agree with Guido.  I don't imagine I will use this feature
very often, and I can see myself making that mistake every time.


Well, if you don't use this feature very often, you aren't really the
audience for the feature and your feedback should therefore be weighted
lower *wink*


I'll admit I was thinking much the same ;-)


I don't have a strong option one way or another on this feature, but I
think we should resist the trap of thinking that every feature must be
immediately and intuitively obvious to every reader.


I would actually like this feature, my problem is that it is immediately 
and obviously *misleading*.  It needs tagging somehow to stop you (well, 
me) immediately thinking "Oh, this about exceptions in the body of the 
with/for."  You suggested elsewhere giving the exception a source field, 
but that might be impossibly fine grained.  I've been trying to come up 
with something more like this:


   with something_exceptionable() as f:
       do_something_with(f)
   except with SomeException as e:
       handle_exception_in_setup_or_teardown(e)
   except SomeOtherException as e:
       handle_exception_in_body(e)
       # Because I like this syntactic sugar too

--
Rhodri James *-* Kynesim Ltd
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/KM356UXMLQYARSGP3M3EDA4DBR7QIXIE/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Cartesian Product on `__mul__`

2019-07-30 Thread Thomas Jollans
On 25/07/2019 19.41, Andrew Barnert via Python-ideas wrote:
> On Jul 25, 2019, at 09:46, Batuhan Taskaya  wrote:
>> I think it looks very fine when you type {1, 2, 3} * {"a", "b", "c"} and get 
>> set(itertools.product({1, 2, 3}, {"a", "b", "c"})). So I am proposing 
>> implementing set multiplication as Cartesian product. 
> I think it might make more sense to reopen the discussion of using @ for 
> cartesian product for all containers, for a few reasons.
>
>  * The * operator means repeat on lists, tuples, and strings. While there are 
> already some operator differences between sets and other containers, it’s 
> generally the types supporting different operators (like sets having | and &, 
> and not having +) rather than giving different meanings to the same 
> operators. Plus, tuples are frequently used for small things that are 
> conceptually sets or set-like; this is even enshrined in APIs like 
> str.endswith and isinstance.
>
>  * (Ordered) Cartesian product for lists, tuples, iterators, etc. makes just 
> as much sense even when they’re not being used as pseudo-sets, ordered sets, 
> or multisets, and might even be more common in code in practice (which is 
> presumably why there’s a function for that in the stdlib). So, why only allow 
> it for sets?
>
>  * Cartesian product feels more like matrix multiplication than like scalar 
> or elementwise multiplication or repetition, doesn’t it?

Adding @ to sets for Cartesian products *might* be reasonable, but
giving @ a meaning other than matrix multiplication for ordered
collections (lists, tuples) sounds like a terrible idea that will only
cause confusion.
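
For concreteness, a sketch of the set behaviour under discussion -- done with a
subclass here, since the actual proposal would put it on the builtin set:

from itertools import product

class XSet(set):
    def __matmul__(self, other):
        # Cartesian product: every pairing of an element from self
        # with an element from other.
        return XSet(product(self, other))

print(XSet({1, 2}) @ XSet({'a', 'b'}))
# e.g. {(1, 'a'), (1, 'b'), (2, 'a'), (2, 'b')} (set iteration order is arbitrary)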



>
>  * Adding @ to the builtin containers was already raised during the @ 
> discussion and tabled for the future. I believe this is because the NumPy 
> community wanted the proposal to be as minimal as possible so they could be 
> sure of getting it, and get it ASAP, not because anyone had or anticipated 
> objections beyond “is it common enough of a need to be worth it?”
>
> I don’t think this discussion would need to include other ideas deferred at 
> the same time, like @ for composition on functions. (Although @ on iterators 
> might be worth bringing up, if only to reject it. We don’t allow + between 
> arbitrary iterables defaulting to chain but meaning type(lhs)(chain(…)) on 
> some types, so why do the equivalent with @?)
> ___
> Python-ideas mailing list -- python-ideas@python.org
> To unsubscribe send an email to python-ideas-le...@python.org
> https://mail.python.org/mailman3/lists/python-ideas.python.org/
> Message archived at 
> https://mail.python.org/archives/list/python-ideas@python.org/message/LR5PQMLLTBTARLK4NLUYSPIWFRTWJHLA/
> Code of Conduct: http://python.org/psf/codeofconduct/

___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/VG66TRIL2XYRDDAJKAVMVSQ3CWQH5522/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: Meta question: python-ideas HOWTO (re Re: PEP's shouldn't require a sponsor)

2019-07-30 Thread Stephen J. Turnbull
I wrote a slightly longer post similar to Guido's, and I'll not bore
you with that.  But let me emphasize what I think are the two
important points.

(1) Put it in the devguide and link from the -ideas and core-mentorship
list blurbs on mail.python.org.

(2) Just go FULL NIKE: do it, one draft, take advice on that, and
submit a PR to the devguide.  If people have improvements, they can do
the same.  This doesn't need bikeshedding on the wording.

Steve
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/W6JBRKH66IYHRLTJYTMZSH7TQ6APZTYV/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] asyncio: futures and tasks with synchronous callbacks

2019-07-30 Thread aurelien . lambert . 89
In asyncio, when a task awaits another task (or future), it can be 
cancelled right after the awaited task finishes (before the callbacks have been 
processed). Thus, if the awaited task has consumed data, the data is lost.

For instance, with the following code:

import asyncio

available_data = []
data_ready = asyncio.Future()

def feed_data(data):
    global data_ready
    available_data.append(data)
    data_ready.set_result(None)
    data_ready = asyncio.Future()

async def consume_data():
    while not available_data:
        await asyncio.shield(data_ready)
    return available_data.pop()

async def wrapped_consumer():
    task = asyncio.ensure_future(consume_data())
    return await task

If I perform those exact steps:

async def test():
    task = asyncio.ensure_future(wrapped_consumer())
    await asyncio.sleep(0)
    feed_data('data')
    await asyncio.sleep(0)
    task.cancel()
    await asyncio.sleep(0)
    print('task', task)
    print('available_data', available_data)

loop = asyncio.get_event_loop()
loop.run_until_complete(test())

Then I can see that the task has been cancelled despite the data being 
consumed. Since the result of `wrapped_consumer` cannot be retrieved, the data 
is forever lost.

task :17>>
available_data []

This side effect does not happen when awaiting a coroutine, but coroutines are 
not as flexible as tasks. It happens when awaiting a `Future`, a `Task`, or any 
function like `asyncio.wait`, `asyncio.wait_for` or `asyncio.gather` (which all 
inherit from or use `Future`). There is then no way to do anything equivalent 
to:

stop_future = asyncio.Future()
async def wrapped_consumer2():
    task = asyncio.ensure_future(consume_data())
    try:
        await asyncio.wait([task, stop_future])
    finally:
        task.cancel()
        await asyncio.wait([task])
        if not task.cancelled():
            return task.result()
        else:
            raise RuntimeError('stopped')

This is due to the Future calling the callback asynchronously:
https://github.com/python/cpython/blob/3.6/Lib/asyncio/futures.py#L214

for callback in callbacks:
    self._loop.call_soon(callback, self)

I propose to create synchronous versions of those, or a `synchronous_callback` 
parameter, that makes the callbacks of `Future` synchronous. I've experimented 
with a simple library `syncio` on CPython 3.6 to do this (it is harder to patch 
later versions due to the massive use of private methods).
Basically, one needs to:
1) replace the `Future._schedule_callbacks` method with a synchronous version
2) fix `Task._step` so it does not fail when cleaning `_current_tasks` 
(https://github.com/python/cpython/blob/3.6/Lib/asyncio/tasks.py#L245)
3) rewrite all the functions to use synchronous futures instead of normal ones

With that library, the previous functions become possible and intuitive:

import syncio

async def wrapped_consumer():
    task = syncio.ensure_sync_future(consume_data())
    return await task

stop_future = asyncio.Future()
async def wrapped_consumer2():
    task = syncio.ensure_sync_future(consume_data())
    try:
        await syncio.sync_wait([task, stop_future])
    finally:
        task.cancel()
        await asyncio.wait([task])
        if not task.cancelled():
            return task.result()
        else:
            raise RuntimeError('stopped')

No need to use `syncio` anywhere else in the code, which makes it totally 
transparent for the end user.

This "library" can be found here: 
https://github.com/aure-olli/aiokafka/blob/216b4ad3b4115bc9fa463e44fe23636bd63c5da4/syncio.py
It implements `SyncFuture`, `SyncTask`, `ensure_sync_future`, `sync_wait`, 
`sync_wait_for`, `sync_gather` and `sync_shield`. It works with CPython 3.6 
only.

To conclude:
- asynchronous callbacks are preferable in most cases, but do not provide a 
coherent cancelled status in specific cases
- implementing a version with synchronous callbacks (or a `synchronous_callback` 
parameter) is rather easy (however, step 2 needs to be clarified; there is 
probably a cleaner way to fix it)
- it is totally transparent for the end user, as synchronous callbacks are 
totally compatible with asynchronous ones
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/E77XF3K4SRYFG3F6QLFBPP5YOB726DNS/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] Re: for ... except, with ... except

2019-07-30 Thread Paul Moore
On Tue, 30 Jul 2019 at 04:00, Ethan Furman  wrote:
> If this "with...except" leads to more robust code then I think many would 
> like to use it, but not if it's confusing as that would lead to less robust 
> code.

For me, "it's for robust code" is sufficient hint that I now remember
what it does (much the same way that "it's for search loops" is the
hint I need for for...else). And when I'm trying to write robust code,
I *do* worry about exceptions in the with expression, and the fact
that I can't easily catch them. So for me, this is definitely a real
(albeit one I can usually ignore) problem.

The thing is that for most of the code I write, it's not *that*
important to be highly robust. For command line interactive utilities,
people can just fix unexpected errors and rerun the command. So
putting a lot of effort into robustness is not that important. But
then someone scripts your utility, and suddenly tracebacks are a much
bigger issue. That's why the with statement is so useful - it makes
robust exception handling low-cost to write, so my code moves a step
closer to robust-by-default. This proposal takes that a step further,
by lowering the cost of a tricky bit of boilerplate.

So I don't think it's essential, but I *do* think it would be a useful
addition to the language.

On the other hand, for...except has much less appeal, as I don't think
of for loops as blocks of code where I'd expect to be able to control
what exceptions escape (whereas in my mind that's a primary feature of
the with statement).

Paul
___
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at 
https://mail.python.org/archives/list/python-ideas@python.org/message/JY7RP664HXCXIEPI4LREB767TYAAG5QA/
Code of Conduct: http://python.org/psf/codeofconduct/