Re: [Python-ideas] PEP 561: Distributing Type Information V3

2017-10-10 Thread Ivan Levkivskyi
Thanks Ethan!

The PEP draft now looks good to me. I think it makes sense to make
a PoC implementation of the PEP at this point to see if everything
works smoothly in practice.

(You could also link a few examples with your PoC implementation in the PEP)

--
Ivan



On 6 October 2017 at 22:00, Ethan Smith  wrote:

> Hello,
>
> I have made some changes to my PEP on distributing type information. A
> summary of the changes:
>
>- Move to adding a new metadata specifier so that more packaging tools
>can participate
>- Clarify version matching between third party stubs and runtime
>packages.
>- various other fixes for clarity, readability, and removal of
>repetition
>
> As usual I have replicated a copy below.
>
> Cheers,
> Ethan
>
>
> PEP: 561
> Title: Distributing and Packaging Type Information
> Author: Ethan Smith 
> Status: Draft
> Type: Standards Track
> Content-Type: text/x-rst
> Created: 09-Sep-2017
> Python-Version: 3.7
> Post-History:
>
>
> Abstract
> ========
>
> PEP 484 introduced type hinting to Python, with goals of making typing
> gradual and easy to adopt. Currently, typing information must be distributed
> manually. This PEP provides a standardized means to package and distribute
> type information and an ordering for type checkers to resolve modules and
> collect this information for type checking using existing packaging
> architecture.
>
>
> Rationale
> =========
>
> Currently, package authors wish to distribute code that has
> inline type information. However, there is no standard method to distribute
> packages with inline type annotations or syntax that can simultaneously
> be used at runtime and in type checking. Additionally, if one wished to
> ship typing information privately the only method would be via setting
> ``MYPYPATH`` or the equivalent to manually point to stubs. If the package
> can be released publicly, it can be added to typeshed [1]_. However, this
> does not scale and becomes a burden on the maintainers of typeshed.
> Additionally, it ties bugfixes to releases of the tool using typeshed.
>
> PEP 484 has a brief section on distributing typing information. In this
> section [2]_ the PEP recommends using ``shared/typehints/pythonX.Y/`` for
> shipping stub files. However, manually adding a path to stub files for each
> third party library does not scale. The simplest approach people have taken
> is to add ``site-packages`` to their ``MYPYPATH``, but this causes type
> checkers to fail on packages that are highly dynamic (e.g. sqlalchemy
> and Django).
>
>
> Specification
> =============
>
> There are several motivations and methods of supporting typing in a package.
> This PEP recognizes three types of packages that may be created:
>
> 1. The package maintainer would like to add type information inline.
>
> 2. The package maintainer would like to add type information via stubs.
>
> 3. A third party would like to share stub files for a package, but the
>maintainer does not want to include them in the source of the package.
>
> This PEP aims to support these scenarios and make them simple to add to
> packaging and deployment.
>
> The two major parts of this specification are the packaging specifications
> and the resolution order for resolving module type information. The packaging
> spec is based on and extends PEP 345 metadata. The type checking spec is
> meant to replace the ``shared/typehints/pythonX.Y/`` spec of PEP 484 [2]_.
>
> New third party stub libraries are encouraged to distribute stubs via the
> third party packaging proposed in this PEP in place of being added to
> typeshed. Typeshed will remain in use, but if maintainers are found, third
> party stubs in typeshed are encouraged to be split into their own package.
>
> Packaging Type Information
> --------------------------
> In order to make packaging and distributing type information as simple and
> easy as possible, the distribution of type information and typed Python code
> is done through existing packaging frameworks. This PEP adds a new item to the
> ``*.dist-info/METADATA`` file to contain metadata about a package's support for
> typing. The new item is optional, but must have a name of ``Typed`` and have a
> value of either ``inline`` or ``stubs``, if present.
>
> Metadata Examples::
>
> Typed: inline
> Typed: stubs
>
>
> Stub Only Packages
> ''''''''''''''''''
>
> For package maintainers wishing to ship stub files containing all of their
> type information, it is preferred that the ``*.pyi`` stubs are alongside the
> corresponding ``*.py`` files. However, the stubs may be put in a sub-folder
> of the Python sources, with the same name as the package itself. For
> example, the ``flyingcircus`` package would have its stubs in the folder
> ``flyingcircus/flyingcircus/``. This path is chosen so that if stubs are
> not found in ``flyingcircus/`` the type checker may treat the subdirectory as
> a normal package. The normal resolution order of checking ``*.pyi`` before

Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Nick Coghlan
On 10 October 2017 at 01:24, Guido van Rossum  wrote:

> On Sun, Oct 8, 2017 at 11:46 PM, Nick Coghlan  wrote:
>
>> On 8 October 2017 at 08:40, Koos Zevenhoven  wrote:
>>
>>> I do remember Yury mentioning that the first draft of PEP 550 captured
>>> something when the generator function was called. I think I started reading
>>> the discussions after that had already been removed, so I don't know
>>> exactly what it was. But I doubt that it was *exactly* the above, because
>>> PEP 550 uses set and get operations instead of "assignment contexts" like
>>> PEP 555 (this one) does.
>>>
>>
>> We didn't forget it, we just don't think it's very useful.
>>
>
> I'm not sure I agree on the usefulness. Certainly a lot of the complexity
> of PEP 550 exists just to cater to Nathaniel's desire to influence what a
> generator sees via the context of the send()/next() call. I'm still not
> sure that's worth it. In 550 v1 there's no need for chained lookups.
>

The compatibility concern is that we want developers of existing libraries
to be able to transparently switch from using thread local storage to
context local storage, and the way thread locals interact with generators
means that decimal (et al) currently use the thread local state at the time
when next() is called, *not* when the generator is created.

I like Yury's example for this, which is that the following two examples
are currently semantically equivalent, and we want to preserve that
equivalence:

with decimal.localcontext() as ctx:
    ctx.prec = 30
    for i in gen():
        pass

g = gen()
with decimal.localcontext() as ctx:
    ctx.prec = 30
    for i in g:
        pass

The easiest way to maintain that equivalence is to say that even though
preventing state changes leaking *out* of generators is considered a
desirable change, we see preventing them leaking *in* as a gratuitous
backwards compatibility break.

This does mean that *neither* form is semantically equivalent to eager
extraction of the generator values before the decimal context is changed,
but that's the status quo, and we don't have a compelling justification for
changing it.
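The equivalence (and the fact that the decimal context is consulted at next() time, not at creation time) can be checked with a small example:

```python
import decimal

def gen():
    # Each step reads whatever decimal context is active when next() is
    # called, not the context active when the generator was created.
    while True:
        yield decimal.Decimal(1) / decimal.Decimal(7)

g = gen()
first = next(g)             # computed under the default 28-digit context
with decimal.localcontext() as ctx:
    ctx.prec = 30
    second = next(g)        # computed under the 30-digit context

assert first != second      # same expression, different results
```

Moving the `g = gen()` line before or inside the `with` block makes no difference to the results, which is exactly the equivalence described above.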

If folks subsequently decide that they *do* want "capture on creation" or
"capture on first iteration" semantics for their generators, those are easy
enough to add as wrappers on top of the initial thread-local-compatible
base by using the same building blocks as are being added to help event
loops manage context snapshots for coroutine execution.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Koos Zevenhoven
On Tue, Oct 10, 2017 at 4:22 AM, Yury Selivanov 
wrote:

> On Mon, Oct 9, 2017 at 8:37 PM, Koos Zevenhoven  wrote:
> > You can cause unbound growth in PEP 550 too. All you have to do is nest
> an
> > unbounded number of generators.
>
> You can only nest up to 'sys.get_recursion_limit()' number of generators.
>
> With PEP 555 you can do:
>
>   while True:
> context_var.assign(42).__enter__()
>
>
Well, in PEP 550, you can explicitly stack an unbounded number of
LogicalContexts in a while True loop. Or you can run out of memory using
plain lists even faster:

l = [42]

while True:
l *= 2 # ensure exponential blow-up

I don't see why your example with context_var.assign(42).__enter__() would
be any more likely.

Sure, we could limit the number of allowed nested contexts in PEP 555. I
don't really care. Just don't enter an unbounded number of context managers
without exiting them.

Really, it was my mistake to ever make you think that
context_var.assign(42).__enter__() can be compared to .set(42) in PEP 550.
I'll say it once more: PEP 555 context arguments have no equivalent of the
PEP-550 .set(..).


> > In PEP 555, nesting generators doesn't do
> > anything really, unless you actually assign to context arguments in the
> > generators. Only those who use it will pay.
>
> Same for 550.  If a generator doesn't set context variables, its LC
> will be an empty mapping (or NULL if you want to micro-optimize
> things).  Nodes for the chain will come from a freelist. The effective
> overhead for generators is a couple operations on pointers, and thus
> visible only in microbenchmarks.
>

Sure, you can implement push and pop and maintain a freelist by just doing
operations on pointers. But that would be a handful of operations. Maybe
you'd even manage to avoid INCREFs and DECREFs by not exposing things as
Python objects.

But I guarantee you, PEP 555 is simpler in this regard. In (pseudo?) C, the
per-generator and per-send overhead would come from something like:

/* On generator creation */

stack = PyThreadState_Get()->carg_stack;
Py_INCREF(stack);
self->carg_stack = stack;

--

/* On each next / send */

stack_ptr = &PyThreadState_Get()->carg_stack;
if (*stack_ptr == self->carg_stack) {
    /* no assignments made => do nothing */
}

/* ... then after next yield */

if (*stack_ptr == self->carg_stack) {
    /* once more, do nothing */
}



And there will of course be a Py_DECREF after the generator has finished or
when it is deallocated.

If the generators *do* use context argument assignments, then some stuff
would happen in the else clauses of the if statements above. (Or actually,
using != instead of ==).
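The same bookkeeping can be modelled in pure Python (illustrative only, not CPython internals; the wrapper class and helper names here are made up):

```python
import threading

# Thread-local holder for the "assignment stack"; an immutable tuple
# plays the role of the carg_stack pointer in the C sketch above.
_state = threading.local()

def _current_stack():
    return getattr(_state, "carg_stack", ())

class CArgGenerator:
    """Wrap an iterator and mimic the per-send fast path described above."""

    def __init__(self, it):
        self.it = iter(it)
        self.carg_stack = _current_stack()  # snapshot on generator creation

    def __iter__(self):
        return self

    def __next__(self):
        if _current_stack() is self.carg_stack:
            pass  # no assignments made since creation => do nothing
        value = next(self.it)  # run the wrapped iterator to its next value
        if _current_stack() is self.carg_stack:
            pass  # once more, do nothing after the yield
        return value
```

For an iterator that never touches context arguments, both identity checks take the fast path, so the per-step overhead is just the two pointer comparisons, e.g. `list(CArgGenerator(range(3)))` behaves exactly like `list(range(3))`.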


> But seriously, you will always end up in a weird situation if you call an
> > unbounded number of contextmanager.__enter__() methods without calling
> > __exit__(). Nothing new about that. But entering a handful of assignment
> > contexts and leaving them open until a script ends is not the end of the
> > world. I don't think anyone should do that though.
> >
> >
> >>
> >> You'll say that it's not how the API is supposed to be used,
> >> and we say that we want to convert things like decimal and numpy to
> >> use the new mechanism.  That question was also hand-waved by you:
> >> numpy and decimal will have to come up with new/better APIs to use PEP
> >> 555.  Well, that's just not good enough.
> >
> >
> > What part of my explanation of this are you unhappy with? For instance,
> the
> > 12th (I think) email in this thread, which is my response to Nathaniel.
> > Could you reply to that and tell us your concern?
>
> I'm sorry, I'm not going to find some 12th email in some thread.  I
> stated in this thread the following: not being able to use PEP 555 to
> fix *existing* decimal & numpy APIs is not good enough.  And decimal &
> numpy is only one example, there's tons of code out there that can
> benefit from having their APIs fixed to support async code in Python
> 3.7.
>
>
Well, anyone interested can read that 12th email in this thread. In short,
my recommendation for libraries would be as follows:

* If the library does not provide a context manager yet, ​they should add
one, using PEP 555. That will then work nicely in coroutines and generators.

* If the library does have a context manager, implement it using PEP 555.
Or to be safe, add a new API function, so behavior in existing async code
won't change.

* If the library needs to support some kind of set_state(..) operation,
implement it by getting the state using a PEP 555 context argument and
mutating its contents.

* Fall back to thread-local storage if no context argument is present or
if the Python version does not support context arguments.
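A sketch of what those recommendations could look like inside a library. This is heavily hedged: PEP 555 is a draft, so the `Var` import and its `.value` accessor are assumptions based on the proposal, and on interpreters without them the code simply takes the thread-local fallback path:

```python
import threading

_fallback = threading.local()

try:
    # Hypothetical PEP 555-style context argument; the module, class
    # name, and .value accessor are assumptions based on the draft.
    from contextvars import Var
    _state_var = Var("mylib_state")
except ImportError:
    _state_var = None

def get_state():
    # Prefer the context argument when one is available and set...
    if _state_var is not None and _state_var.value is not None:
        return _state_var.value
    # ...otherwise fall back to thread-local storage, per the last bullet.
    if not hasattr(_fallback, "state"):
        _fallback.state = {"prec": 28}  # illustrative default state
    return _fallback.state

def set_state(updates):
    # A set_state(..)-style operation: mutate the contents of whatever
    # state mapping is currently visible, as the third bullet suggests.
    get_state().update(updates)
```

The key point of the third bullet is that `set_state` never rebinds the context argument; it only mutates the mapping the argument (or the fallback) already refers to.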


[...]


> >> > Some kind of
> >> > chained-lookup-like thing is inevitable if you want the state not to
> >> > leak
> >> > though yields out of the generator:
> >>
> >> No, it's not "inevitable".  In PEP 550 v1, generators captured the
> >> context when they are created and there was alwa

Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Nick Coghlan
On 10 October 2017 at 22:34, Koos Zevenhoven  wrote:

> Really, it was my mistake to ever make you think that
> context_var.assign(42).__enter__() can be compared to .set(42) in PEP
> 550. I'll say it once more: PEP 555 context arguments have no equivalent of
> the PEP-550 .set(..).
>

Then your alternate PEP can't work, since it won't be useful to extension
modules.

Context managers are merely syntactic sugar for try/finally statements, so
you can't wave your hands and say a context manager is the only supported
API: you *have* to break the semantics down and explain what the
try/finally equivalent looks like.
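For illustration, here is what that desugaring looks like, using a tiny context manager that records its protocol calls (simplified: the real expansion also passes live exception details to `__exit__`):

```python
class Tracker:
    # Minimal context manager that records its protocol calls.
    def __init__(self):
        self.events = []

    def __enter__(self):
        self.events.append("enter")
        return self

    def __exit__(self, exc_type, exc, tb):
        self.events.append("exit")
        return False  # do not swallow exceptions

# The with-statement form...
cm = Tracker()
with cm:
    cm.events.append("body")

# ...and its try/finally equivalent (exception plumbing elided).
cm2 = Tracker()
cm2.__enter__()
try:
    cm2.events.append("body")
finally:
    cm2.__exit__(None, None, None)

assert cm.events == cm2.events == ["enter", "body", "exit"]
```

Whatever semantics a proposal gives the `with` form must therefore also be expressible in the explicit `__enter__`/`__exit__` form.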

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Koos Zevenhoven
On Tue, Oct 10, 2017 at 3:34 PM, Nick Coghlan  wrote:

> On 10 October 2017 at 01:24, Guido van Rossum  wrote:
>
>> On Sun, Oct 8, 2017 at 11:46 PM, Nick Coghlan  wrote:
>>
>>> On 8 October 2017 at 08:40, Koos Zevenhoven  wrote:
>>>
 I do remember Yury mentioning that the first draft of PEP 550
 captured something when the generator function was called. I think I
 started reading the discussions after that had already been removed, so I
 don't know exactly what it was. But I doubt that it was *exactly* the
 above, because PEP 550 uses set and get operations instead of "assignment
 contexts" like PEP 555 (this one) does.

>>>
>>> We didn't forget it, we just don't think it's very useful.
>>>
>>
>> I'm not sure I agree on the usefulness. Certainly a lot of the complexity
>> of PEP 550 exists just to cater to Nathaniel's desire to influence what a
>> generator sees via the context of the send()/next() call. I'm still not
>> sure that's worth it. In 550 v1 there's no need for chained lookups.
>>
>
> The compatibility concern is that we want developers of existing libraries
> to be able to transparently switch from using thread local storage to
> context local storage, and the way thread locals interact with generators
> means that decimal (et al) currently use the thread local state at the time
> when next() is called, *not* when the generator is created.
>

If you want to keep those semantics in decimal, then you're already done.



> I like Yury's example for this, which is that the following two examples
> are currently semantically equivalent, and we want to preserve that
> equivalence:
>
> with decimal.localcontext() as ctx:
>     ctx.prec = 30
>     for i in gen():
>         pass
>
> g = gen()
> with decimal.localcontext() as ctx:
>     ctx.prec = 30
>     for i in g:
>         pass
>
>
>
Generator functions aren't usually called `gen`.

Change that to:

with decimal.localcontext() as ctx:
    ctx.prec = 30
    for val in values():
        do_stuff_with(val)

# and

vals = values()

with decimal.localcontext() as ctx:
    ctx.prec = 30
    for val in vals:
        do_stuff_with(val)

I see no reason why these two should be equivalent.


––Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Nick Coghlan
On 10 October 2017 at 22:51, Koos Zevenhoven  wrote:

> I see no reason why these two should be equivalent.
>

There is no "should" about it: it's a brute fact that the two forms *are*
currently equivalent for lazy iterators (including generators), and both
different from the form that uses eager evaluation of the values before the
context change.

Where should enters into the picture is by way of PEP 550 saying that they
should *remain* equivalent because we don't have an adequately compelling
justification for changing the runtime semantics.

That is, given the following code:

itr = make_iter()
with decimal.localcontext() as ctx:
    ctx.prec = 30
    for i in itr:
        pass

Right now, today, in 3.6. the calculations in the iterator will use the
modified decimal context, *not* the context that applied when the iterator
was created. If you want to ensure that isn't the case, you have to force
eager evaluation before the context change.

What PEP 550 is proposing is that, by default, *nothing changes*: the lazy
iteration in the above will continue to use the updated decimal context by
default.

However, people *will* gain a new option for avoiding that: instead of
forcing eager evaluation, they'll be able to capture the creation context
instead, and switching back to that each time the iterator needs to
calculate a new value.
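Such a "capture on creation" wrapper can already be sketched with today's thread-local decimal machinery (illustrative only; the decorator name is made up, and neither PEP defines it):

```python
import decimal

def capture_decimal_context(gen_func):
    # Snapshot the decimal context when the generator is created, and
    # switch back to that snapshot around each step.
    def wrapper(*args, **kwargs):
        snapshot = decimal.getcontext().copy()
        inner = gen_func(*args, **kwargs)
        def stepper():
            while True:
                with decimal.localcontext(snapshot):
                    try:
                        value = next(inner)
                    except StopIteration:
                        return
                yield value  # yield outside the with, so nothing leaks out
        return stepper()
    return wrapper

@capture_decimal_context
def sevenths():
    while True:
        yield decimal.Decimal(1) / decimal.Decimal(7)

g = sevenths()                  # captures the default 28-digit context
with decimal.localcontext() as ctx:
    ctx.prec = 5
    value = next(g)             # still computed with the captured context

assert len(str(value)) > 10     # 28 significant digits, not 5
```

The building blocks PEP 550 adds for event loops would let the same pattern work for arbitrary context variables rather than just the decimal context.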

If PEP 555 proposes that we should instead make lazy iteration match eager
evaluation semantics by *default*, then that's going to be a much harder
case to make because it's a gratuitous compatibility break - code that
currently works one way will suddenly start doing something different, and
end users will have difficulty getting it to behave the same way on 3.7 as
it does on earlier versions.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Koos Zevenhoven
On Tue, Oct 10, 2017 at 5:01 PM, Nick Coghlan  wrote:

> On 10 October 2017 at 22:51, Koos Zevenhoven  wrote:
>
>> I see no reason why these two should be equivalent.
>>
>
> There is no "should" about it: it's a brute fact that the two forms *are*
> currently equivalent for lazy iterators (including generators), and both
> different from the form that uses eager evaluation of the values before the
> context change.
>
> Where should enters into the picture is by way of PEP 550 saying that they
> should *remain* equivalent because we don't have an adequately compelling
> justification for changing the runtime semantics.
>
> That is, given the following code:
>
> itr = make_iter()
> with decimal.localcontext() as ctx:
>     ctx.prec = 30
>     for i in itr:
>         pass
>
> Right now, today, in 3.6. the calculations in the iterator will use the
> modified decimal context, *not* the context that applied when the iterator
> was created. If you want to ensure that isn't the case, you have to force
> eager evaluation before the context change.
>
> What PEP 550 is proposing is that, by default, *nothing changes*: the lazy
> iteration in the above will continue to use the updated decimal context by
> default.
>

That's just an arbitrary example. There are many things that *would*
change if decimal contexts simply switched from using thread-local storage
to using PEP 550. It's not at all obvious which of the changes would be
most likely to cause problems. If I were to choose, I would probably
introduce a new context manager which works with PEP 555 semantics, because
that's the only way to ensure full backwards compatibility, regardless of
whether PEP 555 or PEP 550 is used. But I'm sure one could decide otherwise.


––Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Yury Selivanov
On Tue, Oct 10, 2017 at 8:34 AM, Koos Zevenhoven  wrote:
> On Tue, Oct 10, 2017 at 4:22 AM, Yury Selivanov 
> wrote:
>>
>> On Mon, Oct 9, 2017 at 8:37 PM, Koos Zevenhoven  wrote:
>> > You can cause unbound growth in PEP 550 too. All you have to do is nest
>> > an
>> > unbounded number of generators.
>>
>> You can only nest up to 'sys.get_recursion_limit()' number of generators.
>>
>> With PEP 555 you can do:
>>
>>   while True:
>> context_var.assign(42).__enter__()
>>
>
> Well, in PEP 550, you can explicitly stack an unbounded number of
> LogicalContexts in a while True loop.

No, you can't. PEP 550 doesn't have APIs to "stack ... LogicalContexts".

> Or you can run out of memory using
> plain lists even faster:
>
> l = [42]
>
> while True:
> l *= 2 # ensure exponential blow-up
>
> I don't see why your example with context_var.assign(42).__enter__() would
> be any more likely.

Of course you can write broken code. The point is that contexts work
like scopes/mappings, and it's counter-intuitive that setting a
variable with 'cv.assign(..).__enter__()' will break the world.  If a
naive user tries to convert their existing decimal-like API to use
your PEP, everything would work initially, but then blow up in
production.

[..]
> Really, it was my mistake to ever make you think that
> context_var.assign(42).__enter__() can be compared to .set(42) in PEP 550.
> I'll say it once more: PEP 555 context arguments have no equivalent of the
> PEP-550 .set(..).

Any API exposing a context manager should have an alternative
try..finally API.  In your case it's
'context_var.assign(42).__enter__()'.  'With' statements are sugar in
Python.  It's unprecedented to design API solely around them.

>
>>
>> > In PEP 555, nesting generators doesn't do
>> > anything really, unless you actually assign to context arguments in the
>> > generators. Only those who use it will pay.
>>
>> Same for 550.  If a generator doesn't set context variables, its LC
>> will be an empty mapping (or NULL if you want to micro-optimize
>> things).  Nodes for the chain will come from a freelist. The effective
>> overhead for generators is a couple operations on pointers, and thus
>> visible only in microbenchmarks.
>
>
> Sure, you can implement push and pop and maintain a freelist by just doing
> operations on pointers. But that would be a handful of operations. Maybe
> you'd even manage to avoid INCREFs and DECREFs by not exposing things as
> Python objects.
>
> But I guarantee you, PEP 555 is simpler in this regard.
[..]

I wrote several implementations of PEP 550 so far. No matter what you
put in genobject.send(): one pointer op or two, the results are the
same: in microbenchmarks generators become 1-2% slower. In
macrobenchmarks of generators you can't observe any slowdown.  And if
we want the fastest possible context implementation, we can chose PEP
550 v1, which is the simplest solution.  In any case, the performance
argument is invalid, please stop using it.

>> > But seriously, you will always end up in a weird situation if you call
>> > an
>> > unbounded number of contextmanager.__enter__() methods without calling
>> > __exit__(). Nothing new about that. But entering a handful of assignment
>> > contexts and leaving them open until a script ends is not the end of the
>> > world. I don't think anyone should do that though.
>> >
>> >
>> >>
>> >> You'll say that it's not how the API is supposed to be used,
>> >> and we say that we want to convert things like decimal and numpy to
>> >> use the new mechanism.  That question was also hand-waved by you:
>> >> numpy and decimal will have to come up with new/better APIs to use PEP
>> >> 555.  Well, that's just not good enough.
>> >
>> >
>> > What part of my explanation of this are you unhappy with? For instance,
>> > the
>> > 12th (I think) email in this thread, which is my response to Nathaniel.
>> > Could you reply to that and tell us your concern?
>>
>> I'm sorry, I'm not going to find some 12th email in some thread.  I
>> stated in this thread the following: not being able to use PEP 555 to
>> fix *existing* decimal & numpy APIs is not good enough.  And decimal &
>> numpy is only one example, there's tons of code out there that can
> benefit from having their APIs fixed to support async code in Python
>> 3.7.
>>
>
> Well, anyone interested can read that 12th email in this thread. In short,
> my recommendation for libraries would be as follows:
>
> * If the library does not provide a context manager yet, they should add
> one, using PEP 555. That will then work nicely in coroutines and generators.
>
> * If the library does have a context manager, implement it using PEP 555. Or
> to be safe, add a new API function, so behavior in existing async code won't
> change.
>
> * If the library needs to support some kind of set_state(..) operation,
> implement it by getting the state using a PEP 555 context argument and
> mutating its contents.
>
> * Fall back to thread-local storage if no context arg

Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Yury Selivanov
On Tue, Oct 10, 2017 at 10:22 AM, Koos Zevenhoven  wrote:
> On Tue, Oct 10, 2017 at 5:01 PM, Nick Coghlan  wrote:
>>
>> On 10 October 2017 at 22:51, Koos Zevenhoven  wrote:
>>>
>>> I see no reason why these two should be equivalent.
>>
>>
>> There is no "should" about it: it's a brute fact that the two forms *are*
>> currently equivalent for lazy iterators (including generators), and both
>> different from the form that uses eager evaluation of the values before the
>> context change.
>>
>> Where should enters into the picture is by way of PEP 550 saying that they
>> should *remain* equivalent because we don't have an adequately compelling
>> justification for changing the runtime semantics.
>>
>> That is, given the following code:
>>
>> itr = make_iter()
>> with decimal.localcontext() as ctx:
>>     ctx.prec = 30
>>     for i in itr:
>>         pass
>>
>> Right now, today, in 3.6. the calculations in the iterator will use the
>> modified decimal context, *not* the context that applied when the iterator
>> was created. If you want to ensure that isn't the case, you have to force
>> eager evaluation before the context change.
>>
>> What PEP 550 is proposing is that, by default, *nothing changes*: the lazy
>> iteration in the above will continue to use the updated decimal context by
>> default.
>
>
> That's just an arbitrary example. There are many things that *would* change
> if decimal contexts simply switched from using thread-local storage to using
> PEP 550. It's not at all obvious which of the changes would be most likely
> to cause problems. If I were to choose, I would probably introduce a new
> context manager which works with PEP 555 semantics, because that's the only
> way to ensure full backwards compatibility, regardless of whether PEP 555 or
> PEP 550 is used. But I'm sure one could decide otherwise.

Please stop using "many things .. would", "most likely" etc.  We have
a very focused discussion here.  If you know of any particular issue,
please demonstrate it with a realistic example.  Otherwise, we only
increase the number of emails and make things harder to track for
everybody.

If decimal switches to use PEP 550, there will be no "many things that
*would* change".  The only thing that will change is this:

  def g():
      with decimal_context(...):
          yield

  next(g())   # this will no longer leak decimal context to the outer world

I consider the above a bug fix, because nobody in their right mind
relies on partial iteration of a generator expecting that some of its
internal code would affect your code indirectly.  The only such case
is contextlib.contextmanager, and PEP 550 provides mechanisms to make
generators "leaky" explicitly.
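The leak being fixed here is easy to reproduce with today's thread-local decimal context. This is a minimal sketch of the current behaviour, using the stdlib API directly rather than the hypothetical `decimal_context` helper:

```python
import decimal

def g():
    # Change the thread-local decimal context inside the generator and
    # only restore it when the generator finishes or is closed.
    saved = decimal.getcontext().copy()
    ctx = decimal.getcontext().copy()
    ctx.prec = 5
    decimal.setcontext(ctx)
    try:
        yield
    finally:
        decimal.setcontext(saved)

old_prec = decimal.getcontext().prec
gen = g()
next(gen)  # partial iteration: the finally block has not run yet
assert decimal.getcontext().prec == 5   # the change leaked to the caller
gen.close()                             # closing restores the old context
assert decimal.getcontext().prec == old_prec
```

Under PEP 550's semantics the assignment inside the generator would stay in the generator's own logical context, so the first assertion would fail; that is the bug-fix behaviour described above.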

Yury


Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Koos Zevenhoven
On Tue, Oct 10, 2017 at 5:40 PM, Yury Selivanov 
wrote:

> On Tue, Oct 10, 2017 at 8:34 AM, Koos Zevenhoven 
> wrote:
> > On Tue, Oct 10, 2017 at 4:22 AM, Yury Selivanov  >
> > wrote:
> >>
> >> On Mon, Oct 9, 2017 at 8:37 PM, Koos Zevenhoven 
> wrote:
> >> > You can cause unbound growth in PEP 550 too. All you have to do is
> nest
> >> > an
> >> > unbounded number of generators.
> >>
> >> You can only nest up to 'sys.get_recursion_limit()' number of
> generators.
> >>
> >> With PEP 555 you can do:
> >>
> >>   while True:
> >> context_var.assign(42).__enter__()
> >>
> >
> > Well, in PEP 550, you can explicitly stack an unbounded number of
> > LogicalContexts in a while True loop.
>
> No, you can't. PEP 550 doesn't have APIs to "stack ... LogicalContexts".
>
>
​​
That's ridiculous. Quoting PEP 550: "The
contextvars.run_with_logical_context(lc: LogicalContext, func, *args,
**kwargs) function, which runs func with the provided logical context on
top of the current execution context."


> > Or you can run out of memory using
> > plain lists even faster:
> >
> > l = [42]
> >
> > while True:
> > l *= 2 # ensure exponential blow-up
> >
> > I don't see why your example with context_var.assign(42).__enter__()
> would
> > be any more likely.
>
> Of course you can write broken code. The point is that contexts work
> like scopes/mappings, and it's counter-intuitive that setting a
> variable with 'cv.assign(..).__enter__()' will break the world.  If a
> naive user tries to convert their existing decimal-like API to use
> your PEP, everything would work initially, but then blow up in
> production.
>
>
The docs will tell them what to do. You can pass a context argument down
the call chain. You don't "set" context arguments! That's why I'm changing
to "context argument", and I've said this many times now.



> [..]
> > Really, it was my mistake to ever make you think that
> > context_var.assign(42).__enter__() can be compared to .set(42) in PEP
> 550.
> > I'll say it once more: PEP 555 context arguments have no equivalent of
> the
> > PEP-550 .set(..).
>
> Any API exposing a context manager should have an alternative
> try..finally API.  In your case it's
> 'context_var.assign(42).__enter__()'.  'With' statements are sugar in
> Python.  It's unprecedented to design API solely around them.
>
> ​[..]
>

Yury writes:


>> >> That question was also hand-waved by you:
> >> >> numpy and decimal will have to come up with new/better APIs to use
> PEP
> >> >> 555.  Well, that's just not good enough.
> >> >
> >> >
>

Koos writes:


>> > What part of my explanation of this are you unhappy with? For instance,
> >> > the
> >> > 12th (I think) email in this thread, which is my response to
> Nathaniel.
> >> > Could you reply to that and tell us your concern?
> >>
> >> I'm sorry, I'm not going to find some 12th email in some thread.  I
> >> stated in this thread the following: not being able to use PEP 555 to
> >> fix *existing* decimal & numpy APIs is not good enough.  And decimal &
> >> numpy is only one example, there's tons of code out there that can
> >> benefit from its APIs to be fixed to support for async code in Python
> >> 3.7.
> >>
> >
> > Well, anyone interested can read that 12th email in this thread. In
> > short, my recommendation for libraries would be as follows:
> >
> > * If the library does not provide a context manager yet, they should add
> > one, using PEP 555. That will then work nicely in coroutines and
> > generators.
> >
> > * If the library does have a context manager, implement it using PEP 555.
> > Or to be safe, add a new API function, so behavior in existing async code
> > won't change.
> >
> > * If the library needs to support some kind of set_state(..) operation,
> > implement it by getting the state using a PEP 555 context argument and
> > mutating its contents.
> >
> > * Fall back to thread-local storage if no context argument is present or
> > if the Python version does not support context arguments.
>
> The last bullet point is the problem.  Everybody is saying to you that
> it's not acceptable.  It's your choice to ignore that.
>
>
Never has anyone told me that that is not acceptable. Please stop that.
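For concreteness, the bullet list above, and especially the thread-local fallback in the last bullet, can be sketched with a toy class. The `ContextArg` name and its `assign()`/`.value` API are invented here for illustration only; the real PEP 555 design may differ, and this toy is not safe for concurrent use of the stack across threads:

```python
import threading

_tls = threading.local()  # pre-PEP-555 fallback storage


class ContextArg:
    """Toy sketch of a PEP 555-style "context argument" (hypothetical API)."""

    def __init__(self, name, default=None):
        self.name = name
        self.default = default
        self._stack = []  # innermost assignment last

    def assign(self, value):
        var = self

        class _Assignment:
            def __enter__(self):
                var._stack.append(value)

            def __exit__(self, *exc):
                var._stack.pop()

        return _Assignment()

    @property
    def value(self):
        if self._stack:  # a context argument is present
            return self._stack[-1]
        # no assignment active: fall back to thread-local storage
        return getattr(_tls, self.name, self.default)


prec = ContextArg("prec", default=28)

with prec.assign(30):
    assert prec.value == 30  # innermost assignment wins
_tls.prec = 10
assert prec.value == 10      # no assignment active: TLS fallback
```

The point of the sketch is only that reads go through one property, so a library can consult the context argument first and thread-local state second without changing its public API.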

[..]
> >> What do you mean by "just sweep it under the carpet"?  Capturing the
> >> context at the moment of generators creation is a design choice with
> >> some consequences (that I illustrated in my previous email).  There
> >> are cons and pros of doing that.
> >>
> >
> > "Capturing the context at generator creation" and "isolating generators
> > completely" are two different things.
> >
> > I've described pros of the former. The latter has no pros that I'm aware
> > of, except if sweeping things under the carpet is considered as one.
> >
> > Yes, the latter works in some use cases, but in others it does not. For
> > instance, if an async framework wants to make some information available
> > throughout the async task. If you isolate generators, then async
> > programmers

Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Koos Zevenhoven
On Tue, Oct 10, 2017 at 5:46 PM, Yury Selivanov 
wrote:

> On Tue, Oct 10, 2017 at 10:22 AM, Koos Zevenhoven 
> wrote:
> > On Tue, Oct 10, 2017 at 5:01 PM, Nick Coghlan 
> wrote:
> >>
> >> On 10 October 2017 at 22:51, Koos Zevenhoven  wrote:
> >>>
> >>> I see no reason why these two should be equivalent.
> >>
> >>
> >> There is no "should" about it: it's a brute fact that the two forms *are*
> >> currently equivalent for lazy iterators (including generators), and both
> >> different from the form that uses eager evaluation of the values before
> >> the context change.
> >>
> >> Where "should" enters into the picture is by way of PEP 550 saying that
> >> they should *remain* equivalent because we don't have an adequately
> >> compelling justification for changing the runtime semantics.
> >>
> >> That is, given the following code:
> >>
> >>     itr = make_iter()
> >>     with decimal.localcontext() as ctx:
> >>         ctx.prec = 30
> >>         for i in itr:
> >>             pass
> >>
> >> Right now, today, in 3.6, the calculations in the iterator will use the
> >> modified decimal context, *not* the context that applied when the
> >> iterator was created. If you want to ensure that isn't the case, you
> >> have to force eager evaluation before the context change.
> >>
>

It is not obvious to me if changing the semantics of this is breakage or a
bug fix (as you put it below).
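For reference, the current behavior in question (a lazy iterator computing with whatever decimal context is active when next() is called, not the one active at creation time) can be checked directly with the stdlib decimal module:

```python
import decimal


def sevenths():
    # each value is computed with whatever decimal context is active
    # at the moment next() is called, not at generator creation time
    while True:
        yield decimal.Decimal(1) / decimal.Decimal(7)


decimal.getcontext().prec = 28
itr = sevenths()
with decimal.localcontext() as ctx:
    ctx.prec = 5
    inside = next(itr)   # computed with prec=5
outside = next(itr)      # computed with prec=28 again

assert str(inside) == "0.14286"
assert str(outside) == "0.1428571428571428571428571429"
```

Both values come from the same generator object; only the ambient context at iteration time differs.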



> >> What PEP 550 is proposing is that, by default, *nothing changes*: the
> >> lazy iteration in the above will continue to use the updated decimal
> >> context by default.
> >
> >
> > That's just an arbitrary example. There are many things that *would*
> > change if decimal contexts simply switched from using thread-local
> > storage to using PEP 550. It's not at all obvious which of the changes
> > would be most likely to cause problems. If I were to choose, I would
> > probably introduce a new context manager which works with PEP 555
> > semantics, because that's the only way to ensure full backwards
> > compatibility, regardless of whether PEP 555 or PEP 550 is used. But
> > I'm sure one could decide otherwise.
>
> Please stop using "many things .. would", "most likely" etc.


I can't explain everything, especially not in a single email. I will use
whatever English words I need. You can also think for yourself––or ask a
question.



> We have
> a very focused discussion here.  If you know of any particular issue,
> please demonstrate it with a realistic example.  Otherwise, we only
> increase the number of emails and make things harder to track for
> everybody.
>
>
I'm not going to (and won't be able to) list all those many use cases. I'd
like to keep this more focused too. I'm sure you are well aware of those
differences. It's not up to me to decide what `decimal` should do.

I'll give you some examples below, if that helps.



> If decimal switches to use PEP 550, there will be no "many things that
> *would* change".  The only thing that will change is this:
>
>     def g():
>         with decimal_context(...):
>             yield
>
>     next(g())   # this will no longer leak decimal context to the outer world
>
>
You forgot `yield from g()`. See also below.



> I consider the above a bug fix, because nobody in their right mind
> relies on partial iteration of a generator expecting that some of its
> internal code would affect your code indirectly.


People use generators for all kinds of things. See below.


> The only such case
> is contextlib.contextmanager, and PEP 550 provides mechanisms to make
> generators "leaky" explicitly.
>
>
That's not the only one.

Here's another example:

def context_switcher():
    for c in contexts:
        decimal.setcontext(c)
        yield

ctx_switcher = context_switcher()

def next_context():
    next(ctx_switcher)
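Made concrete with decimal, the switcher pattern works under today's semantics precisely because decimal.setcontext() called inside the generator body leaks out to the caller (the `contexts` list here is a filled-in example, not from the thread):

```python
import decimal

# three contexts to step through
contexts = [decimal.Context(prec=p) for p in (3, 7, 12)]


def context_switcher():
    # a generator driven purely for its side effect on the ambient
    # decimal context -- the "leak" out of it is the whole point
    for c in contexts:
        decimal.setcontext(c)
        yield


ctx_switcher = context_switcher()


def next_context():
    next(ctx_switcher)


next_context()
assert decimal.getcontext().prec == 3
next_context()
assert decimal.getcontext().prec == 7
```

Under generator isolation of the PEP 550 kind, the setcontext() calls would stay confined to the generator and this pattern would silently stop working, which is exactly the disagreement here. Note the snippet mutates the thread's current decimal context.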



And one more example:


def make_things():
    old_ctx = None

    def first_things_first():
        first = compute_first_value()
        yield first

        ctx = figure_out_context(first)
        nonlocal old_ctx
        old_ctx = decimal.getcontext()
        decimal.setcontext(ctx)

        yield get_second_value()

    def the_bulk_of_things():
        return get_bulk()

    def last_but_not_least():
        decimal.setcontext(old_ctx)
        yield "LAST"

    yield from first_things_first()
    yield from the_bulk_of_things()
    yield from last_but_not_least()

all_things = list(make_things())


––Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Yury Selivanov
On Tue, Oct 10, 2017 at 11:26 AM, Koos Zevenhoven  wrote:
> On Tue, Oct 10, 2017 at 5:40 PM, Yury Selivanov 
> wrote:
>>
>> On Tue, Oct 10, 2017 at 8:34 AM, Koos Zevenhoven 
>> wrote:
>> > On Tue, Oct 10, 2017 at 4:22 AM, Yury Selivanov
>> > 
>> > wrote:
>> >>
>> >> On Mon, Oct 9, 2017 at 8:37 PM, Koos Zevenhoven 
>> >> wrote:
>> >> > You can cause unbound growth in PEP 550 too. All you have to do is
>> >> > nest
>> >> > an
>> >> > unbounded number of generators.
>> >>
>> >> You can only nest up to 'sys.getrecursionlimit()' number of
>> >> generators.
>> >>
>> >> With PEP 555 you can do:
>> >>
>> >>     while True:
>> >>         context_var.assign(42).__enter__()
>> >>
>> >
>> > Well, in PEP 550, you can explicitly stack an unbounded number of
>> > LogicalContexts in a while True loop.
>>
>> No, you can't. PEP 550 doesn't have APIs to "stack ... LogicalContexts".
>>
>
> That's ridiculous. Quoting PEP 550: "
> The contextvars.run_with_logical_context(lc: LogicalContext, func, *args,
> **kwargs) function, which runs func with the provided logical context on top
> of the current execution context.

Note that 'run_with_logical_context()' doesn't accept the EC.  It gets
it using the 'get_execution_context()' function, which will squash LCs
if needed.

I say it again: *by design*, PEP 550 APIs do not allow manually stacking
LCs in such a way that unbounded growth of the stack is possible.

> "
>
>>
>> > Or you can run out of memory using
>> > plain lists even faster:
>> >
>> >     l = [42]
>> >
>> >     while True:
>> >         l *= 2  # ensure exponential blow-up
>> >
>> > I don't see why your example with context_var.assign(42).__enter__()
>> > would
>> > be any more likely.
>>
>> Of course you can write broken code. The point is that contexts work
>> like scopes/mappings, and it's counter-intuitive that setting a
>> variable with 'cv.assign(..).__enter__()' will break the world.  If a
>> naive user tries to convert their existing decimal-like API to use
>> your PEP, everything would work initially, but then blow up in
>> production.
>>
>
> The docs will tell them what to do. You can pass a context argument down the
> call chain. You don't "set" context arguments! That's why I'm changing to
> "context argument", and I've said this many times now.

I'm saying this the last time: In Python, any context manager should
have an equivalent try..finally form.  Please give us an example, how
we can use PEP 555 APIs with a try..finally block.

By the way, PEP 555 has this, quote:

"""
By default, values assigned inside a generator do not leak through
yields to the code that drives the generator. However, the assignment
contexts entered and left open inside the generator do become
visible outside the generator after the generator has finished with
a StopIteration or another exception:

    assi = cvar.assign(new_value)
    def genfunc():
        yield
        assi.__enter__()
        yield
"""

Why do you call __enter__() manually in this example?  I thought it's
a strictly prohibited thing in your PEP -- it's unsafe to use it this
way.

Is it only for illustration purposes?  If so, then how "the assignment
contexts entered and left open inside the generator" can even be a
thing in your design?


[..]
>> > * Fall back to thread-local storage if no context argument is present or
>> > if
>> > the Python version does not support context arguments.
>>
>> The last bullet point is the problem.  Everybody is saying to you that
>> it's not acceptable.  It's your choice to ignore that.
>>
>
> Never has anyone told me that that is not acceptable. Please stop that.

The whole idea of PEP 550 was to provide a working alternative to TLS.
So this is clearly not acceptable for PEP 550.

PEP 555 may hand-wave this requirement, but it simply limits the scope
of where it can be useful.  Which in my opinion means that it provides
strictly *less* functionality than PEP 550.

[..]
>> This is plain incorrect. Please read PEP 550v1 before continuing the
>> discussion about it.
>>
>
> I thought you wrote that they are isolated both ways. Maybe there's a
> misunderstanding. I found your "New PEP 550" email in the archives in some
> thread.

PEP 550 has links to all versions of it. You can simply read it there.

> That might be v1, but the figure supposedly explaining this part is
> missing. Whatever. This is not about PEP 550v1 anyway.

This is about you spreading wrong information about PEP 550 (all of
its versions in this case).

Again, in PEP 550:

1. Changes to contexts made in async generators and sync generators do
not leak to the caller.  Changes made in a caller are visible to the
generator.

2. Changes to contexts made in async tasks do not leak to the outer
code or other tasks.  That's assuming async tasks implementation is
tweaked to use 'run_with_execution_context'.  Otherwise, coroutines
work with EC just like functions.

3. Changes to contexts made in OS threads do not leak to other threads.

How's PEP 555 different besides requiring to use a context manager?

>
>>

Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Yury Selivanov
On Tue, Oct 10, 2017 at 12:21 PM, Koos Zevenhoven  wrote:
[..]
>> Please stop using "many things .. would", "most likely" etc.
>
>
> I can't explain everything, especially not in a single email. I will use
> whatever English words I need. You can also think for yourself––or ask a
> question.

I can't assign meaning to your examples formulated in "many things"
and "most likely".  I can reason about concrete words and code
examples.  You are essentially asking us to *trust you* that you know of
some examples and that they exist.  It's not going to happen.



>
>
>>
>> We have
>> a very focused discussion here.  If you know of any particular issue,
>> please demonstrate it with a realistic example.  Otherwise, we only
>> increase the number of emails and make things harder to track for
>> everybody.
>>
>
> I'm not going to (and won't be able to) list all those many use cases.

Then why are you working on a PEP? :)

[..]
>> The only such case
>> is contextlib.contextmanager, and PEP 550 provides mechanisms to make
>> generators "leaky" explicitly.
>>
>
> That's not the only one.
>
> Here's another example:
>
>     def context_switcher():
>         for c in contexts:
>             decimal.setcontext(c)
>             yield
>
>     ctx_switcher = context_switcher()
>
>     def next_context():
>         next(ctx_switcher)

In 10 years of me professionally writing Python code, I've never seen
this pattern in any code base.  But even if such a pattern exists, you
can simply decorate the "context_switcher" generator to set its
__logical_context__ to None, and it will start to leak things.

BTW, how does PEP 555 handle your own example?  I thought it's not
possible to implement "decimal.setcontext" with PEP 555 at all!


>
>
>
> And one more example:
>
>
>     def make_things():
>         old_ctx = None
>
>         def first_things_first():
>             first = compute_first_value()
>             yield first
>
>             ctx = figure_out_context(first)
>             nonlocal old_ctx
>             old_ctx = decimal.getcontext()
>             decimal.setcontext(ctx)
>
>             yield get_second_value()
>
>         def the_bulk_of_things():
>             return get_bulk()
>
>         def last_but_not_least():
>             decimal.setcontext(old_ctx)
>             yield "LAST"
>
>         yield from first_things_first()
>         yield from the_bulk_of_things()
>         yield from last_but_not_least()
>
>     all_things = list(make_things())


I can only say that this one wouldn't pass my code review :)  This
isn't a real example; it's clearly just a piece of tangled, convoluted
code that you invented.

Yury


Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Koos Zevenhoven
On Tue, Oct 10, 2017 at 3:42 PM, Nick Coghlan  wrote:

> On 10 October 2017 at 22:34, Koos Zevenhoven  wrote:
>
>> Really, it was my mistake to ever make you think that
>> context_var.assign(42).__enter__() can be compared to .set(42) in PEP
>> 550. I'll say it once more: PEP 555 context arguments have no equivalent of
>> the PEP-550 .set(..).
>>
>
> Then your alternate PEP can't work, since it won't be useful to extension
> modules.
>
>
Maybe this helps:

* PEP 550 is based on var.set(..), but you will then implement context
managers on top of that.

* PEP 555 is based on context managers, but you can implement a var.set(..)
on top of that if you really need it.



> Context managers are merely syntactic sugar for try/finally statements, so
> you can't wave your hands and say a context manager is the only supported
> API: you *have* to break the semantics down and explain what the
> try/finally equivalent looks like.
>
>
>
Is this what you're asking?

assi = cvar.assign(value)
assi.__enter__()
try:
    # do stuff involving cvar.value
finally:
    assi.__exit__()


As written in the PEP, these functions would have C equivalents. But most C
extensions will probably only need cvar.value, and the assignment contexts
will be entered from Python.

––Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Yury Selivanov
On Tue, Oct 10, 2017 at 12:34 PM, Koos Zevenhoven  wrote:
> On Tue, Oct 10, 2017 at 3:42 PM, Nick Coghlan  wrote:
[..]
>> Context managers are merely syntactic sugar for try/finally statements, so
>> you can't wave your hands and say a context manager is the only supported
>> API: you *have* to break the semantics down and explain what the try/finally
>> equivalent looks like.
>>
>>
>
> Is this what you're asking?
>
>     assi = cvar.assign(value)
>     assi.__enter__()
>     try:
>         # do stuff involving cvar.value
>     finally:
>         assi.__exit__()

But then you *are* allowing users to use "__enter__()" and
"__exit__()" directly. Which means that some users *can* experience an
unbound growth of context values stack that will make their code run
out of memory.

This is not similar to appending something to a list -- people are
aware that lists can't grow infinitely.  But it's not obvious that you
can't call "cvar.assign(value).__enter__()" many times.

The problem with memory leaks like this is that you can easily write
some code and ship it, and only after a while do you start experiencing
problems in production that are extremely hard to track down.
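The failure mode described here can be simulated with a toy assignment stack (the `ToyVar` class is invented for illustration and is not the real PEP 555 machinery): every `__enter__()` without a matching `__exit__()` leaves one frame on the stack forever.

```python
class ToyVar:
    # minimal stand-in for a context variable whose assign()
    # returns a context manager, tracked on an internal stack
    def __init__(self):
        self.stack = []

    def assign(self, value):
        var = self

        class _Assignment:
            def __enter__(self):
                var.stack.append(value)

            def __exit__(self, *exc):
                var.stack.pop()

        return _Assignment()


cvar = ToyVar()
for _ in range(10_000):
    cvar.assign(42).__enter__()  # "set"-style misuse: never exited

assert len(cvar.stack) == 10_000  # grows without bound
```

A `set()`-style API overwrites a slot on each call; a stack-of-assignments API necessarily retains one entry per un-exited `__enter__()`, which is the leak being debated.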

Yury


Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Guido van Rossum
On Tue, Oct 10, 2017 at 5:34 AM, Nick Coghlan  wrote:

> On 10 October 2017 at 01:24, Guido van Rossum  wrote:
>
>> On Sun, Oct 8, 2017 at 11:46 PM, Nick Coghlan  wrote:
>>
>>> On 8 October 2017 at 08:40, Koos Zevenhoven  wrote:
>>>
 I do remember Yury mentioning that the first draft of PEP 550
 captured something when the generator function was called. I think I
 started reading the discussions after that had already been removed, so I
 don't know exactly what it was. But I doubt that it was *exactly* the
 above, because PEP 550 uses set and get operations instead of "assignment
 contexts" like PEP 555 (this one) does.

>>>
>>> We didn't forget it, we just don't think it's very useful.
>>>
>>
>> I'm not sure I agree on the usefulness. Certainly a lot of the complexity
>> of PEP 550 exists just to cater to Nathaniel's desire to influence what a
>> generator sees via the context of the send()/next() call. I'm still not
>> sure that's worth it. In 550 v1 there's no need for chained lookups.
>>
>
> The compatibility concern is that we want developers of existing libraries
> to be able to transparently switch from using thread local storage to
> context local storage, and the way thread locals interact with generators
> means that decimal (et al) currently use the thread local state at the time
> when next() is called, *not* when the generator is created.
>

Apart from the example in PEP 550, is that really a known idiom?


> I like Yury's example for this, which is that the following two examples
> are currently semantically equivalent, and we want to preserve that
> equivalence:
>
>     with decimal.localcontext() as ctx:
>         ctx.prec = 30
>         for i in gen():
>             pass
>
>     g = gen()
>     with decimal.localcontext() as ctx:
>         ctx.prec = 30
>         for i in g:
>             pass
>

Do we really want that equivalence? It goes against the equivalence from
Koos' example.


> The easiest way to maintain that equivalence is to say that even though
> preventing state changes leaking *out* of generators is considered a
> desirable change, we see preventing them leaking *in* as a gratuitous
> backwards compatibility break.
>

I dunno, I think them leaking in in the first place is a dubious feature,
and I'm not too excited that the design of the way forward should bend over
backwards to be compatible here.

The only real use case I've seen so far (not counting examples that just
show how it works) is Nathaniel's timeout example (see point 9 in
Nathaniel's message), and I'm still not convinced that that example is
important enough to support either.

It would all be easier to decide if there were use cases that were less
far-fetched, or if the far-fetched use cases would be supportable with a
small tweak. As it is, it seems that we could live in a simpler, happier
world if we gave up on context values leaking in via next() etc. (I still
claim that in that case we wouldn't need chained lookup in the exposed
semantics, just fast copying of contexts.)


> This does mean that *neither* form is semantically equivalent to eager
> extraction of the generator values before the decimal context is changed,
> but that's the status quo, and we don't have a compelling justification for
> changing it.
>

I think the justification is that we could have a *significantly* simpler
semantics and implementation.


> If folks subsequently decide that they *do* want "capture on creation" or
> "capture on first iteration" semantics for their generators, those are easy
> enough to add as wrappers on top of the initial thread-local-compatible
> base by using the same building blocks as are being added to help event
> loops manage context snapshots for coroutine execution.
>

(BTW Capture on first iteration sounds just awful.)

I think we really need to do more soul-searching before we decide that a
much more complex semantics and implementation is worth it to maintain
backwards compatibility for leaking in via next().

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-ideas] PEP draft: context variables

2017-10-10 Thread Steve Dower
Nick: “I like Yury's example for this, which is that the following two examples 
are currently semantically equivalent, and we want to preserve that equivalence:

    with decimal.localcontext() as ctx:
        ctx.prec = 30
        for i in gen():
            pass

    g = gen()
    with decimal.localcontext() as ctx:
        ctx.prec = 30
        for i in g:
            pass”

I’m following this discussion from a distance, but cared enough about this 
point to chime in without even reading what comes later in the thread. 
(Hopefully it’s not twenty people making the same point…)

I HATE this example! Looking solely at the code we can see, you are refactoring 
a function call from inside an *explicit* context manager to outside of it, and 
assuming the behavior will not change. There’s *absolutely no* logical or 
semantic reason that these should be equivalent, especially given the obvious 
alternative of leaving the call within the explicit context. Even moving the 
function call before the setattr can’t be assumed to not change its behavior – 
how is moving it outside a with block ever supposed to be safe?

I appreciate the desire to be able to take currently working code using one 
construct and have it continue working with a different construct, but the 
burden should be on that library and not the runtime. By that I mean that the 
parts of decimal that set and read the context should do the extra work to 
maintain compatibility (e.g. through a globally mutable structure using context 
variables as a slightly more fine-grained key than thread ID) rather than 
forcing an otherwise straightforward core runtime feature to jump through hoops 
to accommodate it.

New users of this functionality very likely won’t assume that TLS is the 
semantic equivalent, especially when all the examples and naming make it sound 
like context managers are more related. (I predict people will expect this to 
behave more like unstated/implicit function arguments and be captured at the 
same time as other arguments are, but can’t really back that up except with 
gut-feel. It's certainly a feature that I want for myself more than I want 
another spelling for TLS…)

Top-posted from my Windows phone

From: Nick Coghlan
Sent: Tuesday, October 10, 2017 5:35
To: Guido van Rossum
Cc: Python-Ideas
Subject: Re: [Python-ideas] PEP draft: context variables

On 10 October 2017 at 01:24, Guido van Rossum  wrote:
On Sun, Oct 8, 2017 at 11:46 PM, Nick Coghlan  wrote:
On 8 October 2017 at 08:40, Koos Zevenhoven  wrote:
I do remember Yury mentioning that the first draft of PEP 550 captured 
something when the generator function was called. I think I started reading the 
discussions after that had already been removed, so I don't know exactly what 
it was. But I doubt that it was *exactly* the above, because PEP 550 uses set 
and get operations instead of "assignment contexts" like PEP 555 (this one) 
does.

We didn't forget it, we just don't think it's very useful.


I'm not sure I agree on the usefulness. Certainly a lot of the complexity of 
PEP 550 exists just to cater to Nathaniel's desire to influence what a 
generator sees via the context of the send()/next() call. I'm still not sure 
that's worth it. In 550 v1 there's no need for chained lookups.

The compatibility concern is that we want developers of existing libraries to 
be able to transparently switch from using thread local storage to context 
local storage, and the way thread locals interact with generators means that 
decimal (et al) currently use the thread local state at the time when next() is 
called, *not* when the generator is created.

I like Yury's example for this, which is that the following two examples are 
currently semantically equivalent, and we want to preserve that equivalence:

    with decimal.localcontext() as ctx:
        ctx.prec = 30
        for i in gen():
            pass

    g = gen()
    with decimal.localcontext() as ctx:
        ctx.prec = 30
        for i in g:
            pass

The easiest way to maintain that equivalence is to say that even though 
preventing state changes leaking *out* of generators is considered a desirable 
change, we see preventing them leaking *in* as a gratuitous backwards 
compatibility break.

This does mean that *neither* form is semantically equivalent to eager 
extraction of the generator values before the decimal context is changed, but 
that's the status quo, and we don't have a compelling justification for 
changing it.

If folks subsequently decide that they *do* want "capture on creation" or 
"capture on first iteration" semantics for their generators, those are easy 
enough to add as wrappers on top of the initial thread-local-compatible base by 
using the same building blocks as are being added to help event loops manage 
context snapshots for coroutine execution.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
