Re: [Python-ideas] Make map() better

2017-09-13 Thread Steven D'Aprano

On Wed, Sep 13, 2017 at 11:05:26PM +0200, Jason H wrote:

> > And look, map() even works with all of them, without inheritance,
> > registration, and whatnot. It's so easy!
> 
> Define easy. 

Opposite of hard or difficult.

You want to map a function?

map(function, values)

is all it takes. You don't have to care whether the collection of values 
supports the map() method, or whether the class calls it "apply", or 
"Map", or something else. All you need care about is that the individual 
items inside the iterable are valid for the function, but you would need 
to do that regardless of how you call it.

[1, 2, 3, {}, 5].map(plusone)  # will fail


> It's far easier for me to do a dir(dict) and see what I can do with it.

And what of the functions that dict doesn't know about?



>  This is what python does after all. "Does it have the interface I 
> expect?" Global functions like len(), min(), max(), map(), etc(), 
> don't really tell me the full story. len(7) makes no sense. I can 
> attempt to call a function with an invalid argument.

And you can attempt to call a non-existent method:

x = 7
x.len()

Or should that be length() or size() or count() or what?

> [].len() makes more sense.

Why? Getting the length of a sequence or iterator is not specifically a 
list operation, it is a generic operation that can apply to many 
different kinds of things.
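[Editor's note: the genericity claim is easy to check — one built-in entry point covers many unrelated container types, with no per-class .len() method needed:]

```python
# One generic operation, many types -- no inheritance required:
assert len([1, 2, 3]) == 3      # list
assert len("hello") == 5        # str
assert len({"a": 1}) == 1       # dict
assert len(range(10)) == 10     # range
```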


> Python is weird in that there are these special magical 
> globals 

The word you want is "function".

> that operate on many things.

What makes that weird? Even Javascript has functions. So do C, Pascal, 
Haskell, C++, Lisp, Scheme, and thousands of other languages.

> Why is it ','.join(iterable)? Why 
> isn't there join(',', iterable)? At what point does a method become a 
> global? A member? Do we take the path that everything is a global? Or 
> should all methods be members? So far it seems arbitrary.

Okay, it's arbitrary.

Why is it called [].len instead of [].length or {}.size? Why None 
instead of nil or null or nul or NULL or NOTHING?

Many decisions in programming languages are arbitrary. 



-- 
Steve
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] Make map() better

2017-09-13 Thread Steven D'Aprano
On Wed, Sep 13, 2017 at 05:09:37PM +0200, Jason H wrote:

> The format of map seems off. Coming from JS, all the functions come 
> second. I think this approach is superior.

Obviously Javascript has got it wrong. map() applies the function to 
the given values, so the function comes first. That matches normal 
English word order:

* map function to values
* apply polish to boots   # not "apply boots from polish"
* spread butter on bread  # not "spread bread under butter"

It's hard to even write the opposite order in English:

map() takes the values and has the function applied to them

which is a completely unnatural way of speaking or thinking about it 
(in English).

I suggest you approach the Javascript developers and ask them to change 
the way they call map() to suit the way Python does it. After all, 
Python is the more popular language, and it is older too.


> Also, how are we to tell what supports map()?

Um... is this a trick question? Any function that takes at least one 
argument is usable with map().


> Any iterable should be able to map via:
> range(26).map(lambda x: chr(ord('a')+x)))

No, that would be silly. That means that every single iterable class is 
responsible for re-implementing map, instead of having a single 
implementation, namely the map() function.
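[Editor's note: a quick illustration of that single implementation at work — one map() serves every iterable, including ones the author of map() has never heard of:]

```python
def plusone(x):
    return x + 1

# No inheritance, no registration -- any iterable works:
assert list(map(plusone, [1, 2, 3])) == [2, 3, 4]                  # list
assert list(map(plusone, (10, 20))) == [11, 21]                    # tuple
assert list(map(plusone, range(3))) == [1, 2, 3]                   # range
assert list(map(plusone, (i * i for i in range(3)))) == [1, 2, 5]  # generator
```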



-- 
Steve


Re: [Python-ideas] PEP 563 and expensive backwards compatibility

2017-09-13 Thread Lukasz Langa

> On Sep 13, 2017, at 9:44 PM, Nick Coghlan  wrote:
> 
> On 14 September 2017 at 09:43, Lukasz Langa  wrote:
>>> On Sep 13, 2017, at 6:37 PM, Nick Coghlan  wrote:
>>> That way, during the "from __future__ import lazy_annotations" period,
>>> folks will have clearer guidance on how to explicitly opt-in to eager
>>> evaluation via function and class decorators.
>> 
>> I like this idea! For classes it would have to be a function that you call 
>> post factum. The way class decorators are implemented, they cannot evaluate 
>> annotations that contain forward references. For example:
>> 
>> class Tree:
>>     left: Tree
>>     right: Tree
>>
>>     def __init__(self, left: Tree, right: Tree):
>>         self.left = left
>>         self.right = right
>> 
>> This is true today, get_type_hints() called from within a class decorator 
>> will fail on this class. However, a function performing postponed evaluation 
>> can do this without issue. If a class decorator knew what name a class is 
>> about to get, that would help. But that's a different PEP and I'm not 
>> writing that one ;-)
> 
> The class decorator case is indeed a bit more complicated, but there
> are a few tricks available to create a forward-reference friendly
> evaluation environment.

Using cls.__name__ and the ChainMap is clever, I like it. It might prove useful 
for Eric's data classes later. However, there's more to forward references than 
self-references:

class A:
    b: B

class B:
    ...

In this scenario evaluation of A's annotations has to happen after the module 
is fully loaded. This is the general case. No magic decorator will solve this.

The general solution is running eval() later, when the namespace is fully 
populated. I do agree with you that a default implementation of a 
typing-agnostic variant of `get_type_hints()` would be nice. If anything, 
implementing this might better surface limitations of postponed annotations. 
That function won't be recursive, though, unlike your example. And I'll leave 
converting the function to a decorator as an exercise for the reader, 
especially given the forward referencing caveats.

- Ł





Re: [Python-ideas] PEP 563 and expensive backwards compatibility

2017-09-13 Thread Nick Coghlan
On 14 September 2017 at 09:43, Lukasz Langa  wrote:
>> On Sep 13, 2017, at 6:37 PM, Nick Coghlan  wrote:
>> That way, during the "from __future__ import lazy_annotations" period,
>> folks will have clearer guidance on how to explicitly opt-in to eager
>> evaluation via function and class decorators.
>
> I like this idea! For classes it would have to be a function that you call 
> post factum. The way class decorators are implemented, they cannot evaluate 
> annotations that contain forward references. For example:
>
> class Tree:
>     left: Tree
>     right: Tree
>
>     def __init__(self, left: Tree, right: Tree):
>         self.left = left
>         self.right = right
>
> This is true today, get_type_hints() called from within a class decorator 
> will fail on this class. However, a function performing postponed evaluation 
> can do this without issue. If a class decorator knew what name a class is 
> about to get, that would help. But that's a different PEP and I'm not writing 
> that one ;-)

The class decorator case is indeed a bit more complicated, but there
are a few tricks available to create a forward-reference friendly
evaluation environment.

1. To get the right globals namespace, you can do:

global_ns = sys.modules[cls.__module__].__dict__

2. Define the evaluation locals as follows:

local_ns = collections.ChainMap({cls.__name__: cls}, cls.__dict__)

3. Evaluate the variable and method annotations using "eval(expr,
global_ns, local_ns)"

If you make the eager annotation evaluation recursive (so the
decorator can be applied to the outermost class, but also affects all
inner class definitions), then it would even be sufficient to allow
nested classes to refer to both the outer class as well as other inner
classes (regardless of definition order).

To prevent inadvertent eager evaluation of annotations on functions
and classes that are merely referenced from a class attribute, the
recursive descent would need to be conditional on "attr.__qualname__
== cls.__qualname__ + '.' + attr.__name__".

So something like:

def eager_class_annotations(cls):
    global_ns = sys.modules[cls.__module__].__dict__
    local_ns = collections.ChainMap({cls.__name__: cls}, cls.__dict__)
    annotations = cls.__annotations__
    for k, v in annotations.items():
        annotations[k] = eval(v, global_ns, local_ns)
    for attr in cls.__dict__.values():
        name = getattr(attr, "__name__", None)
        if name is None:
            continue
        qualname = getattr(attr, "__qualname__", None)
        if qualname is None:
            continue
        if qualname != f"{cls.__qualname__}.{name}":
            continue
        if isinstance(attr, type):
            eager_class_annotations(attr)
        else:
            eager_annotations(attr)
    return cls

You could also hide the difference between eager annotation evaluation
on a class or a function inside a single decorator:

def eager_annotations(obj):
    if isinstance(obj, type):
        _eval_class_annotations(obj)  # Class
    elif hasattr(obj, "__globals__"):
        _eval_annotations(obj, obj.__globals__)  # Function
    else:
        _eval_annotations(obj, obj.__dict__)  # Module
    return obj

Given the complexity of the class decorator variant, I now think it
would actually make sense for the PEP to propose *providing* these
decorators somewhere in the standard library (the lower level "types"
module seems like a reasonable candidate, but we've historically
avoided having that depend on the full collections module).
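[Editor's note: a rough, runnable sketch of steps 1-3 above, with explicit string annotations standing in for PEP 563's deferred form; the helper name is invented, and the module lookup is guarded so the sketch stays self-contained:]

```python
import collections
import sys

class Tree:
    left: "Tree"    # string annotations stand in for lazy annotations
    right: "Tree"

def eval_class_annotations(cls):
    # Step 1: the module globals (guarded lookup, in case the module
    # isn't registered in sys.modules in this context)
    global_ns = getattr(sys.modules.get(cls.__module__), "__dict__", {})
    # Step 2: make the class resolvable by its own name for self-references
    local_ns = collections.ChainMap({cls.__name__: cls}, cls.__dict__)
    # Step 3: evaluate each stored annotation string
    return {k: eval(v, global_ns, local_ns)
            for k, v in cls.__annotations__.items()}

hints = eval_class_annotations(Tree)
```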

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-ideas] PEP 563 and expensive backwards compatibility

2017-09-13 Thread Lukasz Langa

> On Sep 13, 2017, at 6:37 PM, Nick Coghlan  wrote:
> 
> I think it would be useful for the PEP to include a definition of an
> "eager annotations" decorator that did something like:
> 
> def eager_annotations(f):
>     ns = f.__globals__
>     annotations = f.__annotations__
>     for k, v in annotations.items():
>         annotations[k] = eval(v, ns)
>     return f
> 
> And pointed out that you can create variants of that which also pass
> in the locals() namespace (or use sys._getframes() to access it
> dynamically).
> 
> That way, during the "from __future__ import lazy_annotations" period,
> folks will have clearer guidance on how to explicitly opt-in to eager
> evaluation via function and class decorators.

I like this idea! For classes it would have to be a function that you call post 
factum. The way class decorators are implemented, they cannot evaluate 
annotations that contain forward references. For example:

class Tree:
    left: Tree
    right: Tree

    def __init__(self, left: Tree, right: Tree):
        self.left = left
        self.right = right

This is true today, get_type_hints() called from within a class decorator will 
fail on this class. However, a function performing postponed evaluation can do 
this without issue. If a class decorator knew what name a class is about to 
get, that would help. But that's a different PEP and I'm not writing that one 
;-)

- Ł




Re: [Python-ideas] sys.py

2017-09-13 Thread Guido van Rossum
My preference is (1), revert. You have to understand how sys.modules works
before you can change it, and if you waste time debugging your mistake, so
be it. I think the sys.py proposal is the wrong way to fix the mistake you
made there.

On Wed, Sep 13, 2017 at 4:00 PM, Eric Snow 
wrote:

> On Tue, Sep 12, 2017 at 9:30 PM, Guido van Rossum 
> wrote:
> > I find this a disturbing trend.
>
> Which trend?  Moving away from "consenting adults"?  In the case of
> sys.modules, the problem is that assigning a bogus value (e.g. []) can
> cause the interpreter to crash.  It wasn't a problem until recently
> when I removed PyInterpreterState.modules and made sys.modules
> authoritative (see https://bugs.python.org/issue28411).  The options
> there are:
>
> 1. revert that change (which means assigning to sys.modules
> deceptively does nothing)
> 2. raise an exception in all the places that expect sys.modules to be
> a mapping (far from where sys.modules was re-assigned)
> 3. raise an exception if you try to set sys.modules to a non-mapping
> 4. let a bogus sys.modules break the interpreter (basically, tell
> people "don't do that")
>
> My preference is #3 (obviously), but it sounds like you'd rather not.
>
> > I think we have bigger fish to fry and this sounds like it could slow
> down startup.
>
> It should have little impact on startup.  The difference is the cost
> of importing the new sys module (which we could easily freeze to
> reduce the cost).  That cost would apply only to programs that
> currently import sys.  Everything in the stdlib would be updated to
> use _sys directly.
>
> If you think it isn't worth it then I'll let it go.  I brought it up
> because I consider it a cheap, practical solution to the problem I ran
> into.  Thanks!
>
> -eric
>



-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-ideas] sys.py

2017-09-13 Thread Eric Snow
On Tue, Sep 12, 2017 at 9:30 PM, Guido van Rossum  wrote:
> I find this a disturbing trend.

Which trend?  Moving away from "consenting adults"?  In the case of
sys.modules, the problem is that assigning a bogus value (e.g. []) can
cause the interpreter to crash.  It wasn't a problem until recently
when I removed PyInterpreterState.modules and made sys.modules
authoritative (see https://bugs.python.org/issue28411).  The options
there are:

1. revert that change (which means assigning to sys.modules
deceptively does nothing)
2. raise an exception in all the places that expect sys.modules to be
a mapping (far from where sys.modules was re-assigned)
3. raise an exception if you try to set sys.modules to a non-mapping
4. let a bogus sys.modules break the interpreter (basically, tell
people "don't do that")

My preference is #3 (obviously), but it sounds like you'd rather not.
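[Editor's note: for context, the supported way to customize imports is mutating entries of the sys.modules mapping, not reassigning the whole thing — which is why a non-mapping value is the problematic case. A minimal sketch (the module name here is invented):]

```python
import sys
import types

# sys.modules must remain a real mapping; the "consenting adults"
# pattern is to mutate individual entries, e.g. injecting a stub:
stub = types.ModuleType("fake_mod")
stub.answer = 42
sys.modules["fake_mod"] = stub

import fake_mod  # import machinery finds the stub in sys.modules
assert fake_mod.answer == 42

del sys.modules["fake_mod"]  # clean up the injected entry
```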

> I think we have bigger fish to fry and this sounds like it could slow down 
> startup.

It should have little impact on startup.  The difference is the cost
of importing the new sys module (which we could easily freeze to
reduce the cost).  That cost would apply only to programs that
currently import sys.  Everything in the stdlib would be updated to
use _sys directly.

If you think it isn't worth it then I'll let it go.  I brought it up
because I consider it a cheap, practical solution to the problem I ran
into.  Thanks!

-eric


Re: [Python-ideas] PEP 554: Stdlib Module to Support Multiple Interpreters in Python Code

2017-09-13 Thread Nick Coghlan
On 13 September 2017 at 14:10, Nathaniel Smith  wrote:
> Subinterpreters are basically an attempt to reimplement the OS's
> process isolation in user-space, right?

Not really, they're more an attempt to make something resembling
Rust's memory model available to Python programs - having the default
behaviour be "memory is not shared", but having the choice to share
when you want to be entirely an application level decision, without
getting into the kind of complexity needed to deliberately break
operating system level process isolation.

The difference is that where Rust was able to do that on a per-thread
basis and rely on their borrow checker for enforcement of memory
ownership, for PEP 554, we're proposing to do it on a per-interpreter
basis, and rely on runtime object space partitioning (where Python
objects and the memory allocators are *not* shared between
interpreters) to keep things separated from each other.

That's why memoryview is such a key part of making the proposal
interesting: it's what lets us relatively easily poke holes in the
object level partitioning between interpreters and provide zero-copy
messaging passing without having to share any regular reference counts
between interpreters (which in turn is what makes it plausible that we
may eventually be able to switch to a true GIL-per-interpreter model,
with only a few cross-interpreter locks for operations like accessing
the list of interpreters itself).
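[Editor's note: within a single interpreter, the zero-copy behaviour of memoryview that this builds on looks like the following sketch — views share the underlying buffer rather than copying it:]

```python
# memoryview exposes a buffer without copying; writes through the view
# are visible in the underlying object:
buf = bytearray(b"hello")
view = memoryview(buf)
view[0] = ord("H")
assert bytes(buf) == b"Hello"

# Slicing a memoryview also avoids a copy (unlike slicing bytes);
# the slice still refers to the same underlying bytearray:
assert view[1:4].obj is buf
```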

Right now, the closest equivalent to this programming model that
Python offers is to combine threads with queue.Queue, and it requires
a lot of programming discipline to ensure that you don't access an
object again once you've submitted to a queue.
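[Editor's note: a minimal sketch of that existing threads-plus-queue pattern — note that the ownership handoff is purely a convention the programmer must uphold, which is exactly the discipline being described:]

```python
import queue
import threading

q = queue.Queue()

def worker():
    while True:
        item = q.get()
        if item is None:  # sentinel: shut down
            break
        # By convention, the worker now "owns" the object:
        item.append("processed")
        q.task_done()

t = threading.Thread(target=worker)
t.start()

msg = ["payload"]
q.put(msg)   # discipline: do not touch `msg` after this point
q.put(None)  # send shutdown sentinel
t.join()
```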

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-ideas] PEP 563 and expensive backwards compatibility

2017-09-13 Thread Nick Coghlan
On 14 September 2017 at 06:01, Jim J. Jewett  wrote:
> The "right time" is whenever they are currently evaluated.
> (Definition time, I think, but won't swear.)
>
> For example, the "annotation" might really be a call to a logger,
> showing the current environment, including names that will be rebound
> before the module finishes loading.
>
> I'm perfectly willing to agree that even needing this much control
> over timing is a code smell, but it is currently possible, and I would
> rather it not become impossible.
>
> At a minimum, it seems like "just run this typing function that you
> should already be using" should either save the right context, or the
> PEP should state explicitly that this functionality is being
> withdrawn.  (And go ahead and suggest a workaround, such as running
> the code before the method definition, or as a decorator.)

I think it would be useful for the PEP to include a definition of an
"eager annotations" decorator that did something like:

def eager_annotations(f):
    ns = f.__globals__
    annotations = f.__annotations__
    for k, v in annotations.items():
        annotations[k] = eval(v, ns)
    return f

And pointed out that you can create variants of that which also pass
in the locals() namespace (or use sys._getframes() to access it
dynamically).
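[Editor's note: a usage sketch of such a decorator, repeated here so the example is self-contained, with explicit string annotations standing in for the future lazy form; the function name is invented:]

```python
def eager_annotations(f):
    # Evaluate each string annotation in the function's module globals.
    ns = f.__globals__
    for k, v in f.__annotations__.items():
        f.__annotations__[k] = eval(v, ns)
    return f

@eager_annotations
def greet(name: "str") -> "str":
    return f"Hello, {name}!"

# After decoration the annotations are real objects, not strings:
assert greet.__annotations__["name"] is str
```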

That way, during the "from __future__ import lazy_annotations" period,
folks will have clearer guidance on how to explicitly opt-in to eager
evaluation via function and class decorators.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-ideas] PEP 554: Stdlib Module to Support Multiple Interpreters in Python Code

2017-09-13 Thread Nick Coghlan
On 13 September 2017 at 20:45, Koos Zevenhoven  wrote:
> On Wed, Sep 13, 2017 at 6:14 AM, Nick Coghlan  wrote:
>>
>> On 13 September 2017 at 00:35, Koos Zevenhoven  wrote:
>>
>> > I don't see how the situation benefits from calling something the "main
>> > interpreter". Subinterpreters can be a way to take something
>> > non-thread-safe
>> > and make it thread-safe, because in an interpreter-per-thread scheme,
>> > most
>> > of the state, like module globals, are thread-local. (Well, this doesn't
>> > help for async concurrency, but anyway.)
>>
>> "The interpreter that runs __main__" is never going to go away as a
>> concept for the regular CPython CLI.
>
>
> It's still just *an* interpreter that happens to run __main__. And who says
> it even needs to be the only one?

Koos, I've asked multiple times now for you to describe the practical
user benefits you believe will come from dispensing with the existing
notion of a main interpreter (which is *not* something PEP 554 has
created - the main interpreter already exists at the implementation
level, PEP 554 just makes that fact visible at the Python level).

If you can't come up with a meaningful user benefit that would arise
from removing it, then please just let the matter drop.

Regards,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-ideas] PEP 561 v2 - Packaging Static Type Information

2017-09-13 Thread Nick Coghlan
On 13 September 2017 at 14:33, Ethan Smith  wrote:
> On Tue, Sep 12, 2017 at 8:30 PM, Nick Coghlan  wrote:
>> There are a lot of packaging tools in use other than distutils, so I
>> don't think the distutils update proposal belongs in the PEP. Rather,
>> the PEP should focus on defining how type analysers should search for
>> typing information, and then updating packaging tools to help with
>> that can be treated as separate RFEs for each of the publishing tools
>> that people use (perhaps with a related task-oriented guide on
>> packaging.python.org)
>
>
> I think this makes a lot of sense. Would a description of the package
> metadata being queried suffice to be generic enough?

It would - a spec to say "Typecheckers should look for <X> to learn
<Y>" and "Publishers should provide <X> to tell typecheckers <Y>".

PEP 376 is the current definition of the installed package metadata,
so if you describe this idea in terms of *.dist-info/METADATA entries,
then folks will be able to translate that to the wheel archive format
and the various legacy install db formats.

>> I'm not clear on how this actually differs from the existing search
>> protocol in PEP 484, since step 3 is exactly what the
>> 'shared/typehints/pythonX.Y' directory is intended to cover.
>>
>> Is it just a matter allowing the use of "-stubs" as the typehint
>> installation directory, since installing under a different package
>> name is easier to manage using existing publishing tools than
>> installing to a different target directory?
>
> Perhaps I could be clearer in the PEP text on this. The idea is that people
> can ship normal sdists (or what have you) and install those to the package
> installation directory. Then the type checkers would pick up `pkg-stub` when
> looking for `pkg` type information via the package API. This allows a third
> party to ship just *.pyi files in a package and install it as if it were the
> runtime package, but still be picked up by type checkers. This is different
> than using 'shared/typehints/pythonX.Y' because that directory cannot be
> queried by package resource APIs, and since no type checker implements PEP
> 484's method, I thought it would be better to have everything be unified
> under the same system of installing packages. So I suppose that is a rather
> long, yes. :)

OK, it wasn't clear to me that none of the current typecheckers
actually implement looking for extra stubs in
'shared/typehints/pythonX.Y' .

In that case, it makes a lot of sense to me to try to lower barriers
to adoption by switching to a scheme that's more consistent with the
way Python packaging and installation tools already work, and a simple
suffix-based shadow tree approach makes a lot of sense to me from the
packaging perspective (I'll leave it to the folks actually working on
mypy et al to say how the feel about this more decentralised approach
to managing 3rd party stubs).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-ideas] Make map() better

2017-09-13 Thread Jason H


> Sent: Wednesday, September 13, 2017 at 3:57 PM
> From: "Stefan Behnel" 
> To: python-ideas@python.org
> Subject: Re: [Python-ideas] Make map() better
>
> Jason H schrieb am 13.09.2017 um 17:54:
> > I'm rather surprised that there isn't a Iterable class which dict and list 
> > derive from.
> > If that were added to just dict and list, I think it would cover 98% of 
> > cases, and adding Iterable would be reasonable in the remaining scenarios.
> 
> Would you then always have to inherit from that class in order to make a
> type iterable? That would be fairly annoying...
> 
> The iterable and iterator protocols are extremely simple, and that's a
> great feature.
> 
> And look, map() even works with all of them, without inheritance,
> registration, and whatnot. It's so easy!

Define easy. 

It's far easier for me to do a dir(dict) and see what I can do with it. This is 
what python does after all. "Does it have the interface I expect?"
Global functions like len(), min(), max(), map(), etc(), don't really tell me 
the full story. len(7) makes no sense. I can attempt to call a function with an 
invalid argument. [].len() makes more sense. Python is weird in that there are 
these special magical globals that operate on many things. Why is it 
','.join(iterable)? Why isn't there join(',', iterable)? At what point does a 
method become a global? A member? Do we take the path that everything is a 
global? Or should all methods be members? So far it seems arbitrary.




Re: [Python-ideas] [Python-Dev] A reminder for PEP owners

2017-09-13 Thread Skip Montanaro
> But someone has to
> review and accept all those PEPs, and I can't do it all by myself.

An alternate definition for BDFL is "Benevolent Delegator For Life." :-)

Skip


Re: [Python-ideas] PEP 563 and expensive backwards compatibility

2017-09-13 Thread Jelle Zijlstra
2017-09-13 13:01 GMT-07:00 Jim J. Jewett :

> On Wed, Sep 13, 2017 at 3:12 PM, Lukasz Langa  wrote:
> > On Sep 13, 2017, at 2:56 PM, Jim J. Jewett  wrote:
>
> >> I am generally supportive of leaving the type annotations
> >> unprocessed by default, but there are use cases where
> >> they should be processed (and even cases where doing it
> >> at the right time matters, because of a side effect).
>
> > What is the "right time" you're speaking of?
>
> The "right time" is whenever they are currently evaluated.
> (Definition time, I think, but won't swear.)
>
> For example, the "annotation" might really be a call to a logger,
> showing the current environment, including names that will be rebound
> before the module finishes loading.
>
> I'm perfectly willing to agree that even needing this much control
> over timing is a code smell, but it is currently possible, and I would
> rather it not become impossible.
>

Is this just a theoretical concern? Unless there is significant real-world
code doing this sort of thing, I don't see much of a problem in deprecating
such code using the normal __future__-based deprecation cycle.


>
> At a minimum, it seems like "just run this typing function that you
> should already be using" should either save the right context, or the
> PEP should state explicitly that this functionality is being
> withdrawn.  (And go ahead and suggest a workaround, such as running
> the code before the method definition, or as a decorator.)
>
>
> >> (1)  The PEP suggests opting out with @typing.no_type_hints ...
>
> > This is already possible. PEP 484 specifies that
>
> > "A # type: ignore comment on a line by itself is equivalent to adding an
> > inline # type: ignore to each line until the end of the current indented
> > block. At top indentation level this has effect of disabling type
> checking
> > until the end of file."
>
> Great!  Please mention this as well as (or perhaps instead of)
> typing.no_type_check.
>
>
> >> It would be a bit messy (like the old coding cookie),
> >> but recognizing a module-wide
>
> >> # typing.no_type_hints
>
> >> comment and then falling back to the current behavior
> >> would be enough for me.
>
> > Do you know of any other per-module feature toggle of this kind?
>
> No, thus the comment about it being messy.  But it does offer one way
> to ensure that annotations are evaluated within the proper
> environment, even without having to save those environments.
>
> -jJ


Re: [Python-ideas] LOAD_NAME/LOAD_GLOBAL should be use getattr()

2017-09-13 Thread Lucas Wiman
On Wed, Sep 13, 2017 at 11:55 AM, Serhiy Storchaka 
wrote:

> [...] Calling __getattr__() will slow down the access to builtins. And
> there is a recursion problem if module's __getattr__() uses builtins.
>

The first point is totally valid, but the recursion problem doesn't seem
like a strong argument. There are already lots of recursion problems when
defining custom __getattr__ or __getattribute__ methods, but on balance
they're a very useful part of the language.
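[Editor's note: a common `__getattr__` pattern and its recursion hazard, for illustration — the class and attribute names here are invented:]

```python
class Config:
    def __init__(self, data):
        self.data = data

    def __getattr__(self, name):
        # __getattr__ is called only for *missing* attributes. Looking up
        # self.data here is safe because __init__ set it; if it had not
        # been set, self.data would re-enter __getattr__ and recurse
        # forever -- the classic pitfall mentioned above.
        try:
            return self.data[name]
        except KeyError:
            raise AttributeError(name)

c = Config({"host": "localhost"})
assert c.host == "localhost"
```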

- Lucas


Re: [Python-ideas] PEP 563 and expensive backwards compatibility

2017-09-13 Thread Jim J. Jewett
On Wed, Sep 13, 2017 at 3:12 PM, Lukasz Langa  wrote:
> On Sep 13, 2017, at 2:56 PM, Jim J. Jewett  wrote:

>> I am generally supportive of leaving the type annotations
>> unprocessed by default, but there are use cases where
>> they should be processed (and even cases where doing it
>> at the right time matters, because of a side effect).

> What is the "right time" you're speaking of?

The "right time" is whenever they are currently evaluated.
(Definition time, I think, but won't swear.)

For example, the "annotation" might really be a call to a logger,
showing the current environment, including names that will be rebound
before the module finishes loading.

I'm perfectly willing to agree that even needing this much control
over timing is a code smell, but it is currently possible, and I would
rather it not become impossible.

At a minimum, it seems like "just run this typing function that you
should already be using" should either save the right context, or the
PEP should state explicitly that this functionality is being
withdrawn.  (And go ahead and suggest a workaround, such as running
the code before the method definition, or as a decorator.)


>> (1)  The PEP suggests opting out with @typing.no_type_hints ...

> This is already possible. PEP 484 specifies that

> "A # type: ignore comment on a line by itself is equivalent to adding an
> inline # type: ignore to each line until the end of the current indented
> block. At top indentation level this has effect of disabling type checking
> until the end of file."

Great!  Please mention this as well as (or perhaps instead of)
typing.no_type_check.


>> It would be a bit messy (like the old coding cookie),
>> but recognizing a module-wide

>> # typing.no_type_hints

>> comment and then falling back to the current behavior
>> would be enough for me.

> Do you know of any other per-module feature toggle of this kind?

No, thus the comment about it being messy.  But it does offer one way
to ensure that annotations are evaluated within the proper
environment, even without having to save those environments.

-jJ
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] Make map() better

2017-09-13 Thread Stefan Behnel
Jason H schrieb am 13.09.2017 um 17:54:
> I'm rather surprised that there isn't an Iterable class which dict and list
> derive from.
> If that were added to just dict and list, I think it would cover 98% of 
> cases, and adding Iterable would be reasonable in the remaining scenarios.

Would you then always have to inherit from that class in order to make a
type iterable? That would be fairly annoying...

The iterable and iterator protocols are extremely simple, and that's a
great feature.

And look, map() even works with all of them, without inheritance,
registration, and whatnot. It's so easy!
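For instance, a class becomes iterable (and therefore map()-able) merely by implementing __iter__ -- a minimal sketch:

```python
class Countdown:
    # No base class, no registration: just the iterable protocol.
    def __init__(self, start):
        self.start = start

    def __iter__(self):
        return iter(range(self.start, 0, -1))

print(list(map(str, Countdown(3))))  # ['3', '2', '1']
```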

Stefan



Re: [Python-ideas] PEP 563 and expensive backwards compatibility

2017-09-13 Thread Lukasz Langa

> On Sep 13, 2017, at 2:56 PM, Jim J. Jewett  wrote:
> 
> I am generally supportive of leaving the type annotations unprocessed
> by default, but there are use cases where they should be processed
> (and even cases where doing it at the right time matters, because of a
> side effect).

What is the "right time" you're speaking of?


> (1)  The PEP suggests opting out with @typing.no_type_hints ... The
> closest I could find was @typing.no_type_check, which has to be called
> on each object.

This was a typo on my part. Yes, no_type_check is what I meant.


> It should be possible to opt out for an entire module, and it should
> be possible to do so *without* first importing typing.
> 
> Telling type checkers to ignore scopes (including modules) with a
> 
> # typing.no_type_hints
> 
> comment would be sufficient for me.

This is already possible. PEP 484 specifies that

"A # type: ignore comment on a line by itself is equivalent to adding an inline 
# type: ignore to each line until the end of the current indented block. At top 
indentation level this has effect of disabling type checking until the end of 
file."


> (2)  Getting the annotations processed (preferably at the currently
> correct time) should also be possible on a module-wide basis, and
> should also not require importing the entire typing apparatus.

Again, what is the "correct time" you're speaking of?


> It would be a bit messy (like the old coding cookie), but recognizing
> a module-wide
> 
> # typing.no_type_hints
> 
> comment and then falling back to the current behavior would be enough for me.

Do you know of any other per-module feature toggle of this kind? __future__ 
imports are not feature toggles, they are timed deprecations.

Finally, the non-typing use cases that you're worried about, what are they? 
From the research I've done, none of the actual use cases in existence would be 
rendered impossible by postponed evaluation. So far the concerns about side 
effects and local scope in annotations aren't supported by any strong evidence 
that this change would be disruptive.

Don't get me wrong, I'm not being dismissive. I just don't think it's 
reasonable to get blocked on potential and obscure use cases that no real world 
code actually employs.

- Ł





[Python-ideas] A reminder for PEP owners

2017-09-13 Thread Guido van Rossum
I know there's a lot of excitement around lots of new ideas. And the 3.7
feature freeze is looming (January is in a few months). But someone has to
review and accept all those PEPs, and I can't do it all by myself.

If you want your proposal to be taken seriously, you need to include a
summary of the discussion on the mailing list (including objections, even
if you disagree!) in your PEP, e.g. as an extended design rationale or
under the Rejected Ideas heading.

If you don't do this you risk having to repeat yourself -- also you risk
having your PEP rejected, because at this point there's no way I am going
to read all the discussions.

-- 
--Guido van Rossum (python.org/~guido )


Re: [Python-ideas] PEP 563 and expensive backwards compatibility

2017-09-13 Thread Guido van Rossum
On Wed, Sep 13, 2017 at 11:56 AM, Jim J. Jewett 
wrote:

> It should be possible to opt out for an entire module, and it should
> be possible to do so *without* first importing typing.
>

PEP 484 has a notation for this -- put

  # type: ignore

at the top of your file and the file won't be type-checked. (Before you
test this, mypy doesn't yet support this. But it could.)

IIUC functions and classes will still have an __annotations__ attribute
(except when it would be empty) so even with the __future__ import (or in
Python 4.0) you could still make non-standard use of annotations pretty
easily -- you'd just get a string rather than an object. (And a simple
eval() will turn the string into an object -- the PEP has a lot of extra
caution because currently the evaluation happens in the scope where the
annotation is encountered, but if you don't care about that everything's
easy.)
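A small sketch of what that looks like under the proposed __future__ import (this runs on Python 3.7+, where the import landed):

```python
from __future__ import annotations

def greet(name: str) -> str:
    return "hello " + name

# Annotations are stored as strings, not evaluated objects:
print(greet.__annotations__)   # {'name': 'str', 'return': 'str'}

# A simple eval() recovers the object when you actually want it:
assert eval(greet.__annotations__["name"]) is str
```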

-- 
--Guido van Rossum (python.org/~guido)


[Python-ideas] PEP 563 and expensive backwards compatibility

2017-09-13 Thread Jim J. Jewett
I am generally supportive of leaving the type annotations unprocessed
by default, but there are use cases where they should be processed
(and even cases where doing it at the right time matters, because of a
side effect).

I am concerned that the backwards compatibility story for non-typing
cases be not just possible, but reasonable.

(1)  The PEP suggests opting out with @typing.no_type_hints ... The
closest I could find was @typing.no_type_check, which has to be called
on each object.

It should be possible to opt out for an entire module, and it should
be possible to do so *without* first importing typing.

Telling type checkers to ignore scopes (including modules) with a

# typing.no_type_hints

comment would be sufficient for me.

If that isn't possible, please at least create a nontyping or
minityping module so that the marker can be imported without the full
overhead of the typing module.

(2)  Getting the annotations processed (preferably at the currently
correct time) should also be possible on a module-wide basis, and
should also not require importing the entire typing apparatus.

It would be a bit messy (like the old coding cookie), but recognizing
a module-wide

# typing.no_type_hints

comment and then falling back to the current behavior would be enough for me.

Alternatively, it would be acceptable to run something like
typing.get_type_hints, if that could be done in a single pass at the
end of the module (callable from both within the module and from
outside) ... but again, such a cleanup function should be in a smaller
module that doesn't require loading all of typing.

-jJ


Re: [Python-ideas] LOAD_NAME/LOAD_GLOBAL should be use getattr()

2017-09-13 Thread Serhiy Storchaka

12.09.17 19:17, Neil Schemenauer wrote:

This is my idea of making module properties work.  It is necessary
for various lazy-loading module ideas and it cleans up the language
IMHO.  I think it may be possible to do it with minimal backwards
compatibility problems and performance regression.

To me, the main issue with module properties (or module __getattr__)
is that you introduce another level of indirection on global
variable access.  Anywhere the module.__dict__ is used as the
globals for code execution, changing LOAD_NAME/LOAD_GLOBAL to have
another level of indirection is necessary.  That seems inescapable.

Introducing another special feature of modules to make this work is
not the solution, IMHO.  We should make module namespaces be more
like instance namespaces.  We already have a mechanism and it is
getattr on objects.


There is a difference between module namespaces and instance namespaces. 
LOAD_NAME/LOAD_GLOBAL fall back to builtins if the name is not found in 
the globals dictionary. Calling __getattr__() will slow down the access 
to builtins. And there is a recursion problem if module's __getattr__() 
uses builtins.
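For reference, the hook under discussion can be sketched like this (this idea later became PEP 562, so the snippet needs Python 3.7+; the module and attribute names are illustrative):

```python
import sys
import types

# Build a module object and give it a __getattr__ fallback.
mod = types.ModuleType("lazy_mod")

def _module_getattr(name):
    if name == "answer":
        return 42  # computed lazily, on the first attribute miss
    raise AttributeError(name)

mod.__getattr__ = _module_getattr
sys.modules["lazy_mod"] = mod

import lazy_mod
print(lazy_mod.answer)  # 42
```

Note that this only hooks attribute *misses* on the module object; it does nothing for LOAD_NAME/LOAD_GLOBAL inside the module itself, which is precisely the indirection problem raised above.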




Re: [Python-ideas] Make map() better

2017-09-13 Thread Nick Timkovich
On Wed, Sep 13, 2017 at 10:54 AM, Jason H  wrote:
>
> Thanks for the insights.
> I don't think it would be that breaking:
>
> def remap_map(a1, a2):
>     if callable(a1):
>         return map(a1, a2)
>     elif callable(a2):
>         return map(a2, a1)
>     else:
>         raise TypeError("neither argument is callable")
>

I think it's better to be parsimonious and adhere to the "there is one way
to do it" design principle. On the matter of style, map with a lambda is
more pleasing as `(expr-x for x in iterable)` rather than `map(lambda x:
expr-x, iterable)`. If you need to handle multiple iterables, they can be
zip'd.
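Concretely, the equivalences mentioned above look like this:

```python
data = [1, 2, 3]

# map() with a lambda vs. the usually preferred comprehension form:
assert list(map(lambda x: x * x, data)) == [x * x for x in data]

# Multiple iterables: map(f, a, b) pairs up with a zipped comprehension.
a, b = [1, 2, 3], [10, 20, 30]
assert list(map(lambda x, y: x + y, a, b)) == [x + y for x, y in zip(a, b)]
```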


> I'm rather surprised that there isn't an Iterable class which dict and list
> derive from.
> If that were added to just dict and list, I think it would cover 98% of
> cases, and adding Iterable would be reasonable in the remaining scenarios.


For checking, there's `collections.abc.Iterable` and neighbors that can
look at the interface easily, but I don't think the C-implemented, built-in
types spring from them.
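A quick check confirms this: the built-ins pass the ABC's interface test even though Iterable never appears in their MRO:

```python
from collections.abc import Iterable

# isinstance/issubclass work via the ABC's structural __subclasshook__...
assert isinstance([], Iterable)
assert isinstance({}, Iterable)
assert issubclass(list, Iterable)
assert not isinstance(7, Iterable)

# ...even though Iterable is not an actual base class of list:
assert Iterable not in list.__mro__
```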

Nick


Re: [Python-ideas] Make map() better

2017-09-13 Thread Jason H


> Sent: Wednesday, September 13, 2017 at 11:23 AM
> From: "Edward Minnix" 
> To: "Jason H" 
> Cc: Python-Ideas 
> Subject: Re: [Python-ideas] Make map() better
>
> While I agree that the method calling syntax is nicer, I disagree with 
> flipping the argument error for three main reasons.
> 
> First: it violates the signature entirely
> The signature to map is map(function, *iterables). Python’s map is more like 
> Haskell’s zipWith. Making the function last would either ruin the signature 
> or would slow down performance.
> 
> Second: currying
> If you ever curry a function in Python using functools.partial, having the 
> most common arguments first is crucial. (You’re more likely to apply the same 
> function to multiple iterables than to apply several functions on the same 
> exact iterable).
> 
> Thirdly: the change would make several functional programming packages have 
> incompatible APIs.
> Currently libraries like PyToolz/Cytoolz and funcy have APIs that require 
> function-first argument order. Changing the argument order would be 
> disruptive to most Python FP packages/frameworks.
> 
> So while I agree with you that “iterable.map(fn)” is more readable, I think 
> changing the argument order would be too much of a breaking change, and there 
> is no practical way to add “iterable.map(fn)” to every iterable type.


Thanks for the insights.
I don't think it would be that breaking:

def remap_map(a1, a2):
    if callable(a1):
        return map(a1, a2)
    elif callable(a2):
        return map(a2, a1)
    else:
        raise TypeError("neither argument is callable")


I'm rather surprised that there isn't an Iterable class which dict and list
derive from.
If that were added to just dict and list, I think it would cover 98% of cases, 
and adding Iterable would be reasonable in the remaining scenarios.




Re: [Python-ideas] Make map() better

2017-09-13 Thread Edward Minnix
While I agree that the method calling syntax is nicer, I disagree with flipping 
the argument error for three main reasons.

First: it violates the signature entirely
The signature to map is map(function, *iterables). Python’s map is more like 
Haskell’s zipWith. Making the function last would either ruin the signature or 
would slow down performance.

Second: currying
If you ever curry a function in Python using functools.partial, having the most 
common arguments first is crucial. (You’re more likely to apply the same 
function to multiple iterables than to apply several functions on the same 
exact iterable).
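A small illustration of the partial-application point:

```python
from functools import partial

# Function-first order lets partial() fix the function once and
# reuse it across many iterables:
double_all = partial(map, lambda x: 2 * x)

assert list(double_all([1, 2, 3])) == [2, 4, 6]
assert list(double_all(range(4))) == [0, 2, 4, 6]
```

With iterable-first order, partial() would instead freeze the *data*, which is the less common thing to want.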

Thirdly: the change would make several functional programming packages have 
incompatible APIs.
Currently libraries like PyToolz/Cytoolz and funcy have APIs that require 
function-first argument order. Changing the argument order would be disruptive 
to most Python FP packages/frameworks.

So while I agree with you that “iterable.map(fn)” is more readable, I think 
changing the argument order would be too much of a breaking change, and there 
is no practical way to add “iterable.map(fn)” to every iterable type.

- Ed

> On Sep 13, 2017, at 11:09, Jason H  wrote:
> 
> The format of map seems off. Coming from JS, all the functions come second. I 
> think this approach is superior.
> 
> Currently:
> map(lambda x: chr(ord('a')+x), range(26)) # ['a', 'b', 'c', 'd', 'e', 'f', 
> 'g', 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 
> 'v', 'w', 'x', 'y', 'z']
> 
> But I think this reads better:
> map(range(26), lambda x: chr(ord('a')+x))
> 
> Currently that results in:
> TypeError: argument 2 to map() must support iteration
> 
> Also, how are we to tell what supports map()?
> Any iterable should be able to map via:
> range(26).map(lambda x: chr(ord('a')+x)))
> 
> While the line length is the same, I think the latter is much more readable, 
> and the member method avoids parameter order confusion
> 
> For the global map(),
> having the iterable first also increases reliability because the lambda 
> function is highly variable in length, where as parameter names are generally 
> shorter than even the longest lambda expression.
> 
> More readable: IMHO:
> map(in, lambda x: chr(ord('a')+x))
> out = map(out, lambda x: chr(ord('a')+x))
> out = map(out, lambda x: chr(ord('a')+x))
> 
> Less readable (I have to parse the lambda):
> map(lambda x: chr(ord('a')+x), in)
> out = map(lambda x: chr(ord('a')+x), out)
> out = map(lambda x: chr(ord('a')+x), out)
> 
> But I contend:
> range(26).map(lambda x: chr(ord('a')+x)))
> is superior to all.



[Python-ideas] Make map() better

2017-09-13 Thread Jason H
The format of map() seems off. Coming from JS, where the function comes second, I
think that approach is superior.

Currently:
map(lambda x: chr(ord('a')+x), range(26)) # ['a', 'b', 'c', 'd', 'e', 'f', 'g', 
'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 
'x', 'y', 'z']

But I think this reads better:
map(range(26), lambda x: chr(ord('a')+x))

Currently that results in:
TypeError: argument 2 to map() must support iteration

Also, how are we to tell what supports map()?
Any iterable should be able to map via:
range(26).map(lambda x: chr(ord('a')+x)))

While the line length is the same, I think the latter is much more readable,
and the member method avoids parameter-order confusion.

For the global map(),
having the iterable first also increases reliability, because the lambda
function is highly variable in length, whereas parameter names are generally
shorter than even the longest lambda expression.

More readable, IMHO:
out = map(data, lambda x: chr(ord('a')+x))
out = map(out, lambda x: chr(ord('a')+x))
out = map(out, lambda x: chr(ord('a')+x))

Less readable (I have to parse the lambda):
out = map(lambda x: chr(ord('a')+x), data)
out = map(lambda x: chr(ord('a')+x), out)
out = map(lambda x: chr(ord('a')+x), out)

But I contend:
range(26).map(lambda x: chr(ord('a')+x)))
is superior to all.


Re: [Python-ideas] Hexadecimal floating literals

2017-09-13 Thread Thibault Hilaire
Hi everybody

> I chose it because it's easy to write. Maybe math.pi is a better example :-)
>> 
> math.pi.hex()
>> '0x1.921fb54442d18p+1'
> 
> 3.141592653589793 is four fewer characters to type, just as accurate, 
> and far more recognizable.

Of course, for a lot of numbers, the decimal representation is simpler, and
just as accurate as the radix-2 hexadecimal representation.
But, because floats are stored in radix 2, the radix-2 representation can be
much more compact.
In the "Handbook of Floating-Point Arithmetic" (J.-M. Muller et al., Birkhauser
editor, page 40), the authors claim that the largest exact decimal
representation of a double-precision floating-point number requires 767 digits!
So the decimal form is not always just a few characters while remaining just as accurate!
For example (this is the largest exact decimal representation of a 
single-precision 32-bit float):
> 1.1754942106924410754870294448492873488270524287458985717453057158887047561890426550235133618116378784179687e-38
and
> 0x1.fc000p-127
are exactly the same number (one in decimal representation, the other in 
radix-2 hexadecimal)!
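This exactness is easy to check today with the existing float.hex()/float.fromhex() round trip:

```python
import math

s = math.pi.hex()                   # '0x1.921fb54442d18p+1'
assert float.fromhex(s) == math.pi  # the hex round trip is always exact

# repr() also round-trips, but only because it picks the shortest
# decimal string that maps back to the same binary value:
x = float.fromhex('0x1.2492492492492p-3')
assert float(repr(x)) == x
```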


So, we have several alternatives:
- do nothing, and continue to use float.hex() and float.fromhex()
- support one of some of the following possibilities:

   a) support hexadecimal floating-point literals, as released in C++17
(I don't know if some other languages already support this)
   >>> x = 0x1.2492492492492p-3
   b) extend the float constructor to be able to build floats from hexadecimal strings
   >>> x = float('0x1.2492492492492p-3')
 I don't know if we should add a "base=None" or not 
   c) extend the string formatting with '%a' (as in C since C99) and '{:a}'
  >>> s = '%a' % (x,)
 Serhiy proposes to use '%x' and '{:x}', but I prefer to be consistent with C

From my point of view (my needs are maybe not very representative, as a computer
scientist working in computer arithmetic), full support for the radix-2
representation is required (it is sometimes easier/quicker to exchange data
between different programs in plain text, and radix-2 hexadecimal is the best
way to do it, because it is exact).
Also, supporting option a) will help me to generate Python code (from other Python
or C code) without building the float at runtime with fromhex(). My numbers
will be literals, not strings converted into floats!
Option c) will help me to print my data in the same way as in C, and be
consistent (same formatting character).
And option b) will just be here for consistency with the new hexadecimal literals...

Finally, I am now considering writing a PEP from Serhiy Storchaka's idea, but
if someone else wants to start it, I can help/contribute.

Thanks

Thibault




[Python-ideas] Move some regrtest or test.support features into unittest?

2017-09-13 Thread Victor Stinner
Hi,

tl; dr How can we extend unittest module to plug new checks
before/after running tests?


The CPython project has a big test suite in the Lib/test/ directory.
While all tests are written with the unittest module and the
unittest.TestCase class, tests are not run directly by unittest, but
run by "regrtest" (for "regression test") which is a test runner doing
more checks (and more).


I would like to see if and how we can integrate/move some regrtest
features into the unittest module. Example of regrtest features:

* skip a test if it allocates too much memory, with a command line argument
to specify how much memory a test is allowed to allocate (ex:
--memlimit=2G for 2 GB of memory)

* concept of "resource" like "network" (connect to external network
servers, to the Internet), "cpu" (CPU-intensive tests), etc. These tests are
skipped by default and enabled by the -u command line option (ex: "-u cpu").

* track memory leaks: check the reference counter, check the number of
allocated memory blocks, check the number of open file descriptors.

* detect if the test spawned a thread or process and the
thread/process is still running at the test exit

* --timeout: watchdog killing the test if the run time exceeds the
timeout in seconds (uses faulthandler.dump_traceback_later)

* multiprocessing: run tests in subprocesses, in parallel

* redirect stdout/stderr to pipes (StringIO objects), ignore them on
success, or dump them to stdout/stderr on test failure

* --slowest: top 10 of the slowest tests

* --randomize: randomize test order

* --match, --matchfile, -x: filter tests

* --forever: run the test in a loop until it fails (or is interrupted by CTRL+c)

* --list-tests / --list-cases: list test files / test methods

* --fail-env-changed: mark tests as failed if a test altered the environment

* detect if a "global variable" of the standard library was modified
but not restored by the test:

resources = ('sys.argv', 'cwd', 'sys.stdin', 'sys.stdout', 'sys.stderr',
 'os.environ', 'sys.path', 'sys.path_hooks', '__import__',
 'warnings.filters', 'asyncore.socket_map',
 'logging._handlers', 'logging._handlerList', 'sys.gettrace',
 'sys.warnoptions',
 'multiprocessing.process._dangling', 'threading._dangling',
 'sysconfig._CONFIG_VARS', 'sysconfig._INSTALL_SCHEMES',
 'files', 'locale', 'warnings.showwarning',
 'shutil_archive_formats', 'shutil_unpack_formats',
)

* test.bisect: bisection to identify the failing method, used to track
memory leaks or identify a test leaking a resource (ex: create a file
but don't remove it)

* ... : regrtest has many many features


My question is also connected to test.support
(Lib/test/support/__init__.py): a big module containing a lot of
helper functions to write tests and to detect bugs in tests. For
example, the @reap_children decorator emits a warning if the test leaks a
child process (and reads its exit status to prevent zombie processes).


I started to duplicate code in many files of Lib/test/test_*.py to
check if tests "leak running threads" ("dangling threads"). Example
from Lib/test/test_threading.py:

class BaseTestCase(unittest.TestCase):
def setUp(self):
self._threads = test.support.threading_setup()

def tearDown(self):
test.support.threading_cleanup(*self._threads)
test.support.reap_children()

I would like to get this check "for free" directly from the regular
unittest.TestCase class, but I don't know how to extend the unittest
module to do that.
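One way to sketch it today is a mixin class, though the point above is that unittest could provide such hooks out of the box (the class names here are hypothetical, not an existing API):

```python
import threading
import unittest

class ThreadCheckMixin:
    # Hypothetical sketch: fail any test that leaves extra threads running.
    def setUp(self):
        super().setUp()
        self._threads_before = set(threading.enumerate())

    def tearDown(self):
        leaked = set(threading.enumerate()) - self._threads_before
        super().tearDown()
        if leaked:
            self.fail("dangling threads: %r" % (leaked,))

class ExampleTest(ThreadCheckMixin, unittest.TestCase):
    def test_no_leak(self):
        self.assertEqual(1 + 1, 2)
```

The awkward part is that every TestCase subclass has to opt in via the mixin, which is exactly the duplication described above.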

Victor


Re: [Python-ideas] PEP 554: Stdlib Module to Support Multiple Interpreters in Python Code

2017-09-13 Thread Koos Zevenhoven
On Wed, Sep 13, 2017 at 6:14 AM, Nick Coghlan  wrote:

> On 13 September 2017 at 00:35, Koos Zevenhoven wrote:
>
> I don't see how the situation benefits from calling something the "main
> > interpreter". Subinterpreters can be a way to take something
> non-thread-safe
> > and make it thread-safe, because in an interpreter-per-thread scheme,
> most
> > of the state, like module globals, are thread-local. (Well, this doesn't
> > help for async concurrency, but anyway.)
>
> "The interpreter that runs __main__" is never going to go away as a
> concept for the regular CPython CLI.
>

It's still just *an* interpreter that happens to run __main__. And who says
it even needs to be the only one?


>
> Right now, its also a restriction even for applications like mod_wsgi,
> since the GIL state APIs always register C created threads with the
> main interpreter.
>
> >> That's OK - it just means we'll aim to make as many
> >> things as possible implicitly subinterpreter-friendly, and for
> >> everything else, we'll aim to minimise the adjustments needed to
> >> *make* things subinterpreter friendly.
> >>
> >
> > And that's exactly what I'm after here!
>
> No, you're after deliberately making the proposed API
> non-representative of how the reference implementation actually works
> because of a personal aesthetic preference rather than asking yourself
> what the practical benefit of hiding the existence of the main
> interpreter would be.
>
> The fact is that the main interpreter *is* special (just as the main
> thread is special), and your wishing that things were otherwise won't
> magically make it so.
>

​I'm not questioning whether the main interpreter is special, or whether
the interpreters may differ from each other. I'm questioning the whole
concept of "main interpreter". People should not care about which
interpreter is "the main ONE". They should care about what properties an
interpreter has. That's not aesthetics. Just look at, e.g. the
_decimal/_pydecimal examples in this thread.


> I'm mostly just worried about the `get_main()` function. Maybe it should
> be
> > called `asdfjaosjnoijb()`, so people wouldn't use it. Can't the first
> > running interpreter just introduce itself to its children? And if that's
> too
> > much to ask, maybe there could be a `get_parent()` function, which would
> > give you the interpreter that spawned the current subinterpreter.
>
> If the embedding application never calls
> "_Py_ConfigureMainInterpreter", then get_main() could conceivably
> return None. However, we don't expose that as a public API yet, so for
> the time being, Py_Initialize() will always call it, and hence there
> will always be a main interpreter (even in things like mod_wsgi).
>
>
You don't need to remove _Py_ConfigureMainInterpreter. Just make sure you
don't try to smuggle it into the status quo of the possibly upcoming new
stdlib module. Who knows what the function does anyway, let alone what it
might or might not do in the future.

Of course that doesn't mean that there couldn't be ways to configure an
interpreter, but coupling that with a concept of a "main interpreter", as
you suggest, doesn't seem to make any sense. And surely the code that
creates a new interpreter should know if it wants the new interpreter to
start with `__name__ == "__main__"` or `__name__ == "__just_any__"`, if
there is a choice.

––Koos


-- 
+ Koos Zevenhoven + http://twitter.com/k7hoven +


Re: [Python-ideas] PEP 563: Postponed Evaluation of Annotations, first draft

2017-09-13 Thread Ivan Levkivskyi
> The difference in allocated memory is over 22 MB.
> The import time with annotations is over 2s longer.
> The problem with those numbers that we still have 80% functions to cover.

This will not be a problem with PEP 560 (I could imagine that string objects
may take actually more memory than relatively small cached objects).

Also I think it makes sense to mention in the PEP that stringifying
annotations
does not solve _all_ problems with forward references. For example, two
typical
situations are:

  T  = TypeVar('T', bound='Factory')

  class Factory:
  def make_copy(self: T) -> T:
  ...

and

  class Vertex(List['Edge']):
  ...
  class Edge:
  ends: Tuple[Vertex, Vertex]

Actually both situations can be resolved with PEP 563 if one puts
`T` after `Factory`, and `Vertex` after `Edge`, the latter is OK, but
the former would be strange. After all, it is OK to pay a _little_ price
for Python being an interpreted language.
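For the second example, today's string forward references plus typing.get_type_hints() already resolve once both classes exist -- a small sketch (passing globalns explicitly so the lookup is self-contained):

```python
from typing import List, Tuple, get_type_hints

class Vertex(List['Edge']):        # 'Edge' is a forward reference
    pass

class Edge:
    ends: 'Tuple[Vertex, Vertex]'  # string annotation, resolved lazily

# Resolution happens on demand, after both classes are defined:
hints = get_type_hints(Edge, globalns=globals())
assert hints['ends'] == Tuple[Vertex, Vertex]
```

PEP 563 would make *all* annotations behave like the string form above, which is why only the base-class position (`List['Edge']`) still needs the explicit quoting.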

There are other situations discussed in
https://github.com/python/typing/issues/400, I don't want to copy all
of them to the PEP, but I think this prior discussion should be referenced
in the PEP.

> This is not a viable strategy since __future__ is not designed to be
> a feature toggle but rather to be a gradual introduction of an upcoming
> breaking change.

But how was it with `from __future__ import division`? What I was proposing
is something similar: just have `from __future__ import annotations` that will
become the default in Python 4. (Although this time it would be a good idea to emit
a DeprecationWarning one or two releases before Python 4.)
--
Ivan