Re: [Python-Dev] Possible rough edges in Python 3 metaclasses (was Re: Language reference updated for metaclasses)

2012-06-06 Thread Nick Coghlan
On Wed, Jun 6, 2012 at 1:28 AM, PJ Eby  wrote:
> To be clear, what I specifically proposed (as I mentioned in an earlier
> thread) was simply to patch __build_class__ in order to restore the missing
> __metaclass__ hook.  (Which, incidentally, would make ALL code using
> __metaclass__ cross-version compatible between 2.x and 3.x: a potentially
> valuable thing in and of itself!)
>
> As for metaclasses being hard to compose, PEP 422 is definitely a step in
> the right direction.  (Automatic metaclass combining is about the only thing
> that would improve it any further.)

Just as a warning, I doubt I'll be able to persuade enough people that
this is a feature worth including in the short time left before 3.3
feature freeze. It may end up being necessary to publish metaclass
and explicit decorator based variants (with their known limitations),
with a view to gaining support for inclusion in 3.4.

Alternatively, if people can supply examples of "post-creation
manipulation only" metaclasses that could be replaced with cleaner and
more composable dynamic decorator based solutions, that could help
make the PEP more compelling in the near term (perhaps compelling
enough to make it into 3.3).

Cheers,
Nick.

-- 
Nick Coghlan   |   [email protected]   |   Brisbane, Australia


Re: [Python-Dev] Open PEPs and large-scale changes for 3.3

2012-06-06 Thread Ben Finney
Ben Finney  writes:

> Georg Brandl  writes:
>
> > list of possible features for 3.3 as specified by PEP 398:
> >
> > Candidate PEPs:
> […]
>
> > * PEP 3143: Standard daemon process library
>
> Our porting work will not be done in time for Python 3.3. I will update
> this to target Python 3.4.

The PEP document currently says it targets “3.x”. I'll leave it in that
state until we're more confident that the current work will be on track
for a particular Python release.

Do I need to do anything in particular to be explicit that PEP 3143 is
not coming in Python 3.3?

-- 
 \“Human reason is snatching everything to itself, leaving |
  `\   nothing for faith.” —Bernard of Clairvaux, 1090–1153 CE |
_o__)  |
Ben Finney



Re: [Python-Dev] Possible rough edges in Python 3 metaclasses (was Re: Language reference updated for metaclasses)

2012-06-06 Thread Nick Coghlan
On Wed, Jun 6, 2012 at 5:09 PM, Nick Coghlan  wrote:
> On Wed, Jun 6, 2012 at 1:28 AM, PJ Eby  wrote:
>> To be clear, what I specifically proposed (as I mentioned in an earlier
>> thread) was simply to patch __build_class__ in order to restore the missing
>> __metaclass__ hook.  (Which, incidentally, would make ALL code using
>> __metaclass__ cross-version compatible between 2.x and 3.x: a potentially
>> valuable thing in and of itself!)
>>
>> As for metaclasses being hard to compose, PEP 422 is definitely a step in
>> the right direction.  (Automatic metaclass combining is about the only thing
>> that would improve it any further.)
>
> Just as a warning, I doubt I'll be able to persuade enough people that
> this is a feature worth including in the short time left before 3.3
> feature freeze. It may end up being necessary to publish metaclass
> and explicit decorator based variants (with their known limitations),
> with a view to gaining support for inclusion in 3.4.

Upgrading this warning to a fact: there's no way this topic can be
given the consideration it deserves in the space of the next three
weeks. I'll be changing the title of 422, spending more time discussing
the problem (rather than leaping to a conclusion), and retargeting the
PEP at 3.4.

If you do decide to play around with monkeypatching __build_class__,
please make clear to any users that it's a temporary fix until
something more robust and less implementation dependent can be devised
for 3.4.
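
As a rough illustration of the kind of monkeypatch being discussed (a
minimal, CPython-specific sketch only, not PJ's actual implementation;
a real backport would have to cope with zero-argument super() and other
details that re-creating the class this way breaks):

    import builtins

    _original_build_class = builtins.__build_class__

    def _build_class(func, name, *bases, **kwds):
        # Build the class normally first (an explicit 3.x 'metaclass'
        # keyword, if any, is forwarded untouched inside **kwds).
        cls = _original_build_class(func, name, *bases, **kwds)
        # If the class body defined a 2.x-style __metaclass__ hook and no
        # explicit metaclass was given, re-create the class through it.
        hook = vars(cls).get('__metaclass__')
        if hook is not None and 'metaclass' not in kwds:
            ns = {k: v for k, v in vars(cls).items()
                  if k not in ('__metaclass__', '__dict__', '__weakref__')}
            cls = hook(name, bases, ns)
        return cls

    builtins.__build_class__ = _build_class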

Cheers,
Nick.

-- 
Nick Coghlan   |   [email protected]   |   Brisbane, Australia


Re: [Python-Dev] TZ-aware local time

2012-06-06 Thread M.-A. Lemburg
Just to add my 2 cents to this discussion as someone who's worked
with mxDateTime for almost 15 years.

I think we all agree that users of an application want to input
date/time data using their local time (which may very well not be
the timezone of the system running the application). On output
they want to see their timezone as well, for obvious reasons.

Now, timezones are by nature not strictly defined: they have changed
often throughout history and, worse, there's no way to predict the
timezone details for the future. In many places around the world,
governments define the timezone data and keep changing aspects of it
every now and then: introducing daylight saving time, dropping it
again, removing timezones for their countries, adding new ones, or
simply shifting to a different time zone.

The only timezone data that's more or less defined is historic
timezone data, but even there, different sources can give different
data.

What does this mean for the application?

An application doesn't care about the timezone of a point in date/time.
It just wants a standard way to store the date/time and a reliable
way to work with it.

The most commonly used standard for this is the UTC standard and
so it's become good practice to convert all date/time values in
applications to UTC for storage, math and manipulation.

Just like with Unicode, the conversion to local time of the user
happens at the UI level. Conversion from input data to UTC is
easy, given the available C lib mechanisms (relying on the tz
database). Conversion from UTC to local time is more difficult,
but can also be done using the tz database.
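
(A present-day sketch of that strategy; zoneinfo here is just one way
to get at the tz database, and the same flow applies with pytz or
mxDateTime's timezone tools:)

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo   # stdlib tz database access, Python 3.9+

    # UI layer: the user enters a local wall-clock time in their own zone
    entered = datetime(2012, 6, 6, 14, 30, tzinfo=ZoneInfo("Europe/Berlin"))

    # Application layer: store and do all math in UTC only
    stored = entered.astimezone(timezone.utc)    # 2012-06-06 12:30:00+00:00

    # UI layer again: render in whatever zone the viewer wants to see
    shown = stored.astimezone(ZoneInfo("America/New_York"))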

The timezone information of the entered data or the user's
locale is usually available either through the environment,
a configuration file or a database storing the original
data - both on the input and on the output side. There's
no need to stick this information onto the basic data types,
since the application will already know it anyway.

For most use cases, this strategy works out really well.
There are some cases, though, where you do need to work with
local time instead of UTC.

One such case is the definition of relative date/time values,
another related one, the definition of repeating date/time
values.

These are often defined by users in terms of their local
time or relative to other timezones they intend to
travel to, so in order to convert the definitions back
to UTC you have to run part of the calculation in the
respective local time zone.
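
(A small sketch of why that matters: "every day at 09:00 Berlin time"
maps to different UTC instants across a DST switch, so the repetition
arithmetic has to be done in the local zone before converting; again,
zoneinfo is used purely for illustration:)

    from datetime import datetime, timedelta, timezone
    from zoneinfo import ZoneInfo

    berlin = ZoneInfo("Europe/Berlin")
    nine_am = datetime(2012, 3, 24, 9, 0, tzinfo=berlin)  # "every day at 09:00 local"
    for day in range(3):                                  # spans the 2012-03-25 DST switch
        print((nine_am + timedelta(days=day)).astimezone(timezone.utc))
    # 2012-03-24 08:00:00+00:00
    # 2012-03-25 07:00:00+00:00   <- the UTC instant shifts with local DST
    # 2012-03-26 07:00:00+00:00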

Repeating date/time values also tend to take other data
into account such as bank holidays, opening times, etc.
There's no end to making this more and more complicated :-)

However, these things are not in the realm of a basic
type anymore. They are application specific details.
As a result, it's better to provide tools to implement
all this, but not to try to force design decisions onto
the application writer (which will eventually get in
the way).

BTW: That's the main reason why I have so far refused to add
native timezone support to the mxDateTime data types and
instead let the applications decide on what's the best way
for their particular use case. mxDateTime does provide
extra tools for timezone support, but doesn't get in the
way. It has so far worked out really well.

-- 
Marc-Andre Lemburg
eGenix.com



[Python-Dev] Static type analysis

2012-06-06 Thread Edward K. Ream
Hello all,

I'm wondering whether this is the appropriate place to discuss
(global) static type analysis, a topic Guido mentioned around the 28
min mark in his PyCon 2012 keynote,
http://pyvideo.org/video/956/keynote-guido-van-rossum

This is a topic that has interested me for a long time, and it has
important implications for Leo.  Just now I have some free time to
devote to it.

Edward
--
Edward K. Ream email: [email protected]
Leo: http://webpages.charter.net/edreamleo/front.html
--


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Isaac Morland

On Wed, 6 Jun 2012, Nick Coghlan wrote:


2. Signature.bind introduces the ability to split the "bind arguments
to parameters" operation from the "call object" operation


Has anybody considered calling bind __call__?  That is, the result of 
calling the signature of a procedure instead of the procedure itself is 
the locals() dictionary the procedure would start with (except presumably 
missing non-parameter local variables).


Isaac Morland   CSCF Web Guru
DC 2554C, x36650WWW Software Specialist


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Yury Selivanov
On 2012-06-06, at 9:28 AM, Isaac Morland wrote:

> On Wed, 6 Jun 2012, Nick Coghlan wrote:
> 
>> 2. Signature.bind introduces the ability to split the "bind arguments
>> to parameters" operation from the "call object" operation
> 
> Has anybody considered calling bind __call__?  That is, the result of calling 
> the signature of a procedure instead of the procedure itself is the locals() 
> dictionary the procedure would start with (except presumably missing 
> non-parameter local variables).

I'd stick with the more explicit 'bind' method.

Compare (given the 'sig = signature(func)'):
   
ba = sig(*args, **kwargs)

to:

ba = sig.bind(*args, **kwargs)

The second case looks clearer to me.

Thanks,
-
Yury
  


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Yury Selivanov
On 2012-06-06, at 2:48 AM, Nick Coghlan wrote:
> However, looking at the code, I think the split that makes sense is
> for a lower level functools.signature to *only* support real function
> objects (i.e. not even method objects).
> 
> At the inspect layer, inspect.signature could then support retrieving
> a signature for an arbitrary callable roughly as follows:
> 
> def signature(f):
>     try:
>         # Real functions are handled directly by functools
>         return functools.signature(f)
>     except TypeError:
>         pass
>     # Not a function object, handle other kinds of callable
>     if isclass(f):
>         # Figure out a Signature based on f.__new__ and f.__init__
>         # Complain if the signatures are contradictory
>         # Account for the permissive behaviour of object.__new__
>         # and object.__init__
>         return class_signature
>     if ismethod(f):
>         f = f.__func__
>     elif not isfunction(f):
>         try:
>             f = f.__call__
>         except AttributeError:
>             pass
>     return signature(f)  # Use recursion for the initial implementation sketch
> 
> That code is almost certainly wrong, but it should be enough to give
> the general idea. The short version is:
> 
> 1. Provide a functools.signature that expects ordinary function
> objects (or objects with a __signature__ attribute)
> 2. Provide an inspect.signature that also handles other kinds of callable

I like the idea of making 'signature' capable of introspecting any callable, 
be it a class, an object with __call__, a method, or a function.

However, I don't think we should have two 'signature'-related mechanisms 
available in two separate modules.  This will inevitably raise questions
about which one to use, and which is used in some piece of code you're
staring at ;)

I agree that we shouldn't make 'functools' depend on the 'inspect' module.
Moreover, this is not even currently possible, as it creates an import loop
that is hard to break.  But how about the following:

1. Separate the 'Signature' object from the 'inspect' module and move it to a
private '_signature.py' (which will depend only on 'collections.OrderedDict',
'itertools.chain' and 'types')

2. Publish it in the 'inspect' module

3. Make the 'signature' function work with any callable

4. Make the 'Signature' class accept only functions

5. Import '_signature' in 'functools' and use the 'Signature' class
directly, since it accepts plain functions.

Would this work?

-
Yury



Re: [Python-Dev] Open PEPs and large-scale changes for 3.3

2012-06-06 Thread Barry Warsaw
On Jun 06, 2012, at 05:55 PM, Ben Finney wrote:

>The PEP document currently says it targets “3.x”. I'll leave it in that
>state until we're more confident that the current work will be on track
>for a particular Python release.
>
>Do I need to do anything in particular to be explicit that PEP 3143 is
>not coming in Python 3.3?

Nope, I think that's fine.

-Barry


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Steven D'Aprano

Brett Cannon wrote:


PEP: 362
Title: Function Signature Object
Version: $Revision$
Last-Modified: $Date$
Author: Brett Cannon , Jiwon Seo ,
Yury Selivanov , Larry Hastings <
[email protected]>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 21-Aug-2006
Python-Version: 3.3
Post-History: 04-Jun-2012


Abstract


Python has always supported powerful introspection capabilities,
including introspecting functions and methods.  (For the rest of
this PEP, "function" refers to both functions and methods).  By
examining a function object you can fully reconstruct the function's
signature.  Unfortunately this information is stored in an inconvenient
manner, and is spread across a half-dozen deeply nested attributes.

This PEP proposes a new representation for function signatures.
The new representation contains all necessary information about a function
and its parameters, and makes introspection easy and straightforward.


It's already easy and straightforward, thanks to the existing 
inspect.getfullargspec function. If this existing inspect function is lacking 
something, the PEP should explain what, and why the inspect function can't be 
fixed.




However, this object does not replace the existing function
metadata, which is used by Python itself to execute those
functions.  The new metadata object is intended solely to make
function introspection easier for Python programmers.


What happens when the existing function metadata and the __signature__ object 
disagree?


Are there use-cases where we want them to disagree, or is disagreement always 
a sign that something is broken?





Signature Object


A Signature object represents the overall signature of a function.
It stores a `Parameter object`_ for each parameter accepted by the
function, as well as information specific to the function itself.


There's a considerable amount of data recorded, including a number of mappings 
(dicts?). This potentially increases the size of functions, and the overhead of 
creating them. Since most functions are never introspected, or only rarely 
introspected, it seems rather wasteful to record all this data "just in case", 
particularly since it's already recorded once in the function metadata and/or 
code object.





A Signature object has the following public attributes and methods:

* name : str
Name of the function.


Functions already record their name (twice!), and it is simple enough to query 
func.__name__. What reason is there for recording it a third time, in the 
Signature object?


Besides, I don't consider the name of the function part of the function's 
signature. Functions can have multiple names, or no name at all, and the 
calling signature remains the same.


Even if we limit the discussion to distinct functions (rather than a single 
function with multiple names), I consider spam(x, y, z), ham(x, y, z) and 
eggs(x, y, z) to have the same signature. Otherwise, it makes it difficult to 
talk about one function having the same signature as another function, unless 
they also have the same name. Which would be unfortunate.




* qualname : str
Fully qualified name of the function.


What's the fully qualified name of the function, and why is it needed?



[...]

The structure of the Parameter object is:



* is_args : bool
True if the parameter accepts variable number of arguments
(``\*args``-like), else False.



"args" is just a common name for the parameter, not for the kind of parameter. 
*args (or *data, *whatever) is a varargs parameter, and so the attribute 
should be called "is_varargs".




* is_kwargs : bool
True if the parameter accepts variable number of keyword
arguments (``\*\*kwargs``-like), else False.


Likewise for **kwargs (or **kw, etc.) I'm not sure if there is a common 
convention for keyword varargs, so I see two options:


is_varkwargs
is_kwvarargs



* is_implemented : bool
True if the parameter is implemented for use.  Some platforms
implement functions but can't support specific parameters
(e.g. "mode" for os.mkdir).  Passing in an unimplemented
parameter may result in the parameter being ignored,
or in NotImplementedError being raised.  It is intended that
all conditions where ``is_implemented`` may be False be
thoroughly documented.


What to do about parameters which are partly implemented? E.g. mode='spam' is 
implemented but mode='ham' is not.


Is there a use-case for is_implemented?

[...]

Annotation Checker



def check_type(sig, arg_name, arg_type, arg_value):
# Internal function that incapsulates arguments type checking


/s/incapsulates/encapsulates




Open Issues
===


inspect.getfullargspec is currently unable to introspect builtin functions and 
methods. Should builtins gain a __signature__ so they can be introspected?





When to construct the Signature object?
---

The Signature object can either 

Re: [Python-Dev] Possible rough edges in Python 3 metaclasses (was Re: Language reference updated for metaclasses)

2012-06-06 Thread PJ Eby
On Wed, Jun 6, 2012 at 5:31 AM, Nick Coghlan  wrote:

> On Wed, Jun 6, 2012 at 5:09 PM, Nick Coghlan  wrote:
> > On Wed, Jun 6, 2012 at 1:28 AM, PJ Eby  wrote:
> >> To be clear, what I specifically proposed (as I mentioned in an earlier
> >> thread) was simply to patch __build_class__ in order to restore the
> >> missing __metaclass__ hook.  (Which, incidentally, would make ALL code
> >> using __metaclass__ cross-version compatible between 2.x and 3.x: a
> >> potentially valuable thing in and of itself!)
> >>
> >> As for metaclasses being hard to compose, PEP 422 is definitely a step
> >> in the right direction.  (Automatic metaclass combining is about the
> >> only thing that would improve it any further.)
> >
> > Just as a warning, I doubt I'll be able to persuade enough people that
> > this is a feature worth including in the short time left before 3.3
> > feature freeze. It may end up being necessary to publish metaclass
> > and explicit decorator based variants (with their known limitations),
> > with a view to gaining support for inclusion in 3.4.
>
> Upgrading this warning to a fact: there's no way this topic can be
> given the consideration it deserves in the space of the next three
> weeks. I'll be changing the title of 422, spend more time discussing
> the problem (rather than leaping to a conclusion) and retargeting the
> PEP at 3.4.
>
> If you do decide to play around with monkeypatching __build_class__,
> please make clear to any users that it's a temporary fix until
> something more robust and less implementation dependent can be devised
> for 3.4.
>

Ideally, I would actually implement it as a backport of the PEP...  in
which case I suppose making it part of the class creation machinery (vs.
embedding it in type.__call__ or some place like that) will make that
process easier.

Again, as I said earlier, I'm talking about this now because there was
related discussion now, not because I'm actively trying to port my
libraries.  At this point, I've only done a few "make this library usable
from 3.x as-is" changes by user request, for some of my smaller libraries
that were mostly there already (e.g. simplegeneric).


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Mark Shannon

Steven D'Aprano wrote:

[...]

There's a considerable amount of data recorded, including a number of
mappings (dicts?). This potentially increases the size of functions, and
the overhead of creating them. Since most functions are never
introspected, or only rarely introspected, it seems rather wasteful to
record all this data "just in case", particularly since it's already
recorded once in the function metadata and/or code object.
I agree with Steven. Don't forget that each list comprehension
evaluation involves creating a temporary function object.







Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Larry Hastings



On 06/06/2012 08:38 AM, Steven D'Aprano wrote:

What's the fully qualified name of the function, and why is it needed?


Please see PEP 3155.


"args" is just a common name for the parameter, not for the kind of 
parameter. *args (or *data, *whatever) is a varargs parameter, and so 
the attribute should be called "is_varargs".

[...]
Likewise for **kwargs (or **kw, etc.)


Yury will be pleased; those were his original names.  I argued for 
"is_args" and "is_kwargs".


I assert that "args" and "kwargs" are not merely "common name[s] for the 
parameter[s]", they are The Convention.  Any seasoned Python programmer 
examining a Signature object who sees "is_args" and "is_kwargs" will 
understand immediately what they are.  Jamming "var" in the middle of 
these names does not make their meaning any clearer--in fact I suggest 
it only detracts from readability.




Is there a use-case for is_implemented?


Yes, see issue 14626.


What happens when the existing function metadata and the __signature__ 
object disagree?


Are there use-cases where we want them to disagree, or is disagreement 
always a sign that something is broken?

[...]
"Changes to the Signature object, or to any of its data members,
do not affect the function itself."

which leaves the possibility that __signature__ may no longer match 
the actual argument spec, for some reason. If you remove 
getfullargspec, people will have to reinvent it to deal with such cases.


There's no reason why they should disagree.  The "some reason" would be 
if some doorknob decided to change it--the objects are mutable, because 
there's no good reason to make them immutable.


We just wanted to be explicit that information flows from the callable 
to the Signature and never the other way 'round.


As for "what would happen", nothing good.  My advice: don't change 
Signature objects for no reason.



//arry/


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Yury Selivanov
Steven,

On 2012-06-06, at 11:38 AM, Steven D'Aprano wrote:
> Brett Cannon wrote:
>> Python has always supported powerful introspection capabilities,
>> including introspecting functions and methods.  (For the rest of
>> this PEP, "function" refers to both functions and methods).  By
>> examining a function object you can fully reconstruct the function's
>> signature.  Unfortunately this information is stored in an inconvenient
>> manner, and is spread across a half-dozen deeply nested attributes.
>> This PEP proposes a new representation for function signatures.
>> The new representation contains all necessary information about a function
>> and its parameters, and makes introspection easy and straightforward.
> 
> It's already easy and straightforward, thanks to the existing 
> inspect.getfullargspec function. If this existing inspect function is lacking 
> something, the PEP should explain what, and why the inspect function can't be 
> fixed.

Well, the PEP addresses this question by saying: "Unfortunately this 
information is stored in an inconvenient manner, and is spread across 
a half-dozen deeply nested attributes."

And that's true.  As of now, 'inspect.getfullargspec' returns a named
tuple in which the parameters are spread across different tuple items.
Essentially, all 'inspect.getfullargspec' really does is pack the
function's and its code object's attributes into a tuple.
Compare that with the elegance the 'Signature' and 'Parameter' classes
provide.

Example:

  def foo(a, bar: int=1):
      pass

With signature:

  sig = signature(foo)
  print('bar: annotation', sig.parameters['bar'].annotation)
  print('bar: default', sig.parameters['bar'].default)

and with 'inspect':

  args_spec = inspect.getfullargspec(foo)
  print('bar: annotation', args_spec.annotations['bar'])
  print('bar: default', args_spec.defaults[0]) # <- '0'?

The Signature object is a much nicer API.  That becomes especially obvious
when you need to write more complicated stuff (see the PEP's examples).

>> However, this object does not replace the existing function
>> metadata, which is used by Python itself to execute those
>> functions.  The new metadata object is intended solely to make
>> function introspection easier for Python programmers.
> 
> What happens when the existing function metadata and the __signature__ object 
> disagree?

That's an issue to think about.  We may want to make 
'Signature.parameters' immutable by hiding it behind the newly added 
'types.MappingProxyType'.
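
(A tiny sketch of what that would look like; 'types.MappingProxyType'
is new in 3.3 and just wraps an existing mapping read-only:)

    import types
    from collections import OrderedDict

    parameters = OrderedDict([('a', 1), ('b', 2)])   # stand-in for Signature.parameters
    read_only = types.MappingProxyType(parameters)

    read_only['a']         # reads work as usual
    # read_only['a'] = 3   # raises TypeError: item assignment not supported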

But I don't foresee modifications of Signature objects being at all
frequent.  It may happen for a very specific reason in very specific
user code, and I see no point in forbidding it.

>> Signature Object
>> 
>> A Signature object represents the overall signature of a function.
>> It stores a `Parameter object`_ for each parameter accepted by the
>> function, as well as information specific to the function itself.
> 
> There's a considerable amount of data recorded, including a number of 
> mappings (dicts?). This potentially increase the size of functions, and the 
> overhead of creating them. Since most functions are never introspected, or 
> only rarely introspected, it seems rather wasteful to record all this data 
> "just in case", particularly since it's already recorded once in the function 
> metadata and/or code object.

The object is now created in a lazy manner, once requested.

>> A Signature object has the following public attributes and methods:
>> * name : str
>>Name of the function.
> 
> Functions already record their name (twice!), and it is simple enough to 
> query func.__name__. What reason is there for recording it a third time, in 
> the Signature object?

The Signature object holds a function's information and presents it in a
convenient manner.  It makes sense to store the function's name
together with the information about its parameters and return
annotation.

> Besides, I don't consider the name of the function part of the function's 
> signature. Functions can have multiple names, or no name at all, and the 
> calling signature remains the same.

It always has _one_ name it was defined with, unless it's
a lambda function.

> Even if we limit the discussion to distinct functions (rather than a single 
> function with multiple names), I consider spam(x, y, z) ham(x, y, z) and 
> eggs(x, y, z) to have the same signature. Otherwise, it makes it difficult to 
> talk about one function having the same signature as another function, unless 
> they also have the same name. Which would be unfortunate.

I see the point ;)  Let's see what other devs think.

>> * qualname : str
>>Fully qualified name of the function.
> 
> What's the fully qualified name of the function, and why is it needed?

See PEP 3155.

> [...]
>> The structure of the Parameter object is:
> 
>> * is_args : bool
>>True if the parameter accepts variable number of arguments
>>(``\*args``-like), else False.
> 
> 
> "args" i

Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Larry Hastings


On 06/06/2012 09:05 AM, Larry Hastings wrote:

Is there a use-case for is_implemented?


Yes, see issue 14626.


I should add, there are already some places in the standard library 
where is_implemented would be relevant.  The "mode" argument to os.mkdir 
comes immediately to mind; on Windows it is accepted but ignored.  A 
counter-example would be os.symlink, which takes an extra parameter on 
Windows that's  *not even accepted* on other platforms.


I am utterly convinced that, when faced with these sorts of 
platform-specific API differences, the first step towards sanity is to 
have the API accept a consistent signature everywhere.  What you do 
after that is up for debate--in most cases where the parameter causes a 
significant semantic change, I think specifying it with a non-default 
value should throw a NotImplementedError.  (With the specific case of 
os.mkdir on Windows, I can agree with silently ignoring the mode; it's 
not like the hapless Windows programmer could react and take a useful 
alternative approach.)


Exposing is_implemented on Parameter objects allows LBYL in these 
situations, rather than having to react to NotImplementedError.
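
(A rough sketch of that LBYL pattern, assuming builtins such as os.mkdir
grow signatures as discussed and that the draft Parameter API keeps the
is_implemented flag; none of this exists yet:)

    import os
    from inspect import signature   # assuming signature() ends up in inspect

    def make_private_dir(path):
        if signature(os.mkdir).parameters['mode'].is_implemented:
            os.mkdir(path, mode=0o700)     # look before you leap
        else:
            os.mkdir(path)                 # e.g. Windows: mode would be ignored anyway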



//arry/


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Eric Snow
On Wed, Jun 6, 2012 at 10:20 AM, Yury Selivanov  wrote:
> On 2012-06-06, at 11:38 AM, Steven D'Aprano wrote:
>> Functions already record their name (twice!), and it is simple enough to 
>> query func.__name__. What reason is there for recording it a third time, in 
>> the Signature object?
>
> Signature object holds function's information and presents it in a
> convenient manner.  It makes sense to store the function's name,
> together with the information about its parameters and return
> annotation.
>
>> Besides, I don't consider the name of the function part of the function's 
>> signature. Functions can have multiple names, or no name at all, and the 
>> calling signature remains the same.
>
> It always have _one_ name it was defined with, unless it's
> a lambda function.
>
>> Even if we limit the discussion to distinct functions (rather than a single 
>> function with multiple names), I consider spam(x, y, z) ham(x, y, z) and 
>> eggs(x, y, z) to have the same signature. Otherwise, it makes it difficult 
>> to talk about one function having the same signature as another function, 
>> unless they also have the same name. Which would be unfortunate.
>
> I see the point ;)  Let's see what other devs think.

I'm with Steven on this one.  What's the benefit to storing the name
or qualname on the signature object?  That ties the signature object
to a specific function.  If you access the signature object by
f.__signature__ then you already have f and its name.  If you get it
by calling signature(f), then you also have f and the name.  If you
are passing signature objects for some reason and there's a use case
for which the name/qualname matters, wouldn't it be better to pass the
functions around anyway?  What about when you create a signature
object on its own and you don't care about the name or qualname...why
should it need them?  Does Signature.bind() need them?

FWIW, I think this PEP is great and am ecstatic that someone is
pushing it forward.  :)

-eric


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Steven D'Aprano

Larry Hastings wrote:


[...]
"Changes to the Signature object, or to any of its data members,
do not affect the function itself."

which leaves the possibility that __signature__ may no longer match 
the actual argument spec, for some reason. If you remove 
getfullargspec, people will have to reinvent it to deal with such cases.


There's no reason why they should disagree.  The "some reason" would be 
if some doorknob decided to change it--the objects are mutable, because 
there's no good reason to make them immutable.


Nevertheless, the world is full of doorknobs, and people will have to deal 
with their code.


The case for deprecating getfullargspec is weak. The case for deprecating it 
*right now* is even weaker. Let's not rush to throw away working code.



--
Steven

___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Alexandre Zani
A question regarding the name. I have often seen the following pattern
in decorators:

def decor(f):
    def some_func(a, b):
        return f(a, b)    # do stuff using f
    some_func.__name__ = f.__name__
    return some_func

What are the name and fully qualified names in the signature for the
returned function? some_func.__name__ or f.__name__?

On Wed, Jun 6, 2012 at 10:02 AM, Eric Snow  wrote:
> On Wed, Jun 6, 2012 at 10:20 AM, Yury Selivanov  
> wrote:
>> On 2012-06-06, at 11:38 AM, Steven D'Aprano wrote:
>>> Functions already record their name (twice!), and it is simple enough to 
>>> query func.__name__. What reason is there for recording it a third time, in 
>>> the Signature object?
>>
>> Signature object holds function's information and presents it in a
>> convenient manner.  It makes sense to store the function's name,
>> together with the information about its parameters and return
>> annotation.
>>
>>> Besides, I don't consider the name of the function part of the function's 
>>> signature. Functions can have multiple names, or no name at all, and the 
>>> calling signature remains the same.
>>
>> It always have _one_ name it was defined with, unless it's
>> a lambda function.
>>
>>> Even if we limit the discussion to distinct functions (rather than a single 
>>> function with multiple names), I consider spam(x, y, z) ham(x, y, z) and 
>>> eggs(x, y, z) to have the same signature. Otherwise, it makes it difficult 
>>> to talk about one function having the same signature as another function, 
>>> unless they also have the same name. Which would be unfortunate.
>>
>> I see the point ;)  Let's see what other devs think.
>
> I'm with Steven on this one.  What's the benefit to storing the name
> or qualname on the signature object?  That ties the signature object
> to a specific function.  If you access the signature object by
> f.__signature__ then you already have f and its name.  If you get it
> by calling signature(f), then you also have f and the name.  If you
> are passing signature objects for some reason and there's a use case
> for which the name/qualname matters, wouldn't it be better to pass the
> functions around anyway?  What about when you create a signature
> object on its own and you don't care about the name or qualname...why
> should it need them?  Does Signature.bind() need them?
>
> FWIW, I think this PEP is great and am ecstatic that someone is
> pushing it forward.  :)
>
> -eric


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Yury Selivanov
Eric,

On 2012-06-06, at 1:02 PM, Eric Snow wrote:
> On Wed, Jun 6, 2012 at 10:20 AM, Yury Selivanov  
> wrote:
>> On 2012-06-06, at 11:38 AM, Steven D'Aprano wrote:
>>> Functions already record their name (twice!), and it is simple enough to 
>>> query func.__name__. What reason is there for recording it a third time, in 
>>> the Signature object?
>> 
>> Signature object holds function's information and presents it in a
>> convenient manner.  It makes sense to store the function's name,
>> together with the information about its parameters and return
>> annotation.
>> 
>>> Besides, I don't consider the name of the function part of the function's 
>>> signature. Functions can have multiple names, or no name at all, and the 
>>> calling signature remains the same.
>> 
>> It always have _one_ name it was defined with, unless it's
>> a lambda function.
>> 
>>> Even if we limit the discussion to distinct functions (rather than a single 
>>> function with multiple names), I consider spam(x, y, z) ham(x, y, z) and 
>>> eggs(x, y, z) to have the same signature. Otherwise, it makes it difficult 
>>> to talk about one function having the same signature as another function, 
>>> unless they also have the same name. Which would be unfortunate.
>> 
>> I see the point ;)  Let's see what other devs think.
> 
> I'm with Steven on this one.  What's the benefit to storing the name
> or qualname on the signature object?  That ties the signature object
> to a specific function.  If you access the signature object by
> f.__signature__ then you already have f and its name.  If you get it
> by calling signature(f), then you also have f and the name.  If you
> are passing signature objects for some reason and there's a use case
> for which the name/qualname matters, wouldn't it be better to pass the
> functions around anyway?  What about when you create a signature
> object on its own and you don't care about the name or qualname...why
> should it need them?  Does Signature.bind() need them?

Yes, 'Signature.bind' needs 'qualname' for error messages.  But it can be
stored as a private attribute.

I like the idea of 'foo(a)' and 'bar(a)' having identical signatures;
however, I don't think it's possible, i.e. we can't make
'signature(foo) is signature(bar)' hold.  We can implement the __eq__
operator, though.

For me, the signature of a function is not just a description of its 
parameters, so it seems practical to store its name too.
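
(Roughly something like the following sketch, comparing parameters and
return annotation while deliberately leaving the name out of it;
attribute names are taken from the current PEP draft:)

    class Signature:
        # ... parameters (an OrderedDict) and return_annotation as per the draft ...

        def __eq__(self, other):
            if not isinstance(other, Signature):
                return NotImplemented
            return (self.return_annotation == other.return_annotation and
                    list(self.parameters.items()) == list(other.parameters.items()))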

> FWIW, I think this PEP is great and am ecstatic that someone is
> pushing it forward.  :)


Thanks ;)

-
Yury


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Brett Cannon
On Wed, Jun 6, 2012 at 12:16 PM, Steven D'Aprano wrote:

> Larry Hastings wrote:
>
>  [...]
>>> "Changes to the Signature object, or to any of its data members,
>>> do not affect the function itself."
>>>
>>> which leaves the possibility that __signature__ may no longer match the
>>> actual argument spec, for some reason. If you remove getfullargspec, people
>>> will have to reinvent it to deal with such cases.
>>>
>>
>> There's no reason why they should disagree.  The "some reason" would be
>> if some doorknob decided to change it--the objects are mutable, because
>> there's no good reason to make them immutable.
>>
>
> Nevertheless, the world is full of doorknobs, and people will have to deal
> with their code.
>

This is also Python, the language that assumes everyone is a consenting
adult.


>
> The case for deprecating getfullargspec is weak. The case for deprecating
> it *right now* is even weaker. Let's not rush to throw away working code.
>
>
If people really want to keep getfullargspec() around then I want to at
least add a note to the function that signature objects exist as an
alternative (but not vice-versa). I personally still regret the
getopt/argparse situation and this feels like that on a smaller scale.

-Brett




Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Larry Hastings


On 06/06/2012 09:16 AM, Steven D'Aprano wrote:
Nevertheless, the world is full of doorknobs, and people will have to 
deal with their code.


I'm having a hard time seeing it.  Can you propose a credible situation 
where


 * some programmer would have a reason (even a bad reason) to modify
   the cached Signature for a function,
 * as a result it would no longer correctly match the corresponding
   function,
 * you would be forced to interact with this code and the modified
   Signature, and
 * it would cause you problems?

If you can, what adjustment would you make to the PEP to ameliorate this 
situation?


And, as Brett cites, the consenting adults rule applies here.


All of a sudden I'm thinking of "Olsen's Standard Book Of British Birds",


//arry/


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Yury Selivanov
On 2012-06-06, at 1:13 PM, Alexandre Zani wrote:
> A question regarding the name. I have often seen the following pattern
> in decorators:
> 
> def decor(f):
>def some_func(a,b):
>do_stuff using f
>some_func.__name__ = f.__name__
>return some_func
> 
> What are the name and fully qualified names in the signature for the
> returned function? some_func.__name__ or f.__name__?

Never copy attributes by hand; always use 'functools.wraps'.  It copies
'__name__', '__qualname__', and a bunch of other attributes to the wrapper
function.

We'll probably extend it to copy __signature__ too; then 'signature(decor(f))'
will be the same as 'signature(f)'.
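
For example, the decorator from your question, rewritten with
functools.wraps:

    import functools

    def decor(f):
        @functools.wraps(f)      # copies __name__, __qualname__, __doc__, __module__, ...
        def some_func(a, b):
            return f(a, b)       # do stuff using f
        return some_func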

-
Yury


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Daniel Urban
> BoundArguments Object
> =
>
> Result of a ``Signature.bind`` call.  Holds the mapping of arguments
> to the function's parameters.

The Signature.bind function has changed since the previous version of
the PEP. If I understand correctly, the 'arguments' attribute is the
same as the return value of bind in the previous version (more or less
the same as the return value of inspect.getcallargs). My question is:
why do we need the other two attributes ('args' and 'kwargs')? The
"Annotation Checker" example uses it to call the function. But if we
are able to call bind, we already have the arguments, so we can simply
call the function with them, we don't really need these attributes. I
think it would be better (easier to use and understand), if bind would
simply return the mapping, as in the previous version of the PEP.

> Has the following public attributes:
>
> * arguments : OrderedDict
>     An ordered mutable mapping of parameters' names to arguments' values.
>     Does not contain arguments' default values.

Does this mean that if the arguments passed to bind don't contain a
value for a parameter that has a default value, then the returned
mapping won't contain that argument? If so, why not?
inspect.getcallargs works fine with default values.
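
(To make the difference concrete; the second line assumes the draft
PEP 362 behaviour described above:)

    import inspect

    def f(a, b=10):
        pass

    inspect.getcallargs(f, 1)          # {'a': 1, 'b': 10}  <- default filled in
    # signature(f).bind(1).arguments   # OrderedDict([('a', 1)])  <- no entry for 'b'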


Daniel


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Larry Hastings



Sorry I missed answering these on my first pass.

On 06/06/2012 08:38 AM, Steven D'Aprano wrote:
What to do about parameters which are partly implemented? E.g. 
mode='spam' is implemented but mode='ham' is not.


Parameter objects aren't sophisticated enough to represent such a 
situation.  If you have a use case for a more sophisticated approach, 
and can propose a change to the Parameter object to handle it, I'd be 
interested to see it.


In truth, the way I currently support those "unimplemented" parameters 
is that passing in the default value is still permitted.  So in a way I 
suppose I already have this situation, kinda?  But is_implemented as it 
stands works fine for my use case.


inspect.getfullargspec is currently unable to introspect builtin 
functions and methods. Should builtins gain a __signature__ so they 
can be introspected?


If function signatures are useful, then they're useful, and the 
implementation language for the function is irrelevant.  I already sent 
Yury a patch adding __signature__ to PyCFunctionObject, which I thought 
he merged but I don't see it in his repo.


The problem (obviously) is generating the signature.  Brett has an idea 
about parsing the docstring; it strikes me as hackish.  I think solving 
the problem definitively will require a new argument parsing API and 
that's simply not happening for 3.3.


If my patch for issue 14626 and PEP 362 both land in 3.3, my plan is to 
hard-code the signatures for just those functions.



//arry/


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Yury Selivanov
Daniel,

On 2012-06-06, at 1:39 PM, Daniel Urban wrote:

>> BoundArguments Object
>> =
>> 
>> Result of a ``Signature.bind`` call.  Holds the mapping of arguments
>> to the function's parameters.
> 
> The Signature.bind function has changed since the previous version of
> the PEP. If I understand correctly, the 'arguments' attribute is the
> same as the return value of bind in the previous version (more or less
> the same as the return value of inspect.getcallargs). My question is:
> why we need the other 2 attributes ('args' and 'kwargs')? The
> "Annotation Checker" example uses it to call the function. But if we
> are able to call bind, we already have the arguments, so we can simply
> call the function with them, we don't really need these attributes. I
> think it would be better (easier to use and understand), if bind would
> simply return the mapping, as in the previous version of the PEP.

I'll try to answer you with the following code:

   >>> def foo(*args):
   ...     print(args)

   >>> bound_args = signature(foo).bind(1, 2, 3)
   >>> bound_args.arguments
   OrderedDict([('args', (1, 2, 3))])

You can't invoke 'foo' by:

   >>> foo(**bound_args.arguments)
   TypeError: foo() got an unexpected keyword argument 'args'

That's why we have two dynamic properties 'args', and 'kwargs':

   >>> bound_args.args, bound_args.kwargs
   ((1, 2, 3), {})

'BoundArguments.arguments', as described in the PEP, is a mapping to work
with 'Signature.parameters' (you should've seen it in the
"Annotation Checker" example).

'args' & 'kwargs' are for invocation.  You can even modify 'arguments'.
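
(I.e., given only the BoundArguments object, the original call can be
reproduced as:)

   >>> foo(*bound_args.args, **bound_args.kwargs)
   (1, 2, 3)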

>> Has the following public attributes:
>> 
>> * arguments : OrderedDict
>> An ordered mutable mapping of parameters' names to arguments' values.
>> Does not contain arguments' default values.
> 
> Does this mean, that if the arguments passed to bind doesn't contain a
> value for an argument that has a default value, then the returned
> mapping won't contain that argument? If so, why not?
> inspect.getcallargs works fine with default values.

Yes, it won't.  It contains only the arguments you've passed to 'bind'.
The reason is that we'd like to preserve as much of the actual call
context as possible.

If you pass some set of arguments to the bind() method, it maps
precisely that set.  This way you can deduce from the BoundArguments what
it was bound with.  And default values will be applied by Python itself.

-
Yury


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Daniel Urban
>>> BoundArguments Object
>>> =
>>>
>>> Result of a ``Signature.bind`` call.  Holds the mapping of arguments
>>> to the function's parameters.
>>
>> The Signature.bind function has changed since the previous version of
>> the PEP. If I understand correctly, the 'arguments' attribute is the
>> same as the return value of bind in the previous version (more or less
>> the same as the return value of inspect.getcallargs). My question is:
>> why we need the other 2 attributes ('args' and 'kwargs')? The
>> "Annotation Checker" example uses it to call the function. But if we
>> are able to call bind, we already have the arguments, so we can simply
>> call the function with them, we don't really need these attributes. I
>> think it would be better (easier to use and understand), if bind would
>> simply return the mapping, as in the previous version of the PEP.
>
> I'll try to answer you with the following code:
>
>   >>> def foo(*args):
>   ...    print(args)
>
>   >>> bound_args = signature(foo).bind(1, 2, 3)
>   >>> bound_args.arguments
>   OrderedDict([('args', (1, 2, 3))])
>
> You can't invoke 'foo' by:
>
>   >>> foo(**bound_args.arguments)
>   TypeError: foo() got an unexpected keyword argument 'args'

Of course, but you can invoke it with "1, 2, 3", the arguments you
used to create the BoundArguments instance in the first place: foo(1,
2, 3) will work fine.

> That's why we have two dynamic properties 'args', and 'kwargs':

Ok, but what I'm saying is, that we don't really need them.

>   >>> bound_args.args, bound_args.kwargs
>   ((1, 2, 3), {})
>
> 'BoundArguments.arguments', as told in the PEP, is a mapping to work
> with 'Signature.parameters' (you should've seen it in the
> "Annotation Checker" example).
>
> 'args' & 'kwargs' are for invocation.  You can even modify 'arguments'.
>
>>> Has the following public attributes:
>>>
>>> * arguments : OrderedDict
>>>     An ordered mutable mapping of parameters' names to arguments' values.
>>>     Does not contain arguments' default values.
>>
>> Does this mean, that if the arguments passed to bind doesn't contain a
>> value for an argument that has a default value, then the returned
>> mapping won't contain that argument? If so, why not?
>> inspect.getcallargs works fine with default values.
>
> Yes, it won't.  It contains only arguments you've passed to the 'bind'.
> The reason is because we'd like to save as much of actual context as
> possible.

I don't really see, where this "context" can be useful. Maybe an
example would help.

> If you pass some set of arguments to the bind() method, it tries to map
> precisely that set.  This way you can deduce from the BoundArguments what
> it was bound with.  And default values will applied by python itself.

Anyway, I think it would be nice to be able to obtain the full
arguments mapping that the function would see, not just a subset. Of
course, we can use inspect.getcallargs for that, but I think we should
be able to do that with the new Signature API.


Daniel
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Yury Selivanov
On 2012-06-06, at 2:22 PM, Daniel Urban wrote:
>> I'll try to answer you with the following code:
>> 
>>   >>> def foo(*args):
>>   ...    print(args)
>> 
>>   >>> bound_args = signature(foo).bind(1, 2, 3)
>>   >>> bound_args.arguments
>>   OrderedDict([('args', (1, 2, 3))])
>> 
>> You can't invoke 'foo' by:
>> 
>>   >>> foo(**bound_args.arguments)
>>   TypeError: foo() got an unexpected keyword argument 'args'
> 
> Of course, but you can invoke it with "1, 2, 3", the arguments you
> used to create the BoundArguments instance in the first place: foo(1,
> 2, 3) will work fine.

The whole point is to use BoundArguments mapping for invocation.
See Nick's idea to validate callbacks, and my response to him, below in
the thread.

>> That's why we have two dynamic properties 'args', and 'kwargs':
> 
> Ok, but what I'm saying is, that we don't really need them.

We need them.  Again, in some contexts you don't have the arguments
you've passed to bind().

>>   >>> bound_args.args, bound_args.kwargs
>>   ((1, 2, 3), {})
>> 
>> 'BoundArguments.arguments', as told in the PEP, is a mapping to work
>> with 'Signature.parameters' (you should've seen it in the
>> "Annotation Checker" example).
>> 
>> 'args' & 'kwargs' are for invocation.  You can even modify 'arguments'.
>> 
>>>> Has the following public attributes:
>>>>
>>>> * arguments : OrderedDict
>>>>     An ordered mutable mapping of parameters' names to arguments' values.
>>>>     Does not contain arguments' default values.
>>> 
>>> Does this mean, that if the arguments passed to bind doesn't contain a
>>> value for an argument that has a default value, then the returned
>>> mapping won't contain that argument? If so, why not?
>>> inspect.getcallargs works fine with default values.
>> 
>> Yes, it won't.  It contains only arguments you've passed to the 'bind'.
>> The reason is because we'd like to save as much of actual context as
>> possible.
> 
> I don't really see, where this "context" can be useful. Maybe an
> example would help.

For instance, for some sort of RPC mechanism, where you don't need to 
store/pass arguments that have default values.
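
A rough sketch of what I mean (the 'proxy' bits below are purely
hypothetical, they just show the shape of the idea):

   def remote_call(proxy, func, *args, **kwargs):
       # ship only what the caller actually supplied; the remote side
       # lets Python apply the default values at call time
       ba = signature(func).bind(*args, **kwargs)
       return proxy.send(func.__name__, dict(ba.arguments))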

>> If you pass some set of arguments to the bind() method, it tries to map
>> precisely that set.  This way you can deduce from the BoundArguments what
>> it was bound with.  And default values will applied by python itself.
> 
> Anyway, I think it would be nice to be able to obtain the full
> arguments mapping that the function would see, not just a subset. Of
> course, we can use inspect.getcallargs for that, but I think we should
> be able to do that with the new Signature API.


Well, it will take just a few lines of code to enrich BoundArguments with
default values (we can add a method to do it, if that feature is really
required).  But you won't ever be able to reconstruct what arguments the 
bind() method was called with, if we write default values to 'arguments' 
from the start.
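
Something along these lines would do (just a sketch; the exact spellings
'default' and 'empty' are assumptions about the final Parameter API):

   def apply_defaults(sig, bound):
       # fill in every parameter the caller did not supply explicitly
       for name, param in sig.parameters.items():
           if name not in bound.arguments and param.default is not param.empty:
               bound.arguments[name] = param.default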

Also, it's better for performance.  The "Annotation Checker" example 
validates defaults first, and never checks them again.  If bind() wrote
default values to 'BoundArguments.arguments', you would check
defaults on each call.  And think about more complicated cases, where
processing an argument's value is more involved and time consuming.

All in all, I consider the way 'inspect.getcallargs' treats defaults 
as a weakness, not as an advantage.

-
Yury
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] [Python-checkins] peps: PEP 422 rewrite to present an idea that a) isn't crazy and b) it turns out

2012-06-06 Thread Terry Reedy



On 6/6/2012 7:40 AM, nick.coghlan wrote:

   PEP 422 rewrite to present an idea that a) isn't crazy and

> b) it turns out Thomas Heller proposed back in 2001


+There is currently no corresponding mechanism in Python 3 that allows the
+code executed in the class body to directly influence how the class object
+is created. Instead, the class creation process is fully defined by the
+class header, before the class body even begins executing.


This makes the problem for porting code much clearer.

> +* If the metaclass hint refers to an instance of ``type``, then it is

/instance/subclass/? as in your class Metaclass(type) example in 
Alternatives?


> +  considered as a candidate metaclass along with the metaclasses of a



+Easier inheritance of definition time behaviour
+---
+
+Understanding Python's metaclass system requires a deep understanding of
+the type system and the class construction process. This is legitimately
+seen as confusing, due to the need to keep multiple moving parts (the code,


/confusing/challenging/

The challenge is inherent in the topic. Confusion is not, but is a sign 
of poor explication that needs improvement.



+the metaclass hint, the actual metaclass, the class object, instances of the
+class object) clearly distinct in your mind.


Your clear separation of 'metaclass hint' from 'actual metaclass' and 
enumeration of the multiple parts has reduced confusion, at least for 
me. But it remains challenging.



+Understanding the proposed class initialisation hook requires understanding
+decorators and ordinary method inheritance, which is a much simpler prospect.


/much// (in my opinion) In other words, don't underplay the alternative 
challenge ;-).



tjr

___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Daniel Urban
On Wed, Jun 6, 2012 at 8:35 PM, Yury Selivanov  wrote:
> On 2012-06-06, at 2:22 PM, Daniel Urban wrote:
>>> I'll try to answer you with the following code:
>>>
>>>   >>> def foo(*args):
>>>   ...    print(args)
>>>
>>>   >>> bound_args = signature(foo).bind(1, 2, 3)
>>>   >>> bound_args.arguments
>>>   OrderedDict([('args', (1, 2, 3))])
>>>
>>> You can't invoke 'foo' by:
>>>
>>>   >>> foo(**bound_args.arguments)
>>>   TypeError: foo() got an unexpected keyword argument 'args'
>>
>> Of course, but you can invoke it with "1, 2, 3", the arguments you
>> used to create the BoundArguments instance in the first place: foo(1,
>> 2, 3) will work fine.
>
> The whole point is to use BoundArguments mapping for invocation.
> See Nick's idea to validate callbacks, and my response to him, below in
> the thread.
>
>>> That's why we have two dynamic properties 'args', and 'kwargs':
>>
>> Ok, but what I'm saying is, that we don't really need them.
>
> We need them.  Again, in some contexts you don't have the arguments
> you've passed to bind().

But how could we *need* bind to return 'args' and 'kwargs' to us, when
we couldn't have called bind in the first place if we didn't have the
arguments?

>>>   >>> bound_args.args, bound_args.kwargs
>>>   ((1, 2, 3), {})
>>>
>>> 'BoundArguments.arguments', as told in the PEP, is a mapping to work
>>> with 'Signature.parameters' (you should've seen it in the
>>> "Annotation Checker" example).
>>>
>>> 'args' & 'kwargs' are for invocation.  You can even modify 'arguments'.
>>>
>>>>> Has the following public attributes:
>>>>>
>>>>> * arguments : OrderedDict
>>>>>     An ordered mutable mapping of parameters' names to arguments' values.
>>>>>     Does not contain arguments' default values.
>>>>
>>>> Does this mean, that if the arguments passed to bind doesn't contain a
>>>> value for an argument that has a default value, then the returned
>>>> mapping won't contain that argument? If so, why not?
>>>> inspect.getcallargs works fine with default values.
>>>
>>> Yes, it won't.  It contains only arguments you've passed to the 'bind'.
>>> The reason is because we'd like to save as much of actual context as
>>> possible.
>>
>> I don't really see, where this "context" can be useful. Maybe an
>> example would help.
>
> For instance, for some sort of RPC mechanism, where you don't need to
> store/pass arguments that have default values.

I see. So, basically, it's an optimization.

>>> If you pass some set of arguments to the bind() method, it tries to map
>>> precisely that set.  This way you can deduce from the BoundArguments what
>>> it was bound with.  And default values will applied by python itself.
>>
>> Anyway, I think it would be nice to be able to obtain the full
>> arguments mapping that the function would see, not just a subset. Of
>> course, we can use inspect.getcallargs for that, but I think we should
>> be able to do that with the new Signature API.
>
>
> Well, it will take just a few lines of code to enrich BoundArguments with
> default values (we can add a method to do it, if it's really that required
> feature).  But you won't be able to ever reconstruct what arguments the
> bind() method was called with, if we write default values to arguments
> from start.

As I've mentioned above, I don't think, we have to be able to
reconstruct the arguments passed to bind from the return value of
bind. If we will need the original arguments later/in another place,
we will just save them, bind doesn't have to complicate its API with
them.

> Also, it's better for performance.  "Annotation Checker" example does
> defaults validation first, and never checks them again.  If bind() would
> write default values to 'BoundArguments.arguments', you would check
> defaults on each call.  And think about more complicated cases, where
> processing of argument's value is more complicated and time consuming.

Ok, so again, it is an optimization.


Daniel
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Yury Selivanov
On 2012-06-06, at 3:33 PM, Daniel Urban wrote:
> On Wed, Jun 6, 2012 at 8:35 PM, Yury Selivanov  
> wrote:
>> On 2012-06-06, at 2:22 PM, Daniel Urban wrote:
>>>> I'll try to answer you with the following code:
>>>>
>>>>   >>> def foo(*args):
>>>>   ...    print(args)
>>>>
>>>>   >>> bound_args = signature(foo).bind(1, 2, 3)
>>>>   >>> bound_args.arguments
>>>>   OrderedDict([('args', (1, 2, 3))])
>>>>
>>>> You can't invoke 'foo' by:
>>>>
>>>>   >>> foo(**bound_args.arguments)
>>>>   TypeError: foo() got an unexpected keyword argument 'args'
>>> 
>>> Of course, but you can invoke it with "1, 2, 3", the arguments you
>>> used to create the BoundArguments instance in the first place: foo(1,
>>> 2, 3) will work fine.
>> 
>> The whole point is to use BoundArguments mapping for invocation.
>> See Nick's idea to validate callbacks, and my response to him, below in
>> the thread.
>> 
>>>> That's why we have two dynamic properties 'args', and 'kwargs':
>>> 
>>> Ok, but what I'm saying is, that we don't really need them.
>> 
>> We need them.  Again, in some contexts you don't have the arguments
>> you've passed to bind().
> 
> But how could we *need* bind to return 'args' and 'kwargs' to us, when
> we couldn't have called bind in the first place if we didn't have the
> arguments?

You're missing the point.  BoundArguments contains properly mapped
*args and **kwargs passed to Signature.bind.  You can validate them after,
do type casts, modify them, overwrite etc. by manipulating
'BoundArguments.arguments'.

At the end you can't, however, invoke the function by doing:

   func(**bound_arguments.arguments) # <- this won't work

as varargs will be screwed.

That's why you need 'args' & 'kwargs' properties on BoundArguments.

Imagine that the "Annotation Checker" example is modified to coerce all string
arguments to int (those annotated with 'int') and then to multiply
them by 42.

We'd write the following code:
 
   for arg_name, arg_value in bound_arguments.arguments.items():
       # I'm skipping is_args & is_kwargs checks, and assuming
       # we have annotations everywhere
       if sig.parameters[arg_name].annotation is int \
               and isinstance(arg_value, str):
           bound_arguments.arguments[arg_name] = int(arg_value) * 42

   return func(*bound_arguments.args, **bound_arguments.kwargs)

Thanks,

-
Yury
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Nick Coghlan
On Jun 7, 2012 3:11 AM, "Steven D'Aprano"  wrote:
>
> Larry Hastings wrote:
>
>>> [...]
>>> "Changes to the Signature object, or to any of its data members,
>>> do not affect the function itself."
>>>
>>> which leaves the possibility that __signature__ may no longer match the
>>> actual argument spec, for some reason. If you remove getfullargspec, people
>>> will have to reinvent it to deal with such cases.
>>
>>
>> There's no reason why they should disagree.  The "some reason" would be
>> if some doorknob decided to change it--the objects are mutable, because
>> there's no good reason to make them immutable.
>
>
> Nevertheless, the world is full of doorknobs, and people will have to
> deal with their code.

Speaking as a doorknob, I plan to use this PEP to allow wrapper functions
that accept arbitrary arguments to accurately report their signature as
matching the underlying function. It will also be useful for allowing
partial() objects (and other callables that tweak their API) to report an
accurate signature.

For example, I believe bound methods will misrepresent their signature with
the current PEP implementation - they actually should copy the function
signature and then drop the first positional parameter.

However, these use cases would be easier to handle correctly with an
explicit "copy()" method.

Also, +1 for keeping the lower level inspect functions around.

Cheers,
Nick.

--
Sent from my phone, thus the relative brevity :)
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Nick Coghlan
On Jun 7, 2012 12:20 AM, "Yury Selivanov"  wrote:

>
> I agree, that we shouldn't make 'functools' be dependent on 'inspect'
> module.
> Moreover, this is not even currently possible, as it creates an
> import-loop
> that is hard to untie.  But how about the following:
>
> 1. Separate 'Signature' object from 'inspect' module, and move it to a
> private '_signature.py' (that will depend only on
> 'collections.OrderedDict',
> 'itertools.chain' and 'types')
>
> 2. Publish it in the 'inspect' module
>
> 3. Make 'signature' method to work with any callable
>
> 4. Make 'Signature' class to accept only functions
>
> 5. Import '_signature' in the 'functools', and use 'Signature' class
> directly, as it will accept just plain functions.
>
> Would this work?

Sounds like a good plan to me.

Cheers,
Nick.

--
Sent from my phone, thus the relative brevity :)
>
> -
> Yury
>
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] [Python-checkins] peps: PEP 422 rewrite to present an idea that a) isn't crazy and b) it turns out

2012-06-06 Thread PJ Eby
+1 on the PEP.  FWIW, it may be useful to note that not only has the
pattern of having a class-level init been proposed before, it's actually
used: Zope has had __class_init__ and used it as a metaclass alternative
since well before Thomas Heller's proposal.

And in my case, about 80% of my non-dynamic metaclass needs are handled by
using a metaclass whose sole purpose is to provide me with __class_init__,
__class_new__, and __class_call__ methods so they can be defined as class
methods instead of as metaclass methods.  (Basically, it lets me avoid
making new metaclasses when I can just define __class_*__ methods instead.
The other use cases are all esoterica like object-relational mapping,
singletons and pseudo-singletons, etc.)
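
For the curious, the __class_init__ part of that metaclass boils down to
something like this (a from-memory sketch, not the actual Zope or PEAK code):

   class ClassInit(type):
       def __init__(cls, name, bases, ns):
           super().__init__(name, bases, ns)
           hook = ns.get('__class_init__')
           if hook is not None:
               hook(cls)    # defined in the class body as a plain function

   class Example(metaclass=ClassInit):
       def __class_init__(cls):
           cls.registry = {}    # runs once, right after Example is created

which is more or less what the PEP's class initialisation hook would give
you without writing a metaclass at all.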

So, the concept is a decades-plus proven alternative to metaclasses for
low-hanging metaclassy behavior.

This new version of the PEP does offer one challenge to my motivating use
case, though, and that's that hooking __init_class__ means any in-body
decorators have to occur *after* any __init_class__ definition, or silent
failure will occur.  (Because a later definition of __init_class__ will
overwrite the dynamically added version.)

While this challenge existed for the __metaclass__ hook, it was by
convention always placed at the top of the class, or very near to it.
After all, knowing what metaclass a class is, is pretty important, *and*
not done very often.  Likewise, had the previous version of the PEP been
used, it was unlikely that anybody would bury their __decorators__ list
near the end of the class! The __init_class__ method, on the other hand,
can quite rightly be considered a minor implementation detail internal to a
class that might reasonably be placed late in the class definition.

This is a relatively small apprehension, but it makes me *slightly* prefer
the previous version to this one, at least for my motivating use case.  But
I'll admit that this version might be better for Python-as-a-whole than the
previous version.  Among other things, it makes my "classy" metaclass (the
one that adds __class_init__, __class_call__, etc.) redundant for its most
common usage (__class_init__).

I'm tempted to suggest adding a __call_class__ to the mix, since in
grepping my code to check my less-esoteric metaclass use cases just now, I
find I implement __class_call__ methods almost as often as __class_init__
ones, but I suspect my use cases are atypical in this regard.  (It's mostly
used for things where you want to hook instance creation (caches,
singletons, persistence, O-R mapping) while still allowing subclasses to
define __new__ and/or __init__ without needing to integrate with the tricky
bits.)

(To be clear, by esoteric, I mean cases where I'm making classes that act
like non-class objects in some regard, like a class that acts as a mapping
or sequence of its instances.  If all you're doing is making a class with a
sprinkling of metaprogramming for improved DRYness, then __class_init__ and
__class_call__ are more than enough to do it, and a full metaclass is
overkill.)
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Static type analysis

2012-06-06 Thread Terry Reedy

On 6/6/2012 7:24 AM, Edward K. Ream wrote:

Hello all,

I'm wondering whether this is the appropriate place to discuss
(global) static type analysis, a topic Guido mentioned around the 28
min mark in his PyCon 2012 keynote,
http://pyvideo.org/video/956/keynote-guido-van-rossum


I think either python-list or python-ideas list would be more 
appropriate. Start with a proposal, statement, or question that others 
can respond to.



--
Terry Jan Reedy

___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] [Python-checkins] peps: PEP 422 rewrite to present an idea that a) isn't crazy and b) it turns out

2012-06-06 Thread Eric Snow
On Wed, Jun 6, 2012 at 5:40 AM, nick.coghlan  wrote:
> +
> +Alternatives
> +
> +

Would it be worth also (briefly) rehashing why the class instance
couldn't be created before the class body is executed*?  It might seem
like a viable alternative if you haven't looked at how classes get
created.

-eric


* i.e. meta.__new__() would have to be called before the class body is
executed for the class to exist during that execution.  Perhaps in an
alternate universe classes get created like modules do...
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Steven D'Aprano

Yury Selivanov wrote:


I like the idea of 'foo(a)' and 'bar(a)' having the identical signatures,
however, I don't think it's possible.  I.e. we can't make it that the
'signature(foo) is signature(bar)'.  We can implement the __eq__ operator
though.


+1 to __eq__.

I don't think we should care about them being identical. Object identity is 
almost always an implementation detail.
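
That is, assuming an __eq__ along the lines being discussed:

    >>> def foo(a): pass
    >>> def bar(a): pass
    >>> signature(foo) is signature(bar)   # identity: implementation detail
    False
    >>> signature(foo) == signature(bar)   # equality: what callers care about
    True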




--
Steven

___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Steven D'Aprano

Larry Hastings wrote:

inspect.getfullargspec is currently unable to introspect builtin 
functions and methods. Should builtins gain a __signature__ so they 
can be introspected?


If function signatures are useful, then they're useful, and the 
implementation language for the function is irrelevant.  I already sent 
Yuri a patch adding __signature__ to PyCFunctionObject, which I thought 
he merged but I don't see in his repo.


I would love to be able to inspect builtins for their signature. I have a 
class decorator that needs to know the signature of the class constructor, 
which unfortunately falls down for the simple cases of inheriting from a 
builtin without a custom __init__ or __new__.



--
Steven

___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Steven D'Aprano

Brett Cannon wrote:

On Wed, Jun 6, 2012 at 12:16 PM, Steven D'Aprano wrote:


Larry Hastings wrote:

 [...]

"Changes to the Signature object, or to any of its data members,
do not affect the function itself."

which leaves the possibility that __signature__ may no longer match the
actual argument spec, for some reason. If you remove getfullargspec, people
will have to reinvent it to deal with such cases.


There's no reason why they should disagree.  The "some reason" would be
if some doorknob decided to change it--the objects are mutable, because
there's no good reason to make them immutable.


Nevertheless, the world is full of doorknobs, and people will have to deal
with their code.



This is also Python, the language that assumes everyone is a consenting
adult.


Exactly, which is why I'm not asking for __signature__ to be immutable. Who 
knows, despite Larry's skepticism (and mine!), perhaps there is a use-case for 
__signature__ being modified that we haven't thought of yet.


But that's not really the point. It may be that nobody will be stupid enough 
to mangle __signature__, and inspect.getfullargspec becomes redundant. Even 
so, getfullargspec is not doing any harm. We're not *adding* complexity, it's 
already there, and breaking currently working code by deprecating and then 
removing it is not a step we should take lightly. API churn is itself a cost.




[...]

If people really want to keep getfullargspec() around then I want to at
least add a note to the function that signature objects exist as an
alternative (but not vice-versa).


+1



--
Steven

___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] [Python-checkins] peps: PEP 422 rewrite to present an idea that a) isn't crazy and b) it turns out

2012-06-06 Thread PJ Eby
On Wed, Jun 6, 2012 at 6:07 PM, Eric Snow wrote:

> On Wed, Jun 6, 2012 at 5:40 AM, nick.coghlan 
> wrote:
> > +
> > +Alternatives
> > +
> > +
>
> Would it be worth also (briefly) rehashing why the class instance
> couldn't be created before the class body is executed*?  It might seem
> like a viable alternative if you haven't looked at how classes get
> created.
>

Backwards compatibility is really the only reason.  So it'll have to wait
till Python 4000.  ;-)

(Still, that approach is in some ways actually better than the current
approach: you don't need a __prepare__, for example.  Actually, if one were
designing a class creation protocol from scratch today, it would probably
be simplest to borrow the __enter__/__exit__ protocol, with __enter__()
returning the namespace to be used for the suite body, and __exit__()
returning a finished class...  or something similar.  Python-ideas stuff,
to be sure, but it could likely be made a whole lot simpler than the
current multitude of hooks, counter-hooks, and extended hooks.)
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Terry Reedy

On 6/6/2012 6:38 PM, Steven D'Aprano wrote:


redundant. Even so, getfullargspec is not doing any harm. We're not
*adding* complexity, it's already there, and breaking currently working
code by deprecating and then removing it is not a step we should take
lightly. API churn is itself a cost.


The 3.x versions of idlelib.CallTips.get_argspec() use
   inspect.formatargspec(*inspect.getfullargspec(fob))
to create the first line of a calltip for Python coded callables 
(functions and (bound) instance methods, including class(.__init__), 
static and class methods and (with pending patch) instance(.__call__)).
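
For example, the kind of line it produces:

    >>> import inspect
    >>> def fob(a, b=1, *args, **kwds): pass
    >>> inspect.formatargspec(*inspect.getfullargspec(fob))
    '(a, b=1, *args, **kwds)'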


Any new class would have to have an identical formatter to replace this. 
I do not quite see the point of deprecating these functions. It seems to me 
that a new presentation object should build on top of the existing 
getfullargspec, when it is requested.


I agree with Steven that building the seldom-needed redundant 
representation upon creation of every function object is a bad idea.


--
Terry Jan Reedy

___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Ethan Furman

Yury Selivanov wrote:

We can implement the __eq__ operator though.


+1
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Cross-compiling python and PyQt

2012-06-06 Thread anatoly techtonik
On Wed, Jun 6, 2012 at 12:35 AM, Terry Reedy  wrote:
> On 6/5/2012 4:24 PM, Tarek Sheasha wrote:
>>
>> Hello,
>> I have been working for a long time on cross-compiling python for
>> android I have used projects like:
>> http://code.google.com/p/android-python27/
>>
>> I am stuck in a certain area, when I am cross-compiling python I would
>> like to install SIP and PyQt4 on the cross-compiled python, I have tried
>> all the possible ways I could think of but have had no success. So if
>> you can help me by giving me some guidelines on how to install
>> third-party software for cross-compiled python for android I would be
>> very helpful.
>
>
> This is off-topic for pydev list (which is for development *of* Python
> rather than development *with*). I suggest python-list (post in text only,
> please) or other lists for better help.

Yes. And try PySide - it's been ported to distutils, so if distutils
supports cross-compiling you may have better luck there.
--
anatoly t.
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Nick Coghlan
On Thu, Jun 7, 2012 at 8:38 AM, Steven D'Aprano  wrote:
> Brett Cannon wrote:
>> This is also Python, the language that assumes everyone is a consenting
>> adult.
>
>
> Exactly, which is why I'm not asking for __signature__ to be immutable. Who
> knows, despite Larry's skepticism (and mine!), perhaps there is a use-case
> for __signature__ being modified that we haven't thought of yet.
> But that's not really the point. It may be that nobody will be stupid enough
> to mangle __signature__, and inspect.getfullargspec becomes redundant.

I've presented use cases for doing this already. Please stop calling me stupid.

It will make sense to lie in __signature__ any time there are
constraints on a callable object that aren't accurately reflected in
its Python level signature. The simplest example I can think of is a
decorator that passes extra arguments in to the underlying function on
every call. For example, here's a more elegant alternative to the
default argument hack that relies on manipulating __signature__ to
avoid breaking introspection:

def shared_vars(*shared_args):
    """Decorator factory that defines shared variables that are
    passed to every invocation of the function"""
    def decorator(f):
        @functools.wraps(f) # Sets wrapper.__signature__ to Signature(f)
        def wrapper(*args, **kwds):
            full_args = shared_args + args
            return f(*full_args, **kwds)
        # When using this decorator, the public signature isn't the same as
        # that provided by the underlying function, as the first few positional
        # arguments are provided by the decorator
        sig = wrapper.__signature__
        for __ in shared_args:
            sig.popitem()
        return wrapper
    return decorator

@shared_vars({})
def example(_state, arg1, arg2, arg3):
    # _state is for private communication between "shared_vars" and the
    # function
    # callers can't set it, and never see it (unless they dig into
    # example.__wrapped__)
    ...

This has always been possible, but it's been a bad idea because of the
way it breaks pydoc (including help(example)) and other automatic
documentation tools. With a writable __signature__ attribute it
becomes possible to have our cake and eat it too.

>> If people really want to keep getfullargspec() around then I want to at
>> least add a note to the function that signature objects exist as an
>> alternative (but not vice-versa).
>
> +1

Also +1, since inspect.getfullargspec() and inspect.signature()
operate at different levels in order to answer different questions.
The former asks "what is the *actual* signature", while the latter
provides a way to ask "what is the *effective* signature".

That's why I see the PEP as more than just a way to more easily
introspect function signatures: the ability to set a __signature__
attribute and have the inspect module pay attention to it means it
becomes possible to cleanly advertise the signature of callables that
aren't actual functions, and *also* possible to derive a new signature
from an existing one, *without needing to care about the details of
that existing signature* (as in the example above, it's only necessary
to know how the signature will *change*).

Cheers,
Nick.

-- 
Nick Coghlan   |   [email protected]   |   Brisbane, Australia
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Eric Snow
On Wed, Jun 6, 2012 at 11:28 AM, Yury Selivanov  wrote:
> Never copy attributes by hand, always use 'functools.wraps'.  It copies
> '__name__', '__qualname__', and bunch of other attributes to the decorator
> object.
>
> We'll probably extend it to copy __signature__ too; then 'signature(decor(f))'
> will be the same as 'signature(f)'.

Having the signature object stored on a function is useful for the
cases like this, where the signature object is *explicitly* set to
differ from the function's actual signature.  That's a good reason to
have inspect.signature(f) look for f.__signature__ and use it if
available.

However, I'm not seeing how the other proposed purpose, as a cache for
inspect.signature(), is useful.  I'd expect that if someone wants a
function's signature, they'd call inspect.signature() for it.  If they
really need the speed of a cache then they can *explicitly* assign
__signature__.

Furthermore, using __signature__ as a cache may even cause problems.
If the Signature object is cached then any changes to the function
will not be reflected in the Signature object.  Certainly that's an
unlikely case, but it is a real case. If f.__signature__ is set, I'd
expect it to be either an explicitly set value or exactly the same as
the first time inspect.signature() was called for that function.  We
could make promises about that and do dynamic updates, etc., but it's
not useful enough to go to the trouble.  And without the guarantees, I
don't think using it as a cache is a good idea.  (And like I said,
allowing/using an explicitly set f.__signature__ is a good thing).
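
A contrived sketch of the problem (supposing signature() cached its
result on f.__signature__ as a side effect):

    def f(a, b=1): pass

    sig = inspect.signature(f)   # suppose this call also cached f.__signature__
    f.__defaults__ = (42,)       # the function object is mutated later...
    # ...but a second inspect.signature(f) would hand back the stale cached
    # Signature, still describing b's default as 1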

-eric
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Nick Coghlan
On Thu, Jun 7, 2012 at 10:52 AM, Eric Snow  wrote:
> Furthermore, using __signature__ as a cache may even cause problems.
> If the Signature object is cached then any changes to the function
> will not be reflected in the Signature object.  Certainly that's an
> unlikely case, but it is a real case. If f.__signature__ is set, I'd
> expect it to be either an explicitly set value or exactly the same as
> the first time inspect.signature() was called for that function.  We
> could make promises about that and do dynamic updates, etc., but it's
> not useful enough to go to the trouble.  And without the guarantees, I
> don't think using it as a cache is a good idea.  (And like I said,
> allowing/using an explicitly set f.__signature__ is a good thing).

+1

Providing a defined mechanism to declare a public signature is good,
but using that mechanism for implicit caching seems like a
questionable idea. Even when it *is* cached, I'd be happier if
inspect.signature() returned a copy rather than a direct reference to
the original.

Cheers,
Nick.

-- 
Nick Coghlan   |   [email protected]   |   Brisbane, Australia
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Yury Selivanov
On 2012-06-06, at 9:00 PM, Nick Coghlan wrote:
> On Thu, Jun 7, 2012 at 10:52 AM, Eric Snow  
> wrote:
>> Furthermore, using __signature__ as a cache may even cause problems.
>> If the Signature object is cached then any changes to the function
>> will not be reflected in the Signature object.  Certainly that's an
>> unlikely case, but it is a real case. If f.__signature__ is set, I'd
>> expect it to be either an explicitly set value or exactly the same as
>> the first time inspect.signature() was called for that function.  We
>> could make promises about that and do dynamic updates, etc., but it's
>> not useful enough to go to the trouble.  And without the guarantees, I
>> don't think using it as a cache is a good idea.  (And like I said,
>> allowing/using an explicitly set f.__signature__ is a good thing).
> 
> +1
> 
> Providing a defined mechanism to declare a public signature is good,
> but using that mechanism for implicit caching seems like a
> questionable idea. Even when it *is* cached, I'd be happier if
> inspect.signature() returned a copy rather than a direct reference to
> the original.

I'm leaning towards this too.  Besides, constructing a Signature
object isn't an expensive operation.

So, the idea for the 'signature(obj)' function is to first check if
'obj' has '__signature__' attribute set, if yes - return it, if no - 
create a new one (but don't cache).

I have a question about fixing 'functools.wraps()' - I'm not sure
we need to.  I see two solutions to the problem:

I) We fix 'functools.wraps' to do:

   'wrapper.__signature__ = signature(wrapped)'

II) We modify 'signature(obj)' function to do the following steps:

   1. check if obj has '__signature__' attribute. If yes - return it.

   2. check if obj has '__wrapped__' attribute.  If yes:
   obj = obj.__wrapped__; goto 1.

   3. Calculate new signature for obj and return it.

I think that the second (II) approach is better, as we don't
implicitly cache anything, and we don't calculate Signatures
on each 'functools.wraps' call.
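
In other words, something like this (a sketch; '_build_signature' is just
a placeholder for "compute a fresh Signature from scratch"):

    def signature(obj):
        while True:
            sig = getattr(obj, '__signature__', None)
            if sig is not None:
                return sig
            wrapped = getattr(obj, '__wrapped__', None)
            if wrapped is None:
                return _build_signature(obj)
            obj = wrapped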

-
Yury
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Daniel Urban
On Wed, Jun 6, 2012 at 10:10 PM, Yury Selivanov  wrote:
> On 2012-06-06, at 3:33 PM, Daniel Urban wrote:
>> On Wed, Jun 6, 2012 at 8:35 PM, Yury Selivanov  
>> wrote:
>>> On 2012-06-06, at 2:22 PM, Daniel Urban wrote:
>>>>> I'll try to answer you with the following code:
>>>>>
>>>>>   >>> def foo(*args):
>>>>>   ...    print(args)
>>>>>
>>>>>   >>> bound_args = signature(foo).bind(1, 2, 3)
>>>>>   >>> bound_args.arguments
>>>>>   OrderedDict([('args', (1, 2, 3))])
>>>>>
>>>>> You can't invoke 'foo' by:
>>>>>
>>>>>   >>> foo(**bound_args.arguments)
>>>>>   TypeError: foo() got an unexpected keyword argument 'args'
>>>>
>>>> Of course, but you can invoke it with "1, 2, 3", the arguments you
>>>> used to create the BoundArguments instance in the first place: foo(1,
>>>> 2, 3) will work fine.
>>>
>>> The whole point is to use BoundArguments mapping for invocation.
>>> See Nick's idea to validate callbacks, and my response to him, below in
>>> the thread.
>>>
>>>>> That's why we have two dynamic properties 'args', and 'kwargs':
>>>>
>>>> Ok, but what I'm saying is, that we don't really need them.
>>>
>>> We need them.  Again, in some contexts you don't have the arguments
>>> you've passed to bind().
>>
>> But how could we *need* bind to return 'args' and 'kwargs' to us, when
>> we couldn't have called bind in the first place if we didn't have the
>> arguments?
>
> You're missing the point.  BoundArguments contains properly mapped
> *args and **kwargs passed to Signature.bind.  You can validate them after,
> do type casts, modify them, overwrite etc. by manipulating
> 'BoundArguments.arguments'.
>
> At the end you can't, however, invoke the function by doing:
>
>   func(**bound_arguments.arguments) # <- this won't work
>
> as varargs will be screwed.
>
> That's why you need 'args' & 'kwargs' properties on BoundArguments.
>
> Imagine, that "Annotation Checker" example is modified to coerce all string
> arguments to int (those that had 'int' in annotation) and then to multiply
> them by 42.
>
> We'd write the following code:
>
>   for arg_name, arg_value in bound_arguments.arguments.items():
>      # I'm skipping is_args & is_kwargs checks, and assuming
>      # we have annotations everywhere
>      if sig.parameters[arg_name].annotation is int \
>                                and isinstance(arg_value, str):
>          bound_arguments.arguments[arg_name] = int(arg_value) * 42
>
>   return func(*bound_arguments.args, **bound_arguments.kwargs)

I see. Thanks, this modifying example is the first convincing use case
I've heard. Maybe it would be good to mention something like this in the
PEP.


Daniel
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] Updated PEP 362 (Function Signature Object)

2012-06-06 Thread Nick Coghlan
On Thu, Jun 7, 2012 at 11:16 AM, Yury Selivanov  wrote:
> On 2012-06-06, at 9:00 PM, Nick Coghlan wrote:
> So, the idea for the 'signature(obj)' function is to first check if
> 'obj' has '__signature__' attribute set, if yes - return it, if no -
> create a new one (but don't cache).

I'd say return a copy in the first case to be safe against accidental
modification. If someone actually wants in-place modification, they
can access __signature__ directly.

> I have a question about fixing 'functools.wraps()' - I'm not sure
> we need to.  I see two solutions to the problem:
>
> I) We fix 'functools.wraps' to do:
>
>   'wrapper.__signature__ = signature(wrapped)'
>
> II) We modify 'signature(obj)' function to do the following steps:
>
>   1. check if obj has '__signature__' attribute. If yes - return it.
>
>   2. check if obj has '__wrapped__' attribute.  If yes:
>   obj = obj.__wrapped__; goto 1.
>
>   3. Calculate new signature for obj and return it.
>
> I think that the second (II) approach is better, as we don't
> implicitly cache anything, and we don't calculate Signatures
> on each 'functools.wraps' call.

Oh, nice, I like it. Then the wrapped function only gets its own
signature attribute if it's actually being changed by one of the
wrappers and my example would become:

    def shared_vars(*shared_args):
        """Decorator factory that defines shared variables that are
        passed to every invocation of the function"""
        def decorator(f):
            @functools.wraps(f)
            def wrapper(*args, **kwds):
                full_args = shared_args + args
                return f(*full_args, **kwds)
            # Override signature
            sig = wrapper.__signature__ = inspect.signature(f)
            for __ in shared_args:
                sig.popitem()
            return wrapper
        return decorator

    @shared_vars({})
    def example(_state, arg1, arg2, arg3):
        # _state is for private communication between "shared_vars" and the
        # function
        # callers can't set it, and never see it (unless they dig into
        # example.__wrapped__)
        ...

Bonus: without implicit signature copying in functools, you can stick
with the plan of exposing everything via the inspect module.

We should still look into making whatever tweaks are needed to let
inspect.signature correctly handle functools.partial objects, though.
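
Something along these lines, perhaps (a sketch that ignores keyword
arguments supplied to the partial, and assumes a 'replace'-style way of
building a modified copy):

    def partial_signature(p):
        # drop the positional parameters the partial already supplies
        sig = inspect.signature(p.func)
        params = list(sig.parameters.values())[len(p.args):]
        return sig.replace(parameters=params)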

Cheers,
Nick.

-- 
Nick Coghlan   |   [email protected]   |   Brisbane, Australia
___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com