Re: [Python-Dev] Surely "nullable" is a reasonable name?

2015-04-19 Thread Larry Hastings



On 08/07/2014 09:41 PM, Larry Hastings wrote:
Well!  It's rare that the core dev community is so consistent in its 
opinion.  I still think "nullable" is totally appropriate, but I'll 
change it to "allow_none".


(reviving eight-month-old thread)

In case anybody here is still interested in arguing about this: the 
Clinic API may be shifting a bit here.  What follows is a quick 
refresher course on Argument Clinic, followed by a discussion of the 
proposed new API.


Here's an Argument Clinic declaration of a parameter:
s: str()
The parameter is called "s", and it's specifying a converter function 
called "str" which handles converting string parameters. The str() 
converter itself accepts parameters; since the parameters all have 
default values, they're all optional.  By default, str() maps directly 
to the "s" format unit for PyArg_ParseTuple(), as it does here.


Currently str() (and a couple other converter functions) accepts a 
parameter called "types".  "types" is specified as a string, and 
contains an unordered set of whitespace-separated strings representing 
the Python types of the values this (Clinic) parameter should accept.  
The default value of "types" for str() is "str"; the following 
declaration is equivalent to the declaration above:

s: str(types="str")
Other legal values for the "types" parameter for the str converter 
include "bytes bytearray str" and "robuffer str".  Internally the types 
parameter is converted into a set of strings; passing it in as a string 
is a nicety for the caller's benefit.  (It also means that the strings 
"robuffer str" and "str robuffer" are considered equivalent.)


There's a second parameter, currently called "nullable", but I was 
supposed to rename it "allow_none", so I'll use that name here.  If you 
pass in "allow_none=True" to a converter, it means "this (Clinic) 
parameter should accept the Python value None".  So, to map to the 
format unit "z", you would specify:

  s: str(allow_none=True)

And to map to the format unit "z#", you would specify:
  s: str(types="robuffer str", allow_none=True, length=True)
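
To make the pairing concrete, here is a purely illustrative lookup (not
Clinic's real table) covering just the three str() spellings mentioned so
far and the format units they map to:

    STR_FORMAT_UNITS = {
        # (types, allow_none, length) -> PyArg_ParseTuple format unit
        (frozenset({"str"}), False, False): "s",
        (frozenset({"str"}), True, False): "z",
        (frozenset({"robuffer", "str"}), True, True): "z#",
    }

    def str_format_unit(types="str", allow_none=False, length=False):
        return STR_FORMAT_UNITS[(frozenset(types.split()), allow_none, length)]

    assert str_format_unit() == "s"
    assert str_format_unit(allow_none=True) == "z"
    assert str_format_unit("robuffer str", allow_none=True, length=True) == "z#"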


In hindsight this is all a bit silly.  I propose what I think is a much 
better API below.


We should rename "types" to "accept".  "accept" should take a set of
types; these types specify the types of Python objects the Clinic 
parameter should accept.  For the funny pseudo-types needed in some 
Clinic declarations ("buffer", "robuffer", and "rwbuffer"), Clinic 
provides empty class declarations so these behave like types too.


accept={str} is the default for the str() converter.  If you want to map 
to format unit "z", you would write this:

s: str(accept={str, NoneType})
(In case you haven't seen it before: NoneType = type(None).  I don't 
think the name is registered anywhere officially in the standard 
library... but that's the name.)
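
If you want to see it for yourself, the name is two lines of plain Python
away; the accept={str, NoneType} spelling above assumes Clinic makes the
name available in its own namespace:

    NoneType = type(None)
    assert isinstance(None, NoneType)
    assert NoneType.__name__ == "NoneType"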


The upside of this approach:

 * Way, way more obvious to the casual reader.  "types" was always
   meant as an unordered collection of types, but I felt specifying it
   with strings was unwieldy and made for poor reading ({'str',
   'robuffer'}).  Passing it in as a single string which I internally
   split and put in a set() was a bad compromise.  But the semantics of
   this whitespace-delimited string were a bit unclear, even to the
   experienced Clinic hacker.  This set-of-types version maps exactly
   to what the parameter was always meant to accept in the first
   place.  As with any other code, people will read Clinic declarations
   far, far more often than they will write them, so optimizing for
   clarity is paramount.
 * Zen: "There should be one (and preferably only one) obvious way to
   do it." We have a way of specifying the types this parameter should
   accept; "allow_none" adds a second.
 * Zen: "Special cases aren't special enough to break the rules". 
   "allow_none" was really just a special case of one possible type for

   "types".


The downside of this approach:

 * You have to know what the default accept= set is for each
   converter.  Luckily this is not onerous; there are only four
   converters that need an "accept" parameter, and their default values
   are all simple:

int(accept={int})
str(accept={str})
Py_UNICODE(accept={str})
Py_buffer(accept={buffer})

I suggest this is only a (minor) problem when writing a Clinic 
declaration.  It doesn't affect later readability, which is much more 
important.


 * It means repeating yourself a little.  If you just want to say "I
   want to accept None too", you have to redundantly specify the
   default type(s) accepted by the converter function.  In practice,
   it's really only redundant for four or five format units, and
   they're not the frequently-used ones.  Right now I only see three
   uses of nullable for the built-in format units (there are two more
   for my path_converter) and they're all for the str converter.

   Yes, we could create a set containing the defau

[Python-Dev] [Issue 22619] Patch needs a review

2015-04-19 Thread Riley Banks
Greetings.

Can someone review Serhiy's patch for the following issue?

https://bugs.python.org/issue22619

I see Dmitry pinged the issue about two months ago, and again a month later...


Re: [Python-Dev] Surely "nullable" is a reasonable name?

2015-04-19 Thread Glenn Linderman

On 4/19/2015 1:19 AM, Larry Hastings wrote:



On 08/07/2014 09:41 PM, Larry Hastings wrote:
Well!  It's rare that the core dev community is so consistent in its 
opinion.  I still think "nullable" is totally appropriate, but I'll 
change it to "allow_none".


(reviving eight-month-old thread)



  * Zen: "There should be one (and preferably only one) obvious way to
do it." We have a way of specifying the types this parameter
should accept; "allow_none" adds a second.
  * Zen: "Special cases aren't special enough to break the rules". 
"allow_none" was really just a special case of one possible type

for "types".




Is argument clinic a special case of type annotations?  (Quoted and 
worded to be provocative, intentionally but not maliciously.)


OK, I know that argument clinic applies to C code and I know that type 
annotations apply to Python code. And I know that C code is a lot more 
restrictive /a priori/ which clinic has to accommodate, and type 
annotations are a way of adding (unenforced) restrictions on Python 
code.  Still, from a 50,000' view, there seems to be an overlap in 
functionality... and both are aimed at Py 3.5... I find that 
interesting... I guess describing parameter types is the latest Python 
trend :)


Re: [Python-Dev] Surely "nullable" is a reasonable name?

2015-04-19 Thread Larry Hastings

On 04/19/2015 01:26 PM, Glenn Linderman wrote:
Is argument clinic a special case of type annotations?  (Quoted and 
worded to be provocative, intentionally but not maliciously.)


OK, I know that argument clinic applies to C code and I know that type 
annotations apply to Python code. And I know that C code is a lot more 
restrictive /a priori/ which clinic has to accommodate, and type 
annotations are a way of adding (unenforced) restrictions on Python 
code.  Still, from a 50,000' view, there seems to be an overlap in 
functionality... and both are aimed at Py 3.5... I find that 
interesting... I guess describing parameter types is the latest Python 
trend :)


Argument Clinic and Python 3 type annotations are related concepts. 
Argument Clinic's syntax is designed in such a way that we actually use 
ast.parse() to parse it, and that includes using the type annotation 
syntax.  That's about all they have in common.
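
For the curious, here is a rough illustration (I'm not showing exactly how
Clinic wraps its input before parsing) that a parameter declaration really
is valid annotation syntax as far as ast.parse() is concerned:

    import ast

    # Wrap the Clinic-style parameter in a function definition so it parses
    # as ordinary Python source.
    tree = ast.parse("def f(s: str(accept={str, NoneType})): pass")
    arg = tree.body[0].args.args[0]
    print(arg.arg)                   # 's'
    print(ast.dump(arg.annotation))  # a Call node: str(accept={str, NoneType})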


This discussion is off-topic and of limited interest; if you have 
further questions along these lines please email me privately.



//arry/


[Python-Dev] Should instances really be able to dictate the "existence" of special methods?

2015-04-19 Thread Eric Snow
_PyObject_LookupSpecial is used in place of obj.__getattribute__ for
looking up special methods.  (As far as I recall it is not exposed in
the stdlib, e.g. inspect.getattr_special.)  Correct me if I'm wrong
(please!), but there are two key reasons:

 * access to special methods in spite of obj.__getattribute__
 * speed

While _PyObject_LookupSpecial does not do lookup on obj.__dict__ or
call obj.__getattr__, it does resolve descriptors.  This is important
particularly since special methods will nearly always be some kind of
descriptor.  However, one consequence of this is that instances can
influence whether or not some capability, as relates to the special
method, is available.  This is accomplished by the descriptor's
__get__ raising AttributeError.
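
A minimal sketch of what I mean (the class and names here are made up): a
descriptor on the class whose __get__ consults instance state and raises
AttributeError, so the special method looks "missing" for some instances:

    class MaybeMethod:
        def __init__(self, func):
            self.func = func
        def __get__(self, obj, objtype=None):
            # Instance state decides whether the method "exists".
            if obj is not None and not obj.__dict__.get('ready', True):
                raise AttributeError(self.func.__name__)
            return self.func.__get__(obj, objtype)

    class Example:
        @MaybeMethod
        def __iter__(self):
            return iter([1, 2, 3])

    e = Example()
    print(hasattr(e, '__iter__'))   # True
    e.ready = False
    print(hasattr(e, '__iter__'))   # False -- the instance influenced it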

My question is: was this intentional?  Considering the obscure bugs
that can result (e.g. where did the AttributeError come from?), it
seems more likely that it is an oversight of an obscure corner case.
If that is the case then it would be nice if we could fix
_PyObject_LookupSpecial to chain any AttributeError coming from
descr.__get__ into a RuntimeError.  However, I doubt we could get away
with that at this point.
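
For illustration only, a rough Python sketch of the proposal (this is not
CPython's _PyObject_LookupSpecial): look the name up on the type only, run
the descriptor protocol, and chain an AttributeError escaping from __get__
into a RuntimeError:

    def lookup_special(obj, name):
        for klass in type(obj).__mro__:
            if name in vars(klass):
                attr = vars(klass)[name]
                break
        else:
            raise AttributeError(name)
        get = getattr(type(attr), '__get__', None)
        if get is None:
            return attr                        # not a descriptor
        try:
            return get(attr, obj, type(obj))   # bind to the instance
        except AttributeError as exc:
            raise RuntimeError("descriptor __get__ for %r raised "
                               "AttributeError" % name) from exc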

Also, while it may be appropriate in general to allow instances to
dictate the availability of attributes/methods (e.g. through
__getattribute__, __getattr__, or descriptors), I'm not convinced it
makes sense for special methods.  We are already explicitly
disconnecting special methods from instances in
_PyObject_LookupSpecial (except in the case of descriptors...).

-eric

p.s. I also find it a bit strange that instances have any say at all
in which methods (i.e. behavior) are *available*.  Certainly instances
influence behavior, but I always find their impact on method
availability to be surprising.  Conceptually for me instances are all
about state and classes about behavior (driven by state).  However, it
is very rare that I run into code that takes advantage of the
opportunity. :)


Re: [Python-Dev] Should instances really be able to dictate the "existence" of special methods?

2015-04-19 Thread Guido van Rossum
(I suppose this new thread is a result of some research you did regarding
the thread complaining about callable()?)

On Sun, Apr 19, 2015 at 4:03 PM, Eric Snow 
wrote:

> _PyObject_LookupSpecial is used in place of obj.__getattribute__ for
> looking up special methods.  (As far as I recall it is not exposed in
> the stdlib, e.g. inspect.getattr_special.)  Correct me if I'm wrong
> (please!), but there are two key reasons:
>
>  * access to special methods in spite of obj.__getattribute__
>  * speed
>

Good question! I don't have an easy pointer to the original discussion, but
I do recall that this was introduced in response to some issues with the
original behavior, which looked up dunder methods on the instance and
relied on the general mechanism for binding it to the instance. I don't
think the reason was to circumvent __getattribute__, but your second bullet
rings true: for every +, -, * etc. there would be a (usually failing)
lookup in the instance dict before searching the class dict and then the
base classes etc. There may also have been some confusion where people
would e.g. assign a function of two arguments to x.__add__ and would be
disappointed to find out it was called with only one argument. I think
there were some folks who wanted to fix this by somehow "binding" such
calls to the instance (since there's no easy way otherwise to get the first
argument) but I thought the use case was sufficiently odd that it was
better to avoid it altogether.

In any case, it's not just an optimization -- it's an intentional (though
obscure) feature.
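
A quick demonstration of the current rule (the class is invented for the
example): a function assigned to x.__add__ on the instance is simply ignored
by the operator, though ordinary attribute access still finds it:

    class Num:
        def __init__(self, value):
            self.value = value

    a, b = Num(1), Num(2)
    a.__add__ = lambda other: a.value + other.value  # never consulted by '+'
    try:
        a + b
    except TypeError as exc:
        print(exc)        # unsupported operand type(s) for +: 'Num' and 'Num'
    print(a.__add__(b))   # explicit attribute access still works -> 3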


> While _PyObject_LookupSpecial does not do lookup on obj.__dict__ or
> call obj.__getattr__, it does resolve descriptors.  This is important
> particularly since special methods will nearly always be some kind of
> descriptor.  However, one consequence of this is that instances can
> influence whether or not some capability, as relates to the special
> method, is available.  This is accomplished by the descriptor's
> __get__ raising AttributeError.
>

Well, it's not really the instance that raises AttributeError -- it's the
descriptor, which is a separate class (usually but not always a builtin
class, such as property or classmethod). And the descriptor is "owned" by
the class.


> My question is: was this intentional?  Considering the obscure bugs
> that can result (e.g. where did the AttributeError come from?), it
> seems more likely that it is an oversight of an obscure corner case.
>

I'm not sure what you would do to avoid this. You can't very well declare
that a descriptor's __get__ method must not raise AttributeError. It could
be implemented in Python and it could just hit a bug or something. But
perhaps I'm misunderstanding the situation you're describing?


> If that is the case then it would be nice if we could fix
> _PyObject_LookupSpecial to chain any AttributeError coming from
> descr.__get__ into a RuntimeError.  However, I doubt we could get away
> with that at this point.
>

Yeah, I think that ship has sailed. It also seems to be hardly worth trying
to control "double fault" situations like this. (It's not really a double
fault, but it reeks like it.)

I wonder if maybe you're feeling inspired by PEP 479? But that's really a
much more special case, and I don't really want to start down a whole
cascade of trying to "fix" all cases where an AttributeError could be
raised due to a problem in the user's lookup code.


> Also, while it may be appropriate in general to allow instances to
> dictate the availability of attributes/methods (e.g. through
> __getattribute__, __getattr__, or descriptors), I'm not convinced it
> makes sense for special methods.  We are already explicitly
> disconnecting special methods from instances in
> _PyObject_LookupSpecial (except in the case of descriptors...).
>

I'm still a little bit confused why you consider an error from the
descriptor as "dictated by the instance". I think what you're trying to
describe is that there is a method on the class but trying to bind it to
the instance fails. Well, all sorts of things may fail. (In fact very few
things cannot raise an exception in Python.)


> -eric
>
> p.s. I also find it a bit strange that instances have any say at all
> in which methods (i.e. behavior) are *available*.  Certainly instances
> influence behavior, but I always find their impact on method
> availability to be surprising.  Conceptually for me instances are all
> about state and classes about behavior (driven by state).  However, it
> is very rarely that I run into code that takes advantage of the
> opportunity. :)
>

If I understand what you're trying to say, what you're describing is due to
Python's unification of instance variables and methods into attributes.
It's pretty powerful that if x.foo(args) is a method call, you can also
write this as (x.foo)(args), and you can separate the attribute access even
further from the call and pass x.foo to some other function that is
eventually going to call it. La
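
A two-line illustration of that separation (the class is invented, just for
the example):

    class Greeter:
        def __init__(self, name):
            self.name = name
        def greet(self):
            return "hello, " + self.name

    g = Greeter("world")
    method = g.greet     # attribute access only; no call yet
    print(method())      # the bound method still knows about g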

Re: [Python-Dev] Should instances really be able to dictate the "existence" of special methods?

2015-04-19 Thread Eric Snow
On Mon, Apr 20, 2015 at 2:20 AM, Guido van Rossum  wrote:
> (I suppose this new thread is a result of some research you did regarding
> the thread complaining about callable()?)

Yep. :)

> On Sun, Apr 19, 2015 at 4:03 PM, Eric Snow 
> wrote:
>>
>> _PyObject_LookupSpecial is used in place of obj.__getattribute__ for
>> looking up special methods.  (As far as I recall it is not exposed in
>> the stdlib, e.g. inspect.getattr_special.)  Correct me if I'm wrong
>> (please!), but there are two key reasons:
>>
>>  * access to special methods in spite of obj.__getattribute__
>>  * speed
>
> Good question! I don't have an easy pointer to the original discussion, but
> I do recall that this was introduced in response to some issues with the
> original behavior, which looked up dunder methods on the instance and relied
> on the general mechanism for binding it to the instance. I don't think the
> reason was to circumvent __getattribute__, but your second bullet rings
> true: for every +, -, * etc. there would be a (usually failing) lookup in
> the instance dict before searching the class dict and then the base classes
> etc. There may also have been some confusion where people would e.g. assign
> a function of two arguments to x.__add__ and would be disappointed to find
> out it was called with only one argument. I think there were some folks who
> wanted to fix this by somehow "binding" such calls to the instance (since
> there's no easy way otherwise to get the first argument) but I thought the
> use case was sufficiently odd that it was better to avoid it altogether.
>
> In any case, it's not just an optimization -- it's an intentional (though
> obscure) feature.

Thanks for explaining.

>> While _PyObject_LookupSpecial does not do lookup on obj.__dict__ or
>> call obj.__getattr__, it does resolve descriptors.  This is important
>> particularly since special methods will nearly always be some kind of
>> descriptor.  However, one consequence of this is that instances can
>> influence whether or not some capability, as relates to the special
>> method, is available.  This is accomplished by the descriptor's
>> __get__ raising AttributeError.
>
> Well, it's not really the instance that raises AttributeError -- it's the
> descriptor, which is a separate class (usually but not always a builtin
> class, such as property or classmethod). And the descriptor is "owned" by
> the class.

Sure.  That's what I meant. :)  The instance can influence what the
descriptor returns.

>> My question is: was this intentional?  Considering the obscure bugs
>> that can result (e.g. where did the AttributeError come from?), it
>> seems more likely that it is an oversight of an obscure corner case.
>
> I'm not sure what you would do to avoid this. You can't very well declare
> that a descriptor's __get__ method must not raise AttributeError. It could
> be implemented in Python and it could just hit a bug or something.

Right.  And such a bug will be misinterpreted and obscured and hard to
unravel.  I ran into this a while back with pickle (which still does
lookup for special methods on the instance).  Ultimately it's the same
old problem of not knowing how to interpret an exception that may have
bubbled up from some other layer.

Like I said, I don't think there's anything to be done about it either
way.  I just got the feeling that in the case of special methods, the
descriptor part of lookup should not expect AttributeError to come out
of the getter.  So I wanted to see if my intuition was correct even if
the point is essentially irrelevant. :)  At this point, though, I
think my intuition wasn't quite right, though I still don't think a
descriptor's getter is the right place to raise AttributeError.

> But
> perhaps I'm misunderstanding the situation you're describing?
>
>>
>> If that is the case then it would be nice if we could fix
>> _PyObject_LookupSpecial to chain any AttributeError coming from
>> descr.__get__ into a RuntimeError.  However, I doubt we could get away
>> with that at this point.
>
> Yeah, I think that ship has sailed. It also seems to be hardly worth trying
> to control "double fault" situations like this. (It's not really a double
> fault, but it reeks like it.)
>
> I wonder if maybe you're feeling inspired by PEP 479? But that's really a
> much more special case, and I don't really want to start down a whole
> cascade of trying to "fix" all cases where an AttributeError could be raised
> due to a problem in the user's lookup code.

Nah.  It isn't about fixing all the cases nor directly related to PEP
479.  Instead it is in response to one obscure corner case (the
behavior of callable).

>> Also, while it may be appropriate in general to allow instances to
>> dictate the availability of attributes/methods (e.g. through
>> __getattribute__, __getattr__, or descriptors), I'm not convinced it
>> makes sense for special methods.  We are already explicitly
>> disconnecting special methods from instances in
>> _PyObject_

Re: [Python-Dev] Should instances really be able to dictate the "existence" of special methods?

2015-04-19 Thread Guido van Rossum
OK, so I think there isn't anything we can or should do here. Yes, it's
possible that type(x).__add__ succeeds but x.__add__ fails. That's how you
spell descriptor. :-) You could also use a random number generator in
__getattribute__...
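
Spelled out with a toy descriptor (the names are invented), that's:

    class Flaky:
        def __get__(self, obj, objtype=None):
            if obj is not None and getattr(obj, 'broken', False):
                raise AttributeError('__add__')
            return lambda other: 'ok'

    class Thing:
        __add__ = Flaky()

    t = Thing()
    print(type(t).__add__)       # class-level lookup succeeds
    t.broken = True
    try:
        t.__add__                # binding to this instance fails
    except AttributeError:
        print("instance lookup failed")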

On Sun, Apr 19, 2015 at 6:36 PM, Eric Snow 
wrote:

> On Mon, Apr 20, 2015 at 2:20 AM, Guido van Rossum 
> wrote:
> > (I suppose this new thread is a result of some research you did regarding
> > the thread complaining about callable()?)
>
> Yep. :)
>
> > On Sun, Apr 19, 2015 at 4:03 PM, Eric Snow 
> > wrote:
> >>
> >> _PyObject_LookupSpecial is used in place of obj.__getattribute__ for
> >> looking up special methods.  (As far as I recall it is not exposed in
> >> the stdlib, e.g. inspect.getattr_special.)  Correct me if I'm wrong
> >> (please!), but there are two key reasons:
> >>
> >>  * access to special methods in spite of obj.__getattribute__
> >>  * speed
> >
> > Good question! I don't have an easy pointer to the original discussion,
> but
> > I do recall that this was introduced in response to some issues with the
> > original behavior, which looked up dunder methods on the instance and
> relied
> > on the general mechanism for binding it to the instance. I don't think
> the
> > reason was to circumvent __getattribute__, but your second bullet rings
> > true: for every +, -, * etc. there would be a (usually failing) lookup in
> > the instance dict before searching the class dict and then the base
> classes
> > etc. There may also have been some confusion where people would e.g.
> assign
> > a function of two arguments to x.__add__ and would be disappointed to
> find
> > out it was called with only one argument. I think there were some folks
> who
> > wanted to fix this by somehow "binding" such calls to the instance (since
> > there's no easy way otherwise to get the first argument) but I thought
> the
> > use case was sufficiently odd that it was better to avoid it altogether.
> >
> > In any case, it's not just an optimization -- it's an intentional (though
> > obscure) feature.
>
> Thanks for explaining.
>
> >> While _PyObject_LookupSpecial does not do lookup on obj.__dict__ or
> >> call obj.__getattr__, it does resolve descriptors.  This is important
> >> particularly since special methods will nearly always be some kind of
> >> descriptor.  However, one consequence of this is that instances can
> >> influence whether or not some capability, as relates to the special
> >> method, is available.  This is accomplished by the descriptor's
> >> __get__ raising AttributeError.
> >
> > Well, it's not really the instance that raises AttributeError -- it's the
> > descriptor, which is a separate class (usually but not always a builtin
> > class, such as property or classmethod). And the descriptor is "owned" by
> > the class.
>
> Sure.  That's what I meant. :)  The instance can influence what the
> descriptor returns.
>
> >> My question is: was this intentional?  Considering the obscure bugs
> >> that can result (e.g. where did the AttributeError come from?), it
> >> seems more likely that it is an oversight of an obscure corner case.
> >
> > I'm not sure what you would do to avoid this. You can't very well declare
> > that a descriptor's __get__ method must not raise AttributeError. It
> could
> > be implemented in Python and it could just hit a bug or something.
>
> Right.  And such a bug will be misinterpreted and obscured and hard to
> unravel.  I ran into this a while back with pickle (which still does
> lookup for special methods on the instance).  Ultimately it's the same
> old problem of not knowing how to interpret an exception that may have
> bubbled up from some other layer.
>
> Like I said, I don't think there's anything to be done about it either
> way.  I just got the feeling that in the case of special methods, the
> descriptor part of lookup should not expect AttributeError to come out
> of the getter.  So I wanted to see if my intuition was correct even if
> the point is essentially irrelevant. :)  At this point, though, I
> think my intuition wasn't quite right, though I still don't think a
> descriptor's getter is the right place to raise AttributeError.
>
> > But
> > perhaps I'm misunderstanding the situation you're describing?
> >
> >>
> >> If that is the case then it would be nice if we could fix
> >> _PyObject_LookupSpecial to chain any AttributeError coming from
> >> descr.__get__ into a RuntimeError.  However, I doubt we could get away
> >> with that at this point.
> >
> > Yeah, I think that ship has sailed. It also seems to be hardly worth
> trying
> > to control "double fault" situations like this. (It's not really a double
> > fault, but it reeks like it.)
> >
> > I wonder if maybe you're feeling inspired by PEP 479? But that's really a
> > much more special case, and I don't really want to start down a whole
> > cascade of trying to "fix" all cases where an AttributeError could be
> raised
> > due to a problem in the user's lookup code.
>
> N

Re: [Python-Dev] Should instances really be able to dictate the "existence" of special methods?

2015-04-19 Thread Eric Snow
On Mon, Apr 20, 2015 at 4:37 AM, Guido van Rossum  wrote:
> OK, so I think there isn't anything we can or should do here. Yes, it's
> possible that type(x).__add__ succeeds but x.__add__ fails. That's how you
> spell descriptor. :-) You could also use a random number generator in
> __getattribute__...

Cool.  That's pretty much what I figured.

-eric