On 27/08/20 3:56 pm, Steven D'Aprano wrote:
> On Thu, Aug 27, 2020 at 03:28:07AM +1200, Greg Ewing wrote:
> >
> > We're falling back to __getitem__ here, which doesn't currently allow
> > keywords,

> Point of order: the __getitem__ dunder already allows keywords, and
> always has, and always will. It's just a method.
>
> It's the *subscript (pseudo-)operator* which doesn't support keywords.

Yes, I could have worded that better. What I meant was that no existing
__getitem__ method expects to get keywords given to it via indexing
notation, and under my proposal that wouldn't change.

> So if you want to accept keywords, you just add keywords to your
> existing dunder method. If you don't want them, don't add them. We don't
> need a new dunder just for the sake of keywords.

Nobody disputes that it *could* be made to work that way. But I'm
not convinced that it's the *best* way for it to work. The killer
argument in my mind is what you would have to do to make an object
where all of the following are equivalent:

   a[17, 42]
   a[time = 17, money = 42]
   a[money = 42, time = 17]

With a fresh new dunder, it's dead simple:

   def __getindex__(self, time, money):
      ...

With a __getitem__ that's been enhanced to take keyword args, but
still gets its positional args packed into a tuple, it's nowhere near
as easy.
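
To make that concrete, here's roughly what I think the __getitem__
version would have to look like. This is only a sketch, and I'm
guessing that a keyword-only subscript would show up as an empty
tuple:

   def __getitem__(self, index=(), **kwds):
       # Assumes a keyword-only subscript arrives as an empty tuple.
       # Positional indices still arrive packed into a tuple
       # (or as a bare object if there's only one), so normalise.
       if not isinstance(index, tuple):
           index = (index,)
       names = ("time", "money")
       if len(index) > len(names):
           raise TypeError("too many indices")
       args = dict(zip(names, index))
       for name, value in kwds.items():
           if name in args or name not in names:
               raise TypeError(f"bad keyword {name!r}")
           args[name] = value
       if len(args) != len(names):
           raise TypeError("missing index")
       time, money = args["time"], args["money"]
       ...

That's a lot of machinery just to recover an ordinary argument list.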

> But type slots are expensive in other ways. Every new type slot
> increases the size of objects, and I've seen proposals for new dunders
> knocked back for that reason, so presumably the people who care about
> the C level care about the increase in memory and complexity from
> adding new type slots.

It doesn't seem like a serious problem to me. Type objects are
typically created once at program startup, and they're already
quite big. If it's really a concern, the new slots could be put
in a substructure, so the type object would only be bigger by
one pointer if they weren't being used.

Another possibility would be to have them share the slots for
the existing methods, with flags indicating which variant is
being used for each one. A given type is only going to need
either __getindex__ or __getitem__, etc., not both.
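
Either way, the dispatch needed at the point of a subscript is
conceptually simple. In pure-Python terms it would be something like
this (a sketch of the idea only, not the real C implementation, and
the helper name is just for illustration):

   def subscript(obj, *args, **kwds):
       # What obj[...] would conceptually do under the proposal:
       # use __getindex__ if the type defines it, otherwise fall
       # back to the classic __getitem__ protocol, packing the
       # positional indices into a tuple as happens now.
       tp = type(obj)
       if hasattr(tp, "__getindex__"):
           return tp.__getindex__(obj, *args, **kwds)
       if kwds:
           raise TypeError("object does not support keyword indices")
       index = args[0] if len(args) == 1 else args
       return tp.__getitem__(obj, index)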

> 5. alternatively, we could leave the existing C-level sequence and
>     mapping objects alone, and create *four* brand new C-level objects:
>
>     - a sequence object that supports only the new index protocol;
>     - a sequence object that supports both index and item protocols;
>     - and likewise two new mapping objects.

I don't follow this one. How can there be both old and new
protocols without adding new dunder methods?

> Right now, if you call `obj[1,]` the dunder receives the tuple (1,) as
> the index. If it were treated as function call syntax, it would receive
> a single argument 1 instead.

See my earlier post about that.
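
The difference is easy to demonstrate with a dict:

   d = {}
   d[1,] = "x"
   print(d)     # {(1,): 'x'}, i.e. the key is the tuple (1,)
   # Under plain function-call rules, d[1,] would pass the single
   # argument 1, so the key would just be 1.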

> Another inconsistency: function call syntax looks like this:
>
>      call ::=  primary "(" [argument_list [","] | comprehension] ")"
>
> which means we can write generator comprehensions inside function
> calls without additional parentheses.

As I said earlier, I don't mind if the contents of the square brackets
don't behave exactly like a function argument list. I expect the use
cases for passing a generator expression as an index to be sufficiently
rare that I won't mind having to put parens around it.

>      obj[expr for x in items]    # and this is... what?

I'm fine with it being a syntax error.
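
For what it's worth, the parenthesised form already works today and
presumably still would. A quick illustration (the class is just for
demonstration):

   class Demo:
       def __getitem__(self, index):
           return list(index)

   d = Demo()
   print(d[(x * x for x in range(3))])   # [0, 1, 4]
   # Without the parens, d[x * x for x in range(3)] is a syntax
   # error today, and it would stay that way.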

--
Greg