>> On Jan 1, 2020, at 04:21, Steven D'Aprano <st...@pearwood.info> wrote:
>>
>> On Tue, Dec 31, 2019 at 05:18:59PM -0800, Andrew Barnert via Python-ideas
>> wrote:
>>
>> Some languages use special syntax to mark either values or targets:
>>
>>     let x, K, let z = vec
>>     x, @K, z = vec
>>
>> But the simplest solution is to do nothing: to match a value, you have to
>> stick it in an expression that isn't a valid target; otherwise, it's a
>> target. And I think that might actually work. If the pattern matching
>> library includes this (or you write it yourself):
>>
>>     def val(x): return x
>>
>> … then you just write this:
>>
>>     x, val(K), z = vec
>>
>> Which doesn't seem too bad.
>
> That is a bug-magnet. It's such a powerful bug magnet that the mere
> suggestion caused me to commit three bugs in a script I was writing
> *yesterday*, that's how powerful it is.
>
> But seriously, and all hyperbole aside, that does seem too bad to me.
> It's a syntactic construct which will sucker the user (i.e. me) into a
> sense of false security, then bite them when they least expect it. Match
> constructs seem to work:
>
>     # constants are fine
>     spam, 42, eggs = obj
>
>     # function calls and expressions are fine
>     spam, foo(arg), eggs = obj
>     spam, myconstant or fallback, eggs = obj
>
> and then you write this:
>
>     spam, db['key'], eggs = obj
>
> and the feces hits the rotary air circulating machine.
>
> I'd almost rather have a rule that says you can only use literals for
> the pattern matching than syntax which is so prone to laying landmines.
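As a concrete aside before replying: it may help to spell out what that last
example already does in today's Python, because that is exactly the collision
being worried about. This is a minimal runnable sketch of current assignment
semantics only (db and obj are placeholder names for illustration); under the
"anything that isn't syntactically a target is a value" rule discussed below,
db['key'] still counts as a target, so a user expecting it to be compared
would get this silent overwrite instead:

    # Current Python: a subscription on the left of `=` is an assignment
    # *target*, so this overwrites db['key'] with obj[1] instead of
    # comparing against it.
    db = {'key': 'expected'}
    obj = ('spam value', 'something else', 'eggs value')
    spam, db['key'], eggs = obj
    print(db['key'])  # prints 'something else' -- the stored value is gone

Any proposal that reuses assignment syntax for matching has to decide which
reading wins for an expression like db['key'], which is what the rest of this
message is about.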
The first thing people ask for is "How do I match against a value in a local
variable?" (As Greg Ewing and someone else did on this thread.) It actually
isn't needed as often as people who don't do a lot of pattern matching
suspect, but it is certainly needed sometimes, so not having any way to allow
it would be a serious flaw.

However, there are multiple ways to handle this.

Adding special syntax that's always required for marking values works.
There's some precedent for prefix @ for values, and someone already suggested
it on this thread: `x, @K, v = …` matches the value of K, and there's no
possibility of ambiguity. Anything without an @ is a target (and if it's not
a valid target, it's a syntax error).

Requiring marking the other way works too, but I don't think it fits into
Python as well. (Typically you use something based on the declaration/binding
syntax, but that's implicit in Python…)

Loosening this up to allow literals to be values unmarked, but not other
values, works. (Although then you need to work out which definition of
"literal" you want. Does -3 have to get marked because it's actually an
expression? What about a list display? And so on. All of these things can be
answered, and bikeshedded.)

Loosening it to allow calls to be values without being marked already means
you can get rid of the marking syntax entirely, by just having a wrapper
function to serve as the marker.

Loosening it all the way to anything that isn't syntactically a target seemed
like a good idea to me, mainly because it's the simplest possible rule to
keep in your head once you've learned it. But it isn't a necessary part of
any proposal. Something that requires more val() calls or even @ marks will
still look fundamentally similar.

>> And notice that any object with a custom __eq__ will be matched by
>> calling that, so your library can have a decorator that turns an
>> arbitrary function into a matcher
>
> I'm sorry, I don't understand what that means. Can you give an example?

I already gave multiple examples of custom matchers. I'll repeat the shortest
one here; if you want other examples (including a wrapper that turns any
function into a matcher), read the thread. David Mertz asked for a way to
match a vector only if the first value is an int:

    # In the library
    class MatchSubclass(type):
        def __eq__(self, other):
            return isinstance(other, self)

    # In user code
    MatchSubclass(int), y, z = vec

> [...]
>>> At that
>>> point you might be better off with a dedicated "switch" or "case"
>>> construct.
>> Sure, if we're willing to use up one or two new keywords and design a
>> whole new syntax for them we can obviously make it do whatever we
>> want.
>
> If something is worth doing, it's worth doing right.

Sure, but who says switch and case are "right", much less weird
doubly-indented things that are not statements despite ending in colons and
that (unlike except, else, etc.) don't line up with the start of the
statement? That was the best anyone had come up with in the 70s, but plenty
of newer languages (Swift, Scala, C#, etc.) build their pattern matching out
of a match primitive that's closer to Python's iterable unpacking than it is
to SML or C, and I think they're better languages for it. So if Python could
do the same thing, why shouldn't we explore it? It may turn out not to work,
but that doesn't imply that ignoring the last four decades of language
design, just because you already know it's doable the 70s way, is a good
answer.
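To make the "anything that isn't syntactically a target" rule concrete, here
is a small runnable sketch, purely my own illustration rather than part of
any proposal, that asks today's grammar which expressions may appear on the
left of `=`; under that rule the True cases would capture and the False cases
would match:

    import ast

    def is_assignment_target(expr: str) -> bool:
        # Ask the current grammar whether `expr` is a valid assignment target.
        try:
            ast.parse(f"{expr} = None")
        except SyntaxError:
            return False
        return True

    # Under the loosened rule: True means "capture here",
    # False means "compare against this value".
    print(is_assignment_target("x"))          # True  -> capture
    print(is_assignment_target("db['key']"))  # True  -> capture (the case Steven worries about)
    print(is_assignment_target("42"))         # False -> match the literal 42
    print(is_assignment_target("-3"))         # False -> match the value -3
    print(is_assignment_target("val(K)"))     # False -> match K's value

Nothing here changes any semantics; it only shows where the line would fall
if the existing grammar's notion of "target" were reused.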
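And since an example of the custom-__eq__ idea was asked for above, here is a
rough emulation that runs in today's Python, using an ordinary helper
function instead of new syntax. The names match(), Capture, InstanceOf and
matcher() are invented for this sketch only; the point is just that whatever
object sits in a value position gets compared with ==, so a custom __eq__ is
all a matcher needs:

    class InstanceOf:
        # Compares equal to any instance of the given class.
        def __init__(self, cls):
            self.cls = cls
        def __eq__(self, other):
            return isinstance(other, self.cls)

    def matcher(func):
        # Turn an arbitrary one-argument predicate into a match-by-__eq__ object.
        class _Matcher:
            def __eq__(self, other):
                return bool(func(other))
        return _Matcher()

    class Capture:
        # Marks a position whose value should be bound rather than compared.
        pass

    def match(pattern, values):
        # Return the list of captured values if `values` matches `pattern`, else None.
        values = tuple(values)
        if len(pattern) != len(values):
            return None
        captured = []
        for p, v in zip(pattern, values):
            if isinstance(p, Capture):
                captured.append(v)
            elif p == v:           # a custom __eq__ is what does the matching
                continue
            else:
                return None
        return captured

    vec = (3, 4.5, 6.0)
    print(match((InstanceOf(int), Capture(), Capture()), vec))   # [4.5, 6.0]
    print(match((InstanceOf(str), Capture(), Capture()), vec))   # None

    positive = matcher(lambda n: n > 0)
    print(match((positive, Capture(), Capture()), vec))          # [4.5, 6.0]

A real proposal would have the compiler do the destructuring, but the
comparison step would be the same == used here, which is why an arbitrary
predicate can be wrapped up as a matcher.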
Meanwhile, a dedicated switch/case statement wouldn't actually solve the
problem you're focusing on. A case statement still needs a way to both match
values and destructure to bindings. If that way isn't at all related to
unpacking, that means two similar syntaxes for similar purposes that you need
to learn. If it is, then you need to resolve the ambiguity, and you've got
the same problem, with the same possible solutions, that we're already
dealing with.