> On Mar 10, 2021, at 12:04 PM, Brian Goetz <brian.go...@oracle.com> wrote:
> 
> 
> 
>> You nail the guard to a pattern, which is equivalent until we have nested 
>> patterns (and "or"/"and" patterns).
> 
> We have nested patterns already in the JEPs on the table.  Where's the 
> problem?
> 
>> I see a lot of advantages of using && to link a guard to a pattern:
>> - the symbol is heavy, so there is a clear visual separation
>> - without any supplementary parentheses, the && after the type pattern in an 
>> instanceof is just the && between expressions; it's almost as if you cannot 
>> have a guard with an instanceof -- in practice, few instanceofs will have a guard.
> 
> I agree few instanceof will have a guard, but users are still free to express 
> it that way if they like, and there's nothing wrong with that.  
> 
>> I still think that using a guard inside a nested pattern is ugly, but it can 
>> be just that: ugly. Someone may want a short-circuit in deeply nested 
>> patterns.
> 
> Yes.  This is not unlike other compositions; for any compositional tool, you 
> can overuse it.  (You can arbitrarily compose boolean expressions (or 
> arbitrarily chain method invocations), but sometimes this is taking it too 
> far.)  
> 
>> As I said to Gavin, I'm not at ease with using the symbol '&' in between 
>> patterns.
> 
> I think that this is mostly a "who moved my cheese" reaction; you're used to 
> thinking that & is just for bitwise operations.  But, that's not actually 
> true; we already use & and | on types -- intersection type casts, additional 
> generic type bounds, multi-catch.  This appeals to a notion that & and | are 
> boolean-like combinators on types (even if not exposed in all places they 
> might make sense), but this is a different kind of combination than on 
> integers.  And yet a different kind of combination on patterns.  (In an 
> alternate universe, we might have different symbols for adding ints vs floats 
> vs string concatenation, but + works well enough that overloading the symbols 
> is OK -- because using + in this way appeals to the underlying algebraic 
> monoid structure these types share.)

We all know that `+` is not your best poster child for this argument.

        ("foo" + 1) + 2   produces   "foo12"
        "foo" + (1 + 2)   produces   "foo3"

which is not my idea of good monoid behavior.  But there is a lot of wiggle 
room in your use of the word “appeals”.  :-)
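(The regrouping above is easy to check mechanically; here is a minimal, runnable sketch -- the class name is mine:)

```java
// Demonstrates that + mixing String and int is not associative:
// regrouping the same three operands changes the result.
public class PlusAssociativity {
    public static void main(String[] args) {
        String left  = ("foo" + 1) + 2;  // String concatenation twice
        String right = "foo" + (1 + 2);  // int addition first, then concat
        System.out.println(left);        // foo12
        System.out.println(right);       // foo3
        System.out.println(left.equals(right));  // false
    }
}
```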

> The reason that & and | make sense on patterns, and on types, is that, like 
> the more familiar versions on bits, they describe a _boolean algebra_.  
> Boolean algebras have familiar properties such as De Morgan's Laws.  These 
> work for types (when interpreted as value sets) as well as bits, and they 
> work for patterns too.  
> 
> I think where you're getting hung up is that when patterns produce bindings, 
> and other patterns consume those bindings, we have a dataflow dependence 
> which would appear to undermine certain other expected properties of a 
> boolean algebra, such as commutativity.  But, if we view those dataflow 
> dependencies as a separate constraint -- as we *already do* for ints (e.g., 
> `(x|4)&(x=3)` is invalid when `x` is a DU int, but valid when `x` is DA), 
> this seeming contradiction vanishes, and is seen to be merely a post-hoc 
> well-formedness constraint.  If the WF constraint is satisfied, the expected 
> properties of boolean algebras (associativity, commutativity, absorption, 
> etc) are satisfied too. 
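(To make the int analogy concrete -- a minimal sketch, class name mine: `&` on ints is a symmetric operator, but the DA/DU dataflow constraint is order-sensitive, exactly the "separate well-formedness constraint" being described:)

```java
public class DataflowConstraint {
    public static void main(String[] args) {
        int x;
        // Valid: the left operand definitely assigns x before the
        // right operand reads it.
        int r = (x = 3) & (x | 4);   // 3 & 7 == 3
        System.out.println(r);       // 3
        // Swapping the operands -- (x | 4) & (x = 3) -- would not compile
        // with a DU x, because x would be read while definitely unassigned.
        // Once the DA/DU constraint is satisfied, the algebraic identity
        // a & b == b & a holds as usual.
    }
}
```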

And yet, even if we stipulate all that, it is still the case that in 
expressions, Java uses `&&` to indicate short-circuiting and `&` to indicate no 
reliance on short-circuiting, and this is long-standing, familiar use.  For the 
other applications cited (intersection type casts, additional generic type 
bounds, multi-catch) this distinction does not matter.  But in principle it 
does matter for patterns, because while patterns arguably do not involve 
_evaluation_, they most certainly involve _execution_ of possibly user-written 
code.  If side effects can occur, the distinction arguably matters, and this is 
worth recognizing as we debate the design.
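(The expression-level distinction being appealed to is observable whenever side effects are involved -- a minimal sketch, names mine:)

```java
public class ShortCircuit {
    static int calls = 0;

    // A deliberately side-effecting operand, standing in for
    // user-written code that a pattern might execute.
    static boolean touched() { calls++; return false; }

    public static void main(String[] args) {
        calls = 0;
        boolean a = false & touched();   // non-short-circuit: right side runs
        System.out.println(calls);       // 1

        calls = 0;
        boolean b = false && touched();  // short-circuit: right side skipped
        System.out.println(calls);       // 0
    }
}
```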

Nevertheless, as I just wrote in an earlier email, I think that the desire to 
maintain this distinction should be outweighed by other considerations (visual 
analogy to type intersection, for which `&` is used but not `&&`; need for 
distinct symbols for pattern conjunction and guard attachment in order 
to solve parsing problems; desire not to invent an arbitrary new symbol such as 
`&:` or `&&&`).
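(For reference, on the instanceof side this already works today with no new syntax, because the guard is just an ordinary `&&` between expressions and the binding is flow-scoped -- a minimal sketch for Java 16 or later, names mine:)

```java
public class InstanceofGuard {
    static String describe(Object o) {
        // The type pattern binds s; the && clause is a plain boolean
        // guard that may use the binding, thanks to flow scoping.
        if (o instanceof String s && s.length() > 3) {
            return "long string: " + s;
        }
        return "something else";
    }

    public static void main(String[] args) {
        System.out.println(describe("pattern"));  // long string: pattern
        System.out.println(describe(42));         // something else
    }
}
```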

