> From: "Brian Goetz" <brian.go...@oracle.com>
> To: "amber-spec-experts" <amber-spec-experts@openjdk.java.net>
> Sent: Friday, February 25, 2022 10:45:44 PM
> Subject: Primitive type patterns

> As a consequence of doing record patterns, we also grapple with primitive type
> patterns. Until now, we've only supported reference type patterns, which are
> simple:

> - A reference type pattern `T t` is applicable to a match target of type M if M
> can be cast to T without an unchecked warning.

> - A reference type pattern `T t` covers a match type M iff M <: T

> - A reference type pattern `T t` matches a value m of type M if M <: T || m
> instanceof T

> Two of these three characterizations are static computations (applicability and
> coverage); the third is a runtime test (matching). For each kind of pattern, we
> have to define all three of these.

> #### Primitive type patterns in records

> Record patterns necessitate the ability to write type patterns for any type
> that can be a record component. If we have:

> record IntBox(int i) { }

> then we want to be able to write:

> case IntBox(int i):

> which means we need to be able to express type patterns for primitive types.

> #### Relationship with assignment context

> There is another constraint on primitive type patterns: the let/bind statement
> coming down the road. Because a type pattern looks (not accidentally) like a
> local variable declaration, we will want to align the semantics of "local
> variable declaration with initializer" and "let/bind with total type pattern".
> Concretely:

> let String s = "foo";

> is a pattern match against the (total) pattern `String s`, which introduces `s`
> into the remainder of the block. Since let/bind is a generalization of local
> variable declaration with initialization, let/bind should align with locals
> where the two can express the same thing. This means that the set of
> conversions allowed in assignment context (JLS 5.2) should also be supported
> by type patterns.

> Of the conversions supported by 5.2, the only one that applies when both the
> initializer and local variable are of reference type is "widening reference",
> which the above match semantics (`T t` matches `m` when `M <: T`) support. So
> we need to fill in the other three boxes of the 2x2 matrix of { ref, primitive
> } x { ref, primitive }.
That's a huge leap; let's take a step back.

I see two questions that should be answered first.
1) Do we really want patterns, in the context of assignment/declaration, to
support assignment conversions?
2) Do we want patterns used by switch or instanceof to follow the exact same
rules as patterns used in assignment/declaration?

For 1, given that we are using patterns to do destructured assignment, we may
want to simplify the assignment rules, to keep things simple and to avoid users
shooting themselves in the foot with implicit unboxing.
As an example:
record Box<T>(T value) { }
Box<Integer> box = ...
Box<>(int result) = box; // the assignment to result may throw an NPE

I don't think we have to support that implicit unboxing, given that we have a
way to ask for unboxing explicitly (once java.lang.Integer has a
de-constructor):

Box<>(Integer(int result)) = box; 

I think we should not get ahead of ourselves here, and should ask whether we
really want assignment conversions in the case of destructured assignment.
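[To make the hazard concrete: unboxing in assignment context already throws on null today, and that is exactly the behavior a destructuring `Box<>(int result) = box` would inherit. A minimal sketch that simulates the destructuring with a plain accessor call (the helper name is hypothetical):

```java
public class UnboxingNpe {
    record Box<T>(T value) { }

    // Simulates what `Box<>(int result) = box;` would do: read the Integer
    // component, then unbox it into an int binding.
    static int unboxComponent(Box<Integer> box) {
        int result = box.value(); // implicit unboxing: NullPointerException if value is null
        return result;
    }
}
```
]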

For 2, we already know that, depending on the context (inside a switch, inside
an instanceof, inside an assignment), the rules for patterns are not exactly
the same.
So we may consider that in assignment context, assignment conversions apply,
while in a matching context, simpler rules apply.
Given that the switch syntax reuses '->', I believe we should use the
overriding rules (the ones we use for lambdas) instead of the assignment rules
(the ones we use for method references).
And yes, I know that the '->' of switch is not the same as the '->' of a
lambda, but I don't think it should bother us that users will intuitively
assume the same rules apply.

Also, the model you propose is too clever for me: the fact that
instanceof Point(double x, double y)
has a different meaning depending on whether Point is declared as
record Point(double x, double y) { }
or as
record Point(Double x, Double y) { }
is too much.
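[For what it's worth, the double vs Double distinction already changes null behavior today, before any pattern semantics enter the picture: a record with a primitive component cannot even be constructed from a null, while the boxed variant stores null happily. A small sketch (class names are just for illustration):

```java
public class NullComponents {
    record PrimPoint(double x, double y) { }   // primitive components: no nulls possible
    record BoxedPoint(Double x, Double y) { }  // boxed components: nulls flow through

    // Constructing PrimPoint from possibly-null Doubles forces an unboxing,
    // which throws NullPointerException when a value is null.
    static PrimPoint fromBoxed(Double x, Double y) {
        return new PrimPoint(x, y); // implicit unboxing here
    }
}
```
]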

The semantics of Java around null is already a giant minefield; we should
restrain ourselves from adding more funny behaviors around null.

regards, 
RĂ©mi 
