It would probably help to spell out the paths through specificity
separately, rather than have everyone try to parse the rules. But it
seems that an Integer argument fits the Object parameter via a subclass
match, *without* a method invocation conversion, and therefore Object is
the better fit. As I said previously, a match via conversion should be
considered less desirable than a match that does not require conversion.
That makes both cases work.
* primitive => primitive outweighs primitive => reference, because the
latter requires a conversion (int => int preferred over int => Object
or int => Integer)
* boxed => reference outweighs boxed => primitive, because the latter
requires a conversion (Integer => Object or Integer => Integer
preferred over Integer => int; both cases are sketched below)
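For reference, plain javac overload resolution already behaves this way
in the non-varargs case. A minimal, compilable sketch of both bullets
(the class and method names are mine, purely for illustration):

public class SpecificitySketch {
    static void take(int n)    { System.out.println("take(int)"); }
    static void take(Object o) { System.out.println("take(Object)"); }

    public static void main(String[] args) {
        take(3);                   // int => int wins: no conversion needed
        take(Integer.valueOf(3));  // Integer => Object wins: subclass match, no unboxing
    }
}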
For my benefit and the benefit of others, let me try to spell it out.
What you're saying by "the input doesn't matter" is that the
specificity tests are done independently of the actual incoming argument
types. A set of matching methods is found based on the incoming types,
and then that pool of methods enters the octagon with hopefully only
one "most specific" winner emerging. That's true. However, going back
to my earlier email, specificity is currently based on subclass
relationships. Subclass relationships do not make sense in the context
of primitives, so we're using subclass-based specificity tests to choose
a method for an incoming argument type that has no subclass
relationships and doesn't even live in the type hierarchy we're using
to determine specificity.
And if I continue from that understanding, what you mean by "int can't
be both more and less specific than Integer" is that without knowledge
of the actual incoming types, you can't make a rule about whether to use
the int signature or the Integer signature in a way that is consistent
with how we want both int and Integer inputs to work. If you say int >
Integer so that int inputs choose int signatures, then you can't turn
around and say Integer > int when a later call with an Integer should
select the Integer signature.
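In Java itself it's the phases, not any specificity ordering between int
and Integer, that let both call sites work. A small illustration (class
and method names are mine):

public class IntVsInteger {
    static void f(int n)     { System.out.println("f(int)"); }
    static void f(Integer n) { System.out.println("f(Integer)"); }

    public static void main(String[] args) {
        f(3);                   // phase 1: only f(int) is applicable without boxing
        f(Integer.valueOf(3));  // phase 1: only f(Integer) is applicable without unboxing
    }
}

Neither call ever has to rank int against Integer; each phase admits
only one of them.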
I think Dan Smith had it right when he said we need to insert a phase
(sketched in code below):
1) Applicable by subtyping
2) Applicable by method invocation conversion
*3) Applicable by subtyping with varargs*
4) Applicable by method invocation conversion with varargs
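A rough sketch of how I read that ordering, assuming applicability
against the incoming argument types has already been computed for each
candidate (the Candidate record and its fields are hypothetical, not any
real API; Java 16+ for records and Stream.toList):

import java.util.List;

// Hypothetical candidate model: the applicability flags are assumed to
// have been computed already; this only encodes the phase ordering.
record Candidate(String signature, boolean bySubtyping, boolean byConversion, boolean varargs) {}

class PhasedSelection {
    // Returns the pool that "enters the octagon": the first non-empty phase wins.
    static List<Candidate> applicable(List<Candidate> all) {
        List<List<Candidate>> phases = List.of(
            all.stream().filter(c -> c.bySubtyping() && !c.varargs()).toList(),  // 1) subtyping
            all.stream().filter(c -> c.byConversion() && !c.varargs()).toList(), // 2) conversion
            all.stream().filter(c -> c.bySubtyping() && c.varargs()).toList(),   // 3) subtyping + varargs (new)
            all.stream().filter(c -> c.byConversion() && c.varargs()).toList()); // 4) conversion + varargs
        for (List<Candidate> pool : phases) {
            if (!pool.isEmpty()) return pool;
        }
        return List.of();
    }
}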
This allows my original example to resolve the way most here think it
should. It also allows Integer to go to Object as appropriate. It's
not perfect though.
Original example:
void method(String s, int n, Object... os);
void method(String s, Object... os);
method("abc", 3);
Under the new phases, this resolves to the (int, Object...) signature as
expected: 3 matches int by subtyping (identity), so only that signature
survives the new phase three. But Dan pointed out that the same example
with an Integer would behave differently.
void method(String s, int n, Object... os);
void method(String s, Object... os);
method("abc", new Integer(3));
Currently this is ambiguous, since we go immediately to resolution
using conversion *and* varargs, and both the (int, Object...) and
(Object...) signatures enter the octagon. Under the new four-phase
mechanism, it would resolve to the (Object...) signature, because
subtyping + varargs takes precedence over conversion + varargs, so only
the (Object...) signature enters the octagon at the new phase three.
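Running that call through the hypothetical applicable() sketch above
(the flags are just my reading of which tests each signature passes for
a String and an Integer argument):

import java.util.List;

public class IntegerVarargsExample {
    public static void main(String[] args) {
        var pool = PhasedSelection.applicable(List.of(
            // Integer -> int needs an unboxing conversion: conversion + varargs only
            new Candidate("method(String, int, Object...)", false, true, true),
            // Integer -> Object is a plain subclass match: subtyping + varargs
            new Candidate("method(String, Object...)", true, true, true)));
        System.out.println(pool); // only method(String, Object...), found at the new phase three
    }
}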
This is a change, but it's more consistent with the first two
phases...and of course since it was ambiguous before, no code like
this exists in the wild. Given another example that does not hit
varargs:
void method(String s, int n);
void method(String s, Object os);
method("abc", new Integer(3));
Obviously the Object signature is chosen, for exactly the same reason:
subclassing wins over conversion. And people expect this. So rather
than making selection inconsistent, the new phase actually makes the
varargs phases more consistent with the non-varargs phases.
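That non-varargs case is easy to check against plain javac today (the
class name is mine):

public class NonVarargsExample {
    static void method(String s, int n)    { System.out.println("(String, int)"); }
    static void method(String s, Object o) { System.out.println("(String, Object)"); }

    public static void main(String[] args) {
        // Integer.valueOf(3) here is equivalent to new Integer(3) above
        method("abc", Integer.valueOf(3));  // prints (String, Object): subclass match, no unboxing
    }
}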
Why shouldn't varargs phase(s) obey the same precedence of subclassing
over conversion?
- Charlie
On Thu, Jun 9, 2011 at 5:44 PM, Neal Gafter <[email protected]> wrote:
> The input doesn't matter to Java's "more specific" test.
>
> What if the input is Integer and the choices are int and Object. Do you
> really prefer the NullPointerException-causing unboxing conversion to the
> safe widening reference conversion?
>
> On Thu, Jun 9, 2011 at 3:30 PM, Charles Oliver Nutter <[email protected]>
> wrote:
>>
>> On Thu, Jun 9, 2011 at 5:12 PM, Rémi Forax <[email protected]> wrote:
>> > You can, just consider int and Integer as the same node:
>> > int <= Integer && int >= Integer because int == Integer
>>
>> In addition, I don't see how Integer would ever need to be considered
>> more specific than int, for an int input. In what case does that
>> happen?
>>
>> - Charlie
>>
>