While discussing with colleagues, someone said:

> However, my main gripe is about not supporting in the String representation of
> an integer what is supported in its literal representation.
> Thus, Integer x = 1_000_000; is valid, whereas
> Integer.valueOf("1_000_000") is not. That sucks.


It seems to me that this is a reasonable expectation and has practical
benefits (e.g. accepting program arguments that are integers with _'s).
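To illustrate, here is a minimal sketch (the class and helper names below are
mine, not anything in the JDK) of the mismatch and one caller-side workaround:

    // Hypothetical example: literal underscores work, valueOf rejects them.
    public class UnderscoreParsing {
        public static void main(String[] args) {
            int literal = 1_000_000;          // valid since JDK 7
            System.out.println(literal);      // prints 1000000

            try {
                Integer.valueOf("1_000_000"); // throws NumberFormatException
            } catch (NumberFormatException e) {
                System.out.println("valueOf rejects underscores: " + e.getMessage());
            }

            // Caller-side workaround: strip underscores before delegating.
            System.out.println(parseWithUnderscores("1_000_000")); // 1000000
        }

        // Hypothetical helper, not a JDK method.
        static int parseWithUnderscores(String s) {
            return Integer.valueOf(s.replace("_", ""));
        }
    }

Note that naive stripping also accepts strings such as "_1000" that the literal
grammar rejects, which hints at why supporting this properly in the parsing
methods may be less trivial than it first appears.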

Supporting underscores in number literals (beginning in JDK 7) was meant to
improve the readability of Java source code. Perhaps doing this correctly in
the parsing methods incurs unwarranted implementation complexity in the JDK.
As library writers, however, how would you explain this mismatch? Was this
side effect (arguably so) considered at all?

Regards,
Kedar
