On 9/6/16, 12:58 PM, Charles Oliver Nutter wrote:
On Tue, Sep 6, 2016 at 1:04 PM, Xueming Shen <xueming.s...@oracle.com> wrote:

    Yes, it's a known "limit" given the nature of the approach. It is
    not considered an "incompatible change", because the maximum length
    that the String class and the corresponding buffer/builder classes
    can support is really an implementation detail, not a spec
    requirement. The conclusion from the discussion back then was that
    this is something we can trade off for the benefits we gain from
    the approach. Do we have a real use case that is impacted by this
    change?

Well, doesn't this mean that any code out there consuming String data that's longer than Integer.MAX_VALUE / 2 will suddenly start failing on OpenJDK 9?
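For concreteness, here's a minimal sketch of where that limit bites (my own illustration, not code from either project; assumes a heap big enough for the arrays, e.g. -Xmx6g):

    // Illustrative only. With JDK 9 compact strings, a String holding any
    // non-Latin-1 character is backed by a byte[] with two bytes per char,
    // so its char length is capped near Integer.MAX_VALUE / 2.
    public class HugeStringDemo {
        public static void main(String[] args) {
            char[] chars = new char[Integer.MAX_VALUE / 2 + 1];
            chars[0] = '\u0100';  // one non-Latin-1 char forces UTF16 coding
            // On JDK 8 this can succeed given enough heap; on JDK 9 it
            // throws OutOfMemoryError, since the UTF16-coded backing
            // byte[] would have to be longer than Integer.MAX_VALUE.
            String s = new String(chars);
            System.out.println(s.length());
        }
    }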

Yes, true. But arguably, code that uses huge Strings should already have
fallback code to handle the potential OutOfMemoryError when the VM can't
handle the size, as there has never really been a guarantee that the VM can
handle a String longer than Integer.MAX_VALUE / 2.
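A minimal sketch of what such fallback code could look like (the helper name and shape are just an illustration):

    import java.nio.charset.Charset;

    final class SafeDecode {
        // Try to materialize the bytes as one String; if this VM can't
        // hold a String of that size, return null so the caller can fall
        // back to chunked, byte[]-based handling instead of crashing.
        // Catching OutOfMemoryError is coarse, but it is the only signal
        // the String constructor gives here.
        static String decodeOrNull(byte[] data, Charset cs) {
            try {
                return new String(data, cs);
            } catch (OutOfMemoryError oom) {
                return null;
            }
        }
    }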

Not that such a case is a particularly good pattern, but I'm sure there's code out there doing it. On JRuby we routinely get bug reports complaining that we can't support strings larger than 2GB (and we have used byte[] for strings since 2006).

That was a trade-off decision we had to make.

Does JRuby have any better solution for such complaints? Did you ever consider going back to char[] to "fix" the problem, or some workaround such as adding another byte[], for example?
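By "another byte[]" I mean something along these lines (purely an illustrative sketch, not JRuby code): split the content across segments so the total length is a long rather than an int:

    // Illustrative sketch only. Splits the content across 1 GiB byte[]
    // segments so the total can exceed a single Java array's limit.
    final class SegmentedBytes {
        private static final int SEG = 1 << 30;   // segment size: 1 GiB
        private final byte[][] segs;
        private final long length;

        SegmentedBytes(long length) {
            this.length = length;
            int n = (int) ((length + SEG - 1) / SEG);
            segs = new byte[n][];
            for (int i = 0; i < n; i++) {
                long remaining = length - (long) i * SEG;
                segs[i] = new byte[(int) Math.min(SEG, remaining)];
            }
        }

        byte get(long i)         { return segs[(int) (i / SEG)][(int) (i % SEG)]; }
        void set(long i, byte b) { segs[(int) (i / SEG)][(int) (i % SEG)] = b; }
        long length()            { return length; }
    }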

btw, single-byte-only strings should work just fine :-) or :-( depending on the character set used.
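For example, in jshell on JDK 9 (started with a big heap, e.g. jshell -R-Xmx8g), the same length that fails for UTF16 content is fine for Latin-1 content, because the compact form stores one byte per char:

    char[] chars = new char[Integer.MAX_VALUE / 2 + 1];
    java.util.Arrays.fill(chars, 'a');   // Latin-1 only
    String ok = new String(chars);       // works: backing byte[] is one byte per char
    chars[0] = '\u0100';                 // any non-Latin-1 char flips it to UTF16
    String boom = new String(chars);     // throws OutOfMemoryError on JDK 9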

Sherman
