Dmitry Nadezhin wrote:
BigInteger is a class implementing arbitrary-precision integers. Of course, this is an idealization. Let's look at the implementation limits of this type.

The bits are stored in an "int[] mag" array. This caps the maximum bit length at 32*(2^31 - 1).
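To put a number on that ceiling (my arithmetic, not taken from the class documentation):

public static void main(String[] args) {
    // The magnitude holds at most Integer.MAX_VALUE words of 32 bits each,
    // so the widest representable value occupies roughly 8 GiB of heap:
    long maxBitsInMemory = 32L * Integer.MAX_VALUE;
    System.out.println(maxBitsInMemory); // prints 68719476704
}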

The "BigInteger.magSerializedForm()" converts bits to "byte[]" array. Serialization of BigInteger will throw an exception on values with bit length larger than 8*(2^31 - 1).

The methods "BigInteger.bitLength()", "BigInteger.bitCount()", "BigInteger.getLowestSetBit()" return result as "int". So they may wrap around when bit length is larger than "2^31 - 1". The method "BigInteger.toByteArray()" can fail on such values because it uses "bitLength()".

In fact, the output of this program
public static void main(String[] args) {
    BigInteger x = BigInteger.ONE.shiftLeft(Integer.MAX_VALUE); // x = 2^Integer.MAX_VALUE
    System.out.println("bitLength=" + x.bitLength());
    try {
        byte[] bytes = x.toByteArray();
    } catch (Exception e) {
        e.printStackTrace(System.out);
    }
}
is:

bitLength=-2147483648
java.lang.NegativeArraySizeException
        at java_math.BigInteger.toByteArray(BigInteger.java:2689)

The bit length of 2^(2^31 - 1) is 2^31, which wraps around to Integer.MIN_VALUE when returned as an "int"; "toByteArray()" then computes a negative array size from that value.

Yes, the internal bookkeeping of BigInteger is not robust when extremely large values are used.

Shouldn't "BigInteger.bitLength()" conform to its specification on all possible values of BigInteger?
This can be achieved in different ways:
a) Don't change the code, but document the wrap-around in the specification of "bitLength()" and the other methods;
b) Change the return type of "bitLength()" from "int" to "long";

Should we break binary compatibility of a commonly used class by changing the return type to fix this problem? Obviously not.

c) "bitLength()" throws an exception when bit length is larger than 2^31 - 1; d) All BigInteger constructors ensure that the bit length is no larger than 2^31 - 1;

d) is arguably the most correct approach to address the problem. However, I think the practical consequences of this flaw are low.
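For illustration, here is a minimal sketch of the range check that c) and d) imply. The names ("MAX_MAG_LENGTH", "checkRange") are hypothetical, not actual BigInteger internals:

final class RangeCheck {
    // Largest mag length whose bit length can still be reported as a
    // non-negative int; for longer arrays, 32*(mag.length - 1) alone
    // already exceeds Integer.MAX_VALUE.
    static final int MAX_MAG_LENGTH = Integer.MAX_VALUE / 32 + 1;

    static void checkRange(int[] mag) {
        // At exactly MAX_MAG_LENGTH words, the bit length overflows only if
        // the top word contributes a full 32 bits, i.e. its sign bit is set.
        if (mag.length > MAX_MAG_LENGTH
                || (mag.length == MAX_MAG_LENGTH && mag[0] < 0)) {
            throw new ArithmeticException(
                    "BigInteger would overflow supported range");
        }
    }
}

Under option d) every constructor would call such a check before building the result; under option c) the check would instead live in "bitLength()" itself and throw only when an out-of-range value is observed.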

-Joe
