Hi,
I don't know if this has been discussed before, but I was looking at
the HashMap implementation today and noticed that there are some
issues with very large hashmaps containing more than
Integer.MAX_VALUE elements.
1.
The Map contract for size() says: "If the map contains more than
Integer.MAX_VALUE elements, returns Integer.MAX_VALUE." The current
implementation just wraps around and returns negative numbers as
you keep adding elements (size++).
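A minimal standalone sketch (my own illustration, not JDK source) of the wrap-around, together with the clamping that a contract-compliant size() would need:

```java
// Demonstrates int overflow: incrementing past Integer.MAX_VALUE
// wraps to Integer.MIN_VALUE, which is what happens to HashMap's
// size field once the element count exceeds 2^31 - 1.
public class SizeWrap {
    public static void main(String[] args) {
        int size = Integer.MAX_VALUE;
        size++; // wraps to Integer.MIN_VALUE
        System.out.println(size); // prints -2147483648

        // A contract-compliant size() would have to clamp, e.g.:
        int reported = size < 0 ? Integer.MAX_VALUE : size;
        System.out.println(reported); // prints 2147483647
    }
}
```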
2.
If the size of a HashMap has wrapped around to a negative value, you
cannot deserialize it correctly, because of this loop in readObject:

    for (int i = 0; i < size; i++) {
        K key = (K) s.readObject();
        V value = (V) s.readObject();
        putForCreate(key, value);
    }
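A small sketch (my own illustration, under the assumption that a negative size value has been read from the stream) of why that loop misbehaves: with a negative bound the loop body never executes, so none of the serialized entries are read back.

```java
// Demonstrates that a for loop with a negative upper bound runs
// zero times: a wrapped-around size read during deserialization
// would leave every serialized map entry unread.
public class NegativeLoop {
    public static void main(String[] args) {
        int size = -5; // hypothetical wrapped-around size from the stream
        int iterations = 0;
        for (int i = 0; i < size; i++) {
            iterations++;
        }
        System.out.println(iterations); // prints 0
    }
}
```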
If someone wants to play around with the size limits of HashMap, I
suggest taking the source code of HashMap and changing the type of
the size field from an int to a short, in which case you can test
this with less than xx GB of heap.
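A quick sketch of why the short-field trick works: a short counter wraps the same way an int does, just at 32,767 instead of roughly 2.1 billion, so the overflow becomes reproducible with a tiny heap.

```java
// Demonstrates short overflow: incrementing past Short.MAX_VALUE
// (32767) wraps to Short.MIN_VALUE (-32768), mirroring the int
// wrap-around at a scale that is cheap to reach in a test.
public class ShortWrap {
    public static void main(String[] args) {
        short size = Short.MAX_VALUE;
        size++; // compound assignment narrows back to short, wrapping
        System.out.println(size); // prints -32768
    }
}
```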
There are probably other map implementations in the JDK with the same issues.
Cheers
Kasper