Charles-François Natali <[email protected]> added the comment:
> So it seems unlikely to be the explanation.
Victor reproduced it on IRC, and it's indeed an overflow.
The problematic code is in readline_file:
"""
bigger = self->buf_size << 1;
if (bigger <= 0) { /* overflow */
PyErr_NoMemory();
return -1;
}
newbuf = (char *)realloc(self->buf, bigger);
if (!newbuf) {
PyErr_NoMemory();
return -1;
}
"""
self->buf_size is an int, which overflows pretty easily:
>>> 196 * 240000
47040000
>>> 196 * 240000 * 8 # assuming 8 bytes per float
376320000
>>> 2**31
2147483648
Hmmm... A byte is 8 bits, which gives:
>>> 196 * 240000 * 8 * 8
3010560000L
>>> 196 * 240000 * 8 * 8 > 2**31
True
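To make the failure concrete, here is a minimal standalone sketch (not the cPickle
code itself; the buf_size value is made up and a 32-bit int is assumed) of what
happens once the buffer doubles past INT_MAX:
"""
#include <stdio.h>

int main(void)
{
    /* Hypothetical buffer size once the growing pickle buffer has passed
       1 GiB -- the exact value is invented for the demo. */
    int buf_size = 1610612736;                 /* 1.5 * 2**30 */

    /* readline_file computes `buf_size << 1`.  The demo goes through
       unsigned arithmetic so the demo itself has no undefined behaviour;
       the wrapped value is what -fwrapv gives the real code. */
    int bigger = (int)((unsigned int)buf_size << 1);

    printf("buf_size = %d\n", buf_size);
    printf("bigger   = %d\n", bigger);         /* negative: it wrapped */

    if (bigger <= 0)
        printf("overflow branch taken -> cPickle raises MemoryError\n");
    return 0;
}
"""
So the MemoryError the reporter sees is the overflow branch firing, not an actual
out-of-memory condition.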
Now, if it works on your box, it's probably because the compiler optimized the
check away: signed overflow is undefined behaviour, so the compiler is allowed to
assume `bigger` stays positive. Since `bigger` is then converted to an unsigned
64-bit size_t when calling realloc(), it happens to work.
Maybe your distro doesn't build python with -fwrapv.
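For reference, here is a hedged illustration of why -fwrapv matters here; whether a
given compiler actually drops the check depends on the compiler and optimization
level, so treat it as a sketch rather than a guaranteed outcome:
"""
/* grow.c -- minimal model of the growth check in readline_file.
 *
 * `size << 1` overflowing a signed int is undefined behaviour in C, so an
 * optimizer that is NOT given -fwrapv may assume that doubling a positive
 * size stays positive and drop the `<= 0` test entirely.  With -fwrapv the
 * overflow is defined to wrap, the result goes negative, and the check
 * reliably fires (the MemoryError the reporter sees).
 *
 * Possible experiment (results depend on the compiler version):
 *   gcc -O2 -S grow.c            # the check may be optimized away
 *   gcc -O2 -fwrapv -S grow.c    # wrapping defined, check kept
 */
int grow(int size)
{
    int bigger = size << 1;
    if (bigger <= 0)             /* "overflow" guard, as in readline_file */
        return -1;
    return bigger;
}
"""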
So, what do you suggest? Should we fix this (switch to Py_ssize_t and check for
overflow before the computation), as in #11564?
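For illustration, a rough sketch of the shape such a fix could take (just an
illustration of the idea, not a tested patch against cPickle.c; the helper name
and signature are made up):
"""
#include "Python.h"   /* Py_ssize_t, PY_SSIZE_T_MAX, PyErr_NoMemory */
#include <stdlib.h>

/* Keep the size in a Py_ssize_t and test for overflow *before* doubling,
 * so `bigger` can never wrap. */
static int
grow_buffer(char **buf, Py_ssize_t *buf_size)
{
    Py_ssize_t bigger;
    char *newbuf;

    if (*buf_size > PY_SSIZE_T_MAX / 2) {   /* doubling would overflow */
        PyErr_NoMemory();
        return -1;
    }
    bigger = *buf_size << 1;
    newbuf = (char *)realloc(*buf, (size_t)bigger);
    if (newbuf == NULL) {
        PyErr_NoMemory();
        return -1;
    }
    *buf = newbuf;
    *buf_size = bigger;
    return 0;
}
"""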
----------
_______________________________________
Python tracker <[email protected]>
<http://bugs.python.org/issue13555>
_______________________________________