Paul Upchurch added the comment:
Regarding the concept of "reasonable": it should be noted that this behaviour
affects code that works with reasonably sized sequences, regardless of how
large the length parameter is.
Consider an extremely large array. To work with such an array, one would
typically break it into small segments. However, to simplify the code and
reduce bugs, it makes sense to use a consistent indexing method on each
segment. The size of the length parameter says nothing about the size of a
segment.
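The segmented access pattern described above can be sketched roughly as follows
(the SEGMENT size and the helper name are illustrative, not from the original):

```python
SEGMENT = 4096  # hypothetical segment size

def segment_slices(start, stop):
    """Yield (segment_id, local_slice) pairs covering the half-open
    range [start, stop) of a large virtual array (step 1 only).

    Each local slice is small regardless of how large start/stop are,
    which is why per-segment code never needs huge indices.
    """
    while start < stop:
        seg = start // SEGMENT
        seg_end = (seg + 1) * SEGMENT  # global index just past this segment
        yield seg, slice(start % SEGMENT, min(stop, seg_end) - seg * SEGMENT)
        start = min(stop, seg_end)
```

Even if start and stop are astronomically large, every slice this yields fits
comfortably within one segment.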
Consider a class which implements virtual arrays.
    def __getitem__(self, key):
        # key is a slice object; indices() normalizes start/stop/step
        # against the virtual length of 12600000000 elements
        start, stop, step = key.indices(12600000000)
        while True:
            if step > 0 and start >= stop: break
            if step < 0 and start <= stop: break
            p = pageid(start)
            make_page_resident(p)
            # ... do work on the element at index start ...
            start = start + step
As you can see, slice.indices should not be limited to sys.maxsize. If Python
can perform the arithmetic calculation sys.maxsize+1, then
slice.indices(sys.maxsize+1) should also work. The purpose of slice.indices is
to ensure consistent behaviour of the slicing operator. Another workaround for
this bug:
5. write your own implementation of slice.indices
I consider this a workaround. The correct way to handle the index parameter to
__getitem__ and __setitem__ is to use slice.indices; that way, if the semantics
of slicing change in future versions of Python, your class will behave
consistently. It seems to me that this is the main reason slice.indices exists
at all: to prevent inconsistent behaviour when people implement __getitem__
and __setitem__.
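As a minimal, hedged sketch of that recommendation, here is a virtual sequence
whose __getitem__ delegates slice normalization to slice.indices (the class
name and the stand-in data it returns are illustrative only):

```python
class VirtualArray:
    """Sketch of a fixed-length virtual sequence that normalizes
    slices with slice.indices(), matching built-in list semantics."""

    def __init__(self, length):
        self._length = length

    def __len__(self):
        return self._length

    def __getitem__(self, key):
        if isinstance(key, slice):
            # slice.indices() clamps start/stop and resolves negative
            # indices and the step, exactly as built-in sequences do.
            start, stop, step = key.indices(self._length)
            return list(range(start, stop, step))  # stand-in for real data
        # Single integer index: resolve negatives, then bounds-check.
        if key < 0:
            key += self._length
        if not 0 <= key < self._length:
            raise IndexError(key)
        return key  # stand-in for the element at that index
```

Because all the clamping lives in slice.indices, out-of-range slices behave
just like they do on a list: VirtualArray(10)[2:100] is clamped to indices
2 through 9 rather than raising.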
----------
_______________________________________
Python tracker <[email protected]>
<http://bugs.python.org/issue14794>
_______________________________________