On Sat, Feb 24, 2018 at 5:24 AM, Chris Barker <chris.bar...@noaa.gov> wrote:
> On Thu, Feb 22, 2018 at 6:21 PM, Nick Coghlan <ncogh...@gmail.com> wrote:
>>
>> > (I wonder if the discrepancy is due to some internal interface that
>> > loses the distinction between None and 1 before the decision is made
>> > whether to use advanced slicing or not. But that's a possible
>> > explanation, not an excuse.)
>>
>> That explanation seems pretty likely to me, as for the data types
>> implemented in C, we tend to switch to the Py_ssize_t form of slices
>> pretty early, and that can't represent the None/1 distinction.
>>
>> Even for Python level collections, you lose the distinction as soon as
>> you call slice.indices (as that promises to return a 3-tuple of
>> integers).
>
>
> If this is the case -- backward compatibility issues aside, wouldn't it be
> very hard to fix?
>
> Which means that should be investigated before going too far down the "how
> much code might this break" route.
>
> And certainly before adding a Deprecation Warning.
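
To make Nick's point above concrete at the Python level: once
slice.indices() has run, a step of None and a step of 1 are already
indistinguishable (output from a current CPython, shown purely as
illustration):

    >>> slice(None, None, None).indices(10)
    (0, 10, 1)
    >>> slice(None, None, 1).indices(10)
    (0, 10, 1)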

Ignoring backward compatibility, it ought to be possible to (ab)use a
stride of zero for this. Calling slice.indices() with a stride of
zero raises ValueError, so there's no ambiguity. But it
would break code that iterates in a simple and obvious way, and (ugh
ugh) break it in a very nasty way: an infinite loop. I'm not happy
with that kind of breakage, even with multiple versions posting a
warning.
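
For reference, slice.indices() itself already rejects a zero step:

    >>> slice(None, None, 0).indices(10)
    Traceback (most recent call last):
      ...
    ValueError: slice step cannot be zero

and the "simple and obvious" iteration I mean is roughly the following
(a hypothetical user-code sketch, not anything in the stdlib):

    def naive_iter(seq, start, stop, step):
        # Walks a forward slice by hand.  With step == 0 the index
        # never advances, so this loops forever instead of raising.
        i = start
        while i < stop:
            yield seq[i]
            i += step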

In the C API, there's PySlice_GetIndices ("[y]ou probably do not want
to use this function") and PySlice_GetIndicesEx, the "[u]sable
replacement". Much as I dislike adding *yet another* function to do
basically the same job, I think that might be the less-bad way to do
this.
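
In Python terms, the semantics such a function would need are roughly
these (a sketch only; the name and signature are invented for
illustration, not a concrete proposal):

    def get_indices_keeping_step(sl, length):
        # Like slice.indices(), but report the step exactly as the
        # user wrote it, so a caller can still tell s[::] from s[::1].
        start, stop, _ = sl.indices(length)
        return start, stop, sl.step  # None stays None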

ChrisA