> I agree that 1-based indexing works pretty well with closed-right
> intervals; forgot to mention that in my first response. Even with such
> an approach, certain things are less obvious, e.g. the number of
> elements in an interval a..b is b-a+1, not b-a.
Yes, but in C too, when going from a[0] to a[N-1], people know there are (N-1)-0+1 elements (so, b-a+1). It is the same. Now, why is

for (iterator from a[0] to a[N-1]) { /* etc. */ }

acceptable (let the above notation stand for for(i=0; i<=N-1; i++)), but suddenly it is no longer acceptable to write a[for(iterator from 0 to N-1)], and one must instead use a[for(iterator from 0 to N)] in order to achieve exactly the same thing? The last two expressions are just mental placeholders for a[0..N-1] and a[0..N] respectively.

> All in all D doesn't attempt to break new ground with open-right
> intervals. It would be gratuitously and jarringly different from all of
> its major influencers. Though I understand the choice looks odd coming
> from Matlab, the same could be said about a lot of other languages.

I don't see that ground. Maybe I simply lack information. Can you help?