There is a (very) small argument for unsigned array indexes in C/C++ where 
indexing is from zero, but as Stefan says, for Julia, with indexing 
starting at one, you have to test both limits anyway.
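
To make that concrete, here is a minimal Julia sketch (the names `v` and `validindex` are mine, not from the thread) of how unsigned wrap-around turns a would-be negative index into a huge positive one, so both limits have to be tested either way:

```julia
# Minimal sketch: why an unsigned index type doesn't remove the need
# for bounds checks with Julia's 1-based indexing.

v = [10, 20, 30]

# A "negative" unsigned value wraps around to a huge positive one
# (modular arithmetic), rather than going below zero:
i = UInt(1) - UInt(2)        # wraps to typemax(UInt), not -1

# So a lower-bound test alone can't catch the error; you need both
# limits, exactly as you would with a plain Int index:
validindex(x) = 1 <= x <= length(v)

validindex(UInt(2))          # in range
validindex(i)                # false: wrap-around made it "too large"
```

Note that indexing `v[i]` directly would throw a `BoundsError` for the wrapped value anyway, which is Stefan's point about arrays inheriting the check.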

On Tuesday, September 22, 2015 at 1:11:03 AM UTC+10, Stefan Karpinski wrote:
>
> Suppose you have a situation where any UInt value is valid. Since only 
> valid values can be represented, this implies that you cannot validate 
> inputs at all – it's impossible to represent any invalid values. So you're 
> definitely not doing anything useful to catch or report errors.
>
> Suppose that non-negativity isn't the only criterion for validity. E.g. 
> array indices: only integral values from 1 to length(v) are valid. That 
> means you need to check bounds anyway, which means the UInt type doesn't 
> help at all since you still need to check that the UInt value is ≥ 1 and ≤ 
> length(v). In fact, using UInt mostly means that you can't easily tell if 
> you tried to use a value that is too large or too small – arithmetic 
> wrap-around means you'll almost always get values that are too large.
>
> Using unsigned integers for error checking is a bogus argument. They hide 
> errors rather than helping surface or debug them. If you index into an 
> array and you give an invalid index, you'll get a bounds error. Any type 
> you build on top of arrays will inherit this checking. If there are 
> additional constraints that your APIs have on valid inputs, they should 
> check them.
>
> On Mon, Sep 21, 2015 at 8:05 AM, Milan Bouchet-Valat <nali...@club.fr> wrote:
>
>> Le dimanche 20 septembre 2015 à 17:07 -0400, Jesse Johnson a écrit :
>> > In a thread about printing UInt variables, Milan Bouchet-Valat said:
>> > > The point is, in Julia using unsigned ints to store values that
>> > > should
>> > > always be positive is *not* recommended.
>> > If that is true, then shouldn't the type be called Byte? It seems the
>> > type has been misnamed if it was never intended to store unsigned
>> > integers.
>> >
>> > Further, calling the type UInt is misleading to devs from the C
>> > lang family who frequently depend on compile-time type checking
>> > (ex. int vs. uint) to help ensure no unexpected signs show up. I
>> > am not suggesting type-checking is a perfect defense against sign
>> > errors, and thorough runtime testing is definitely necessary. In
>> > my larger projects combining type checking and runtime tests is
>> > almost a practical necessity and can seriously cut down on time
>> > spent bug hunting sign errors.
>> >
>> > That said, I am guessing the suggested solution in Julia is to rely
>> > solely on runtime sign checking? I can't see how I could make that
>> > practical for my use cases, but it would be good to know if that is
>> > what the Julia devs intend.
>> I think so. One of the references I could find is this:
>> https://groups.google.com/d/msg/julia-users/RX8sFQHvEV4/ttxfYufL7WUJ
>>
>>
>> Regards
>>
