On Wed, Apr 19, 2017 at 05:56:18PM +0000, Stanislav Blinov via 
Digitalmars-d-learn wrote:
> On Wednesday, 19 April 2017 at 17:34:01 UTC, Jonathan M Davis wrote:
> > Personally, I think that we should have taken the stricter approach
> > and not had integral types implicit convert to character types, but
> > from what I recall, Walter feels pretty strongly about the
> > conversion rules being the way that they are.
> 
> Yep, me too. Generally, I don't think that an implicit conversion (T :
> U) should be allowed if T.init is not equivalent to U.init.

Me three.

Implicit conversion of int to char/dchar/etc. is a horrible, horrible
idea that leads to hard-to-find bugs. The whole point of having char
as a separate type from ubyte, unlike in C where char pretty much
means byte/ubyte, is to keep the two distinct, not to keep blurring
the distinction with implicit conversions in the typical C manner.
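
To illustrate the kind of silent conversion I mean -- a minimal
sketch, the names are mine and not from any real code:

    import std.stdio;

    void printChar(char c) { writeln("char: ", c); }

    void main()
    {
        // An int literal that happens to fit in 0..255 converts to
        // char implicitly -- no cast, no warning:
        char c = 65;
        writeln(c);     // prints "A", not "65"

        // Same thing in a call: a routine expecting a character will
        // happily accept a number that was never meant to be one.
        printChar(65);  // also prints "A"
    }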

Personally, I would rather have arithmetic on char (wchar, dchar)
produce results of the same type, so that no implicit conversions are
necessary.  It makes total sense to me that you should have to ask
explicitly for a character's numerical value -- it documents the
code's intent, and it often helps the programmer keep a clear head and
avoid mistakes that would otherwise inevitably creep in. A few extra
keystrokes to type cast(int) or cast(char) ain't gonna kill nobody. In
fact, it might even save a few people by preventing certain kinds of
bugs.
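
For what it's worth, the explicit style already compiles today under
the current rules; the proposal is only that it become the required
form. A rough sketch (naming is mine):

    import std.stdio;

    void main()
    {
        char letter = 'C';

        // Ask for the numeric value explicitly -- the cast documents
        // that we really do want the code point as a number:
        int code = cast(int) letter;           // 67

        // Going back after arithmetic already needs the cast anyway,
        // since char + int yields int under the current promotion
        // rules:
        char next = cast(char)(letter + 1);    // 'D'

        writeln(code, " ", next);
    }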


T

-- 
Gone Chopin. Bach in a minuet.
