On May 17, 11 03:58, Steven Schveighoffer wrote:
On Mon, 16 May 2011 15:47:55 -0400, KennyTM~ <kenn...@gmail.com> wrote:

On May 17, 11 02:25, Steven Schveighoffer wrote:
On Mon, 16 May 2011 13:51:55 -0400, Steven Schveighoffer
<schvei...@yahoo.com> wrote:

Currently, this works:

void foo(dchar i)
{
}

void main(string[] args)
{
foo(args.length);
}

Damn, this originally started out as argc and argv, and I forgot how D
accepts arguments, so I switched it to this. Unsigned ints are
convertible to dchar, but signed ones are not (except for a couple of
cases, which don't make sense).

For example, this fails:

dchar c = -1;
foo(-1);
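
The asymmetry can be seen side by side. A minimal sketch, reflecting the compiler behavior as described in this thread (`foo` is the `dchar`-parameter function from above):

```d
import std.stdio;

void foo(dchar c)
{
    writefln("U+%04X", cast(uint) c);
}

void main()
{
    uint u = 0x41;
    foo(u);          // compiles: uint implicitly converts to dchar
    // foo(-1);      // rejected: the compiler sees the literal -1 is
                     // outside dchar's range (0 .. 0x10FFFF)
    int i = 0x41;
    foo(i);          // also compiles, which is the complaint here:
                     // i's value is not known at compile time
}
```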


This fails because the compiler can check at compile time that
0xffff_ffff is > 0x10_ffff....

That is not what the error suggests:

Error: cannot implicitly convert expression (-1) of type int to dchar

Seems like it's the type that's failing. At the very least, the error
message should stress it's the value, not the type, that is the culprit.

If I change the literal to 0x10_ffff, indeed it passes, and 0x11_0000
fails.

I think this is probably the right behavior; it can be useful to use
binary or hex literals to construct characters.


But this passes:

int i = -1;
dchar c = i;

....but this one cannot be checked at compile time. 'dchar' should be
treated as lower-rank than 'int', with value-range propagation applied to it.

I'm not sure what you mean; I think the dchar init statement should
fail. I think this is what you are saying, right?


Right.

Note in the bug report previously referenced, the compiler blindly
accepts -1 as a dchar argument.

-Steve

The compiler blindly accepts an 'int' as a 'dchar' argument, not just the literal -1. For instance,

void main(){
    string ret;
    ret ~= 0x10ffff;  // ok
    ret ~= 0x110000;  // Error: cannot append type int to type string
    int i = 0x110000;
    ret ~= i;         // ok, but should fail at compile time
}

The issue is also that an 'int' shouldn't be implicitly convertible to a 'dchar'.
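
Until the implicit conversion is tightened, the check the compiler cannot do can be spelled out at run time. A minimal sketch with a hypothetical helper (`toDchar` is not a library function; it also deliberately skips the finer point that surrogate code points D800-DFFF are not valid characters):

```d
import std.exception : enforce;

/// Hypothetical helper: validate at run time what the compiler
/// cannot prove at compile time, then cast explicitly.
dchar toDchar(int i)
{
    enforce(i >= 0 && i <= 0x10FFFF, "value out of dchar range");
    return cast(dchar) i;
}

void main()
{
    assert(toDchar(0x41) == 'A');

    bool threw = false;
    try toDchar(-1);             // -1 is caught here, not silently accepted
    catch (Exception e) threw = true;
    assert(threw);
}
```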
