On 10/22/2011 02:14 PM, Jacob Carlborg wrote:
On 2011-10-21 20:38, Walter Bright wrote:
On 10/21/2011 2:51 AM, Martin Nowak wrote:
You have a good point here. I would have immediately thrown out the
loop AFTER profiling.
What hits me here is that I had an incorrect program with built-in
Unicode-aware strings.
This is counterintuitive to correct Unicode handling throughout the
std library, and even more so to the complementary operation of
appending any char type to strings.

I understand the issue, but I don't think it's resolvable. It's a lot
like the signed/unsigned issue. Java got rid of it by simply not having
any unsigned types.

Can't we implement a new string type that people can choose to use if
they want? It would hide all the Unicode details that have been brought
up in this thread.


Having multiple standard string types is bad. Furthermore, it is hard to meaningfully hide all the Unicode details. Not even immutable(dchar)[] necessarily encodes one character as one code unit.
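To illustrate the last point: even a string of full code points (what immutable(dchar)[] gives you in D) does not guarantee one code unit per user-perceived character, because combining marks occupy their own code points. A short sketch in Python for illustration (Python 3 strings are sequences of code points, so they model the same situation):

```python
import unicodedata

# 'e' followed by U+0301 COMBINING ACUTE ACCENT: two code points,
# but rendered as the single character "é".
s = "e\u0301"
print(len(s))  # 2 -- two code units even in a code-point string

# NFC normalization folds the pair into the precomposed code point U+00E9,
# but normalization cannot always do this: many combining sequences have
# no precomposed form, so the mismatch is unavoidable in general.
nfc = unicodedata.normalize("NFC", s)
print(len(nfc))          # 1
print(nfc == "\u00e9")   # True
```

So a hypothetical "hide the details" string type would still have to choose whether indexing means code units, code points, or grapheme clusters.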
