On Wednesday, 5 September 2018 at 07:48:34 UTC, Chris wrote:
import std.array : array;
import std.stdio : writefln;
import std.uni : byCodePoint, byGrapheme;
import std.utf : byCodeUnit;

void main()
{
    string first = "á";
    writefln("%d", first.length);         // prints 2

    auto firstCU = "á".byCodeUnit;        // type is `ByCodeUnitImpl` (!)
    writefln("%d", firstCU.length);       // prints 2

    auto firstGr = "á".byGrapheme.array;  // type is `Grapheme[]`
    writefln("%d", firstGr.length);       // prints 1

    auto firstCP = "á".byCodePoint.array; // type is `dchar[]`
    writefln("%d", firstCP.length);       // prints 1

    dstring second = "á";
    writefln("%d", second.length);        // prints 1 (That was easy!)

    // DMD64 D Compiler v2.081.2
}
So Unicode in D works EXACTLY as expected, yet people in this thread act as if the house is on fire.
D dying because of auto-decoding? Who could possibly think that in their right mind?
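For anyone who has lost track of what auto-decoding actually means in practice, here is a minimal sketch (nothing beyond standard Phobos behaviour): indexing and .length on a `string` operate on UTF-8 code units, while the std.range primitives decode elements to `dchar` code points on the fly.

import std.range.primitives : ElementType, front;
import std.stdio : writeln;

void main()
{
    string s = "á"; // two UTF-8 code units, one code point

    // Indexing and .length operate on code units (char):
    writeln(s.length); // prints 2
    static assert(is(typeof(s[0]) == immutable(char)));

    // Range primitives auto-decode to dchar code points:
    writeln(s.front);  // prints á
    static assert(is(ElementType!string == dchar));
}

That is the whole controversy: the element type you get depends on whether you index the string or iterate it as a range.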
The worst part of this forum is that suddenly everyone, by virtue of posting in a newsgroup, is an anointed language design expert.
Let me break it to you: the core developers are the language experts. The rest of us are users, and that does not necessarily make us qualified to design a language.