On Wed, 5 Oct 2016 06:35:52 +0000, Martin Mueller wrote:

[…]

> That said, given that alphabets have fixed numbers, it’s weird
> that bits of super and subscripted letters appear in this or
> that limited range but that you can’t cobble a whole alphabet
> together in a consistent manner.
Indeed your point looked good to me, and it does again. Hereʼs why:

> If any, why not all, especially
> if there are only two or three dozen.

Phonetic notation typically uses Latin script as its basis. Just as mathematics uses bold, italic, script, sans-serif, and double-struck styles, phonetics uses superscript, subscript, and small capitals. From a Unicode viewpoint, phonetics is no less important than mathematics. Yet mathematicians have been granted more than a dozen complete or near-complete alphabets of preformatted characters, while phoneticists have never been granted a single complete one. They must always prove their needs in detail, whereas mathematicians have full liberty in choosing their variables.

My working hypothesis, for the time being, is that the gap kept in the superscript lowercase range is intended to limit what plain text can do. I donʼt quite see how to apply Hanlonʼs razor here, because there seems to be a strong unwillingness to let people get keyboards that would allow them to write this in plain text without being bound to high-end software. The goal seems to be to keep users dependent on a special formatting feature and to steer them away from simplicity. This shows clearly in the weird arguments that were raised against the proposal of *MODIFIER LETTER SMALL Q: the comment submitted on behalf of Adobe bore only a slight resemblance to a comment on the proposal as such, […].

Trying to sum up: encoding these few characters would indeed throw a door wide open. However, it has since been pointed out that there would be *no rush* through that door.

Regards,

Marcel
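[The gap discussed above can be checked directly against the Unicode character database. Below is a minimal Python sketch, not part of the original message; the output depends on the Unicode version bundled with your interpreter (the superscript q was missing in 2016 but has since been added, in Unicode 14.0's Latin Extended-F block), and it also checks the older SUPERSCRIPT LATIN SMALL LETTER names used for i and n.]

    import unicodedata

    # For each ASCII letter, look for a preformatted superscript form.
    # Most are named "MODIFIER LETTER SMALL <X>"; i and n were encoded
    # earlier under "SUPERSCRIPT LATIN SMALL LETTER <X>" instead.
    CANDIDATE_NAMES = (
        "MODIFIER LETTER SMALL {}",
        "SUPERSCRIPT LATIN SMALL LETTER {}",
    )

    for letter in "abcdefghijklmnopqrstuvwxyz":
        found = None
        for pattern in CANDIDATE_NAMES:
            name = pattern.format(letter.upper())
            try:
                found = (unicodedata.lookup(name), name)
                break
            except KeyError:
                continue
        if found:
            char, name = found
            print(f"{letter}: U+{ord(char):04X}  {name}")
        else:
            print(f"{letter}: not encoded")

[Run against a 2016-era database, this prints "not encoded" only for q, which is the gap the message refers to.]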