From: "Pim Blokland" <[EMAIL PROTECTED]>

> On a related note, can anybody tell me why U+212A Kelvin sign was
> put in the Unicode character set?

I don't have a definitive answer, but there may have been two distinct
encodings in a legacy "charset" which made the difference. I suspect
this occurred in Chinese or Japanese, where the Latin letters used for
words are represented as half-width letters, and a separate full-width
letter was introduced to represent SI units (the half-width form being
discouraged when appearing after a number).
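For what it's worth, the half-width/full-width distinction itself did
survive into Unicode as compatibility characters. A minimal Python
sketch, purely illustrative and not evidence about the kelvin sign's
origin:

    import unicodedata

    # U+FF2B FULLWIDTH LATIN CAPITAL LETTER K, encoded for round-trip
    # compatibility with legacy East Asian charsets.
    fullwidth_k = "\uFF2B"
    print(unicodedata.name(fullwidth_k))           # FULLWIDTH LATIN CAPITAL LETTER K
    print(unicodedata.decomposition(fullwidth_k))  # <wide> 004B (compatibility mapping)
    print(unicodedata.normalize("NFKC", fullwidth_k) == "K")  # True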

If someone can find a legacy charset where such a distinction existed,
or some justification for why it was introduced in the early editions
of Unicode, I'd like to know (now that its use is clearly
discouraged).
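In any case, the character's status is easy to check from the UCD data
shipped with Python's unicodedata module (a quick sketch):

    import unicodedata

    kelvin = "\u212A"
    print(unicodedata.name(kelvin))           # KELVIN SIGN
    # A canonical *singleton* decomposition to U+004B LATIN CAPITAL LETTER K:
    print(unicodedata.decomposition(kelvin))  # 004B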

> I have never seen any acknowledgement of this symbol anywhere in the
> real world. (That is, using U+212A instead of U+004B.)
> And even the UCD calls it a letter rather than a symbol. I'd expect
> if it was put in for completeness, to complement the degrees
> Fahrenheit and degree Celcius, it would have had the same category
> as those two?

Both degree Fahrenheit and degree Celsius use the same "degree" symbol
also used for angular units, but it is followed by an ordinary letter:
°F or °C (the qualifying letter is often omitted in contexts where the
distinction is unnecessary, notably in Europe, where °F is almost never
used or understood, for instance in weather reports, localized web
portals, and national and regional newspapers).
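As for the General_Category question: the single-codepoint forms
U+2103 DEGREE CELSIUS and U+2109 DEGREE FAHRENHEIT do exist, and the
UCD does classify them differently from the kelvin sign. A quick check:

    import unicodedata

    for ch in ("\u2103", "\u2109", "\u212A"):
        print(f"U+{ord(ch):04X} {unicodedata.name(ch)}: "
              f"{unicodedata.category(ch)}")
    # U+2103 DEGREE CELSIUS: So
    # U+2109 DEGREE FAHRENHEIT: So
    # U+212A KELVIN SIGN: Lu   <- a letter, as noted above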

Kelvins are most often written without the degree sign, in line with
SI conventions, even though the unit is sometimes incorrectly called
"degree Kelvin" and abbreviated as °K. Given that kelvins are used
mostly in scientific contexts, there's no reason to keep this informal
notation when SI simply uses "K".

I suppose the difference might have been needed to disambiguate
scientific text transmitted in uppercase-only form on limited devices
unable to represent lowercase letters, where a text like "KJ/K" would
have been difficult to interpret as "kJ/K", i.e. kilojoules per
kelvin. However, this justification is difficult to maintain, as the
same limited devices (such as LCD displays) would not have been able
to show the difference between a letter "K" and a kelvin sign either.
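A small sketch of that hypothetical scenario, just to show what the
distinction would buy (the strings below are made up):

    # Hypothetical: if the unit were encoded as U+212A, it would survive
    # an uppercase-only transmission distinctly, unlike the prefix "k" in "kJ".
    text = "12 kJ/\u212A"         # kilojoules per kelvin, unit as KELVIN SIGN
    shouted = text.upper()        # -> "12 KJ/\u212A" (U+212A has no uppercase mapping)
    print("\u212A" in shouted)    # True: the kelvin is still distinguishable
    print("\u212A" in "12 KJ/K")  # False: with a plain K, the information is gone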

I think this has more to do with some Japanese applications, which may
at some time have assumed that all units of measure use a symbol
rather than a letter, in order to parse a text like "160K" as a single
measurement token instead of a number followed by a separate token. Or
the unit may have been represented phonetically in katakana with a
composed square glyph (like many other units, notably currencies).
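Purely as an illustration of that guess (the pattern below is
hypothetical, not taken from any actual Japanese application): a
parser that reserves a dedicated codepoint for the unit could attach
U+212A to the number while leaving a plain letter K alone:

    import re

    # Hypothetical tokenizer: a digit run followed immediately by the
    # dedicated unit codepoint U+212A forms one measurement token.
    MEASURE = re.compile(r"\d+\u212A")

    print(MEASURE.findall("160\u212A"))  # ['160K']  (the K here is U+212A)
    print(MEASURE.findall("160K"))       # []  (plain letter K, no match)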

As SI clearly defines and recommends the common notation for SI units,
there's no good justification for maintaining such a distinction: the
kelvin is defined with just the Latin capital letter K as its standard
symbol, and there is no need to localize this international unit
(whose name is inherited from a person's name and is invariably
recognized under a single name, sometimes with just a localized plural
form).
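And the standard already folds the distinction away: because the
kelvin sign has a canonical decomposition, every normalization form
maps it back to a plain K. A quick check with unicodedata:

    import unicodedata

    # All four normalization forms map U+212A to U+004B, so the
    # distinction cannot survive in normalized text anyway.
    for form in ("NFC", "NFD", "NFKC", "NFKD"):
        print(form, unicodedata.normalize(form, "100 \u212A") == "100 K")
    # -> True for all four forms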

