I think Jill has a point.
Kenneth Whistler wrote:
Basically, thousands of implementations, for decades now,
have been using ASCII 0x30..0x39, 0x41..0x46, 0x61..0x66 to
implement hexadecimal numbers. That is also specified in
more than a few programming language standards and other
standards.
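Those three ASCII ranges make digit-to-value conversion a matter of simple arithmetic. A minimal Python sketch (the helper name is mine, not from any standard):

```python
def hex_digit_value(ch: str) -> int:
    """Return the value 0..15 of an ASCII hex digit, using the ranges
    0x30..0x39 ('0'-'9'), 0x41..0x46 ('A'-'F'), 0x61..0x66 ('a'-'f')."""
    o = ord(ch)
    if 0x30 <= o <= 0x39:        # '0'..'9' -> 0..9
        return o - 0x30
    if 0x41 <= o <= 0x46:        # 'A'..'F' -> 10..15
        return o - 0x41 + 10
    if 0x61 <= o <= 0x66:        # 'a'..'f' -> 10..15
        return o - 0x61 + 10
    raise ValueError(f"not an ASCII hex digit: {ch!r}")

print(hex_digit_value("A"))  # -> 10
```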
From: Pim Blokland [EMAIL PROTECTED]
On a related note, can anybody tell me why U+212A Kelvin sign was
put in the Unicode character set?
I don't have a definitive answer, but it may be that two separate encodings
existed in a legacy charset, which made the difference. I suspect this
occurred in Chinese or
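Whatever its legacy origin, U+212A KELVIN SIGN carries a singleton canonical decomposition to U+004B, so normalization folds it back to the ordinary letter K. A quick check with Python's `unicodedata` (a sketch):

```python
import unicodedata

kelvin = "\u212A"
assert unicodedata.name(kelvin) == "KELVIN SIGN"
# Singleton canonical decomposition: even NFC maps it to LATIN CAPITAL LETTER K,
# because singletons are excluded from recomposition.
assert unicodedata.normalize("NFC", kelvin) == "K"
# The simple case mapping folds it to an ordinary lowercase k as well.
assert kelvin.lower() == "k"
print("U+212A normalizes to plain K")
```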
From: John Cowan [EMAIL PROTECTED]
If you have ever wondered if you are in hell,
it has been said, then you are on a well-traveled
road of spiritual inquiry. If you are absolutely
sure you are in hell, however, then
John Cowan
http://www.ccil.org/~cowan
http://www.reutershealth.com
John Cowan beat me to the punch with some of this, but anyway...
Pim Blokland pblokland at planet dot nl wrote:
On 16/08/2003 13:14, Doug Ewell wrote:
You could make a case for proposing numeric values of 10 through 15 to
be added to U+0041 through U+0046 and U+0061 through U+0066, based on
their undeniably widespread use as hexadecimal digits. (No, I don't
want to get into a debate about the word digit
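The status quo Doug Ewell describes is easy to demonstrate: A through F carry no Unicode numeric value today, and hex interpretation comes from the radix argument of a conversion routine, not from character properties. A Python sketch:

```python
import unicodedata

# 'A' has no Unicode numeric value; asking for one raises ValueError.
try:
    unicodedata.numeric("A")
    has_numeric = True
except ValueError:
    has_numeric = False

assert not has_numeric
# Hexadecimal meaning is supplied by the radix, not the character database.
assert int("A", 16) == 10
assert int("f", 16) == 15
print("A..F are hex digits only by convention, not by numeric property")
```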
On 16/08/2003 14:57, Herbert Elbrecht wrote:
well -
here is what I get
on Mac OS X 10.2.6
with Apple Mail:
Herbert
# # #
1,
A,
I actually get the same in the viewing window of Mozilla 1.4 on Windows
2000, and now in the compose window, but the display was different when I
On 2003.08.14, 00:52, António Martins-Tuválkin
[EMAIL PROTECTED] wrote:
If the dollar sign can be used for currencies other than the USD,
even for some whose name is not even dollar, then I suppose there is
a theoretical possibility that it may be used as a symbol for the euro cent
(though I
On 2003.08.16, 15:32, Philippe Verdy [EMAIL PROTECTED] wrote:
I suppose that the difference was needed to disambiguate scientific
text transmitted in uppercase-only form on poor devices unable to
represent lowercase letters, and where a text like KJ/K would
have been difficult to interpret as
On 2003.08.14, 05:24, John Cowan [EMAIL PROTECTED] wrote:
António Martins-Tuválkin scripsit:
Some habits are indeed language dependent, but some others are just
tradition (some of it imposed as logic and correct decades ago, like
compulsive caseless singular for SI units in speech), and
From: Peter Kirk [EMAIL PROTECTED]
But I am not suggesting that this problem is sufficiently serious to
justify encoding a new set of hex digits.
The idea of separately encoding digits used to express numbers in any
base other than decimal is insane.
Think of it the following way: numbers
Adam Twardoch list dot adam at twardoch dot com wrote:
Seriously: since no writing system natively uses hexadecimal digits
(except for a bunch of crazy programmers), there is no reason to encode
them.
For an example of a constructed, and NOT AT ALL widely used, writing
system that does include
On Saturday, August 16, 2003 5:03 PM, Peter Kirk wrote:
I wonder if this is a real, legitimate and
non-pathological case where there might be a difference: hex digits
embedded in Hebrew text, followed by a comma. In modern Hebrew numbers
are written with European digits LTR embedded in
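The asymmetry behind Peter Kirk's Hebrew example is visible in the characters' bidirectional categories: 0 through 9 are European Number (EN), which the bidi algorithm embeds as an LTR run inside RTL text, while A through F are ordinary Left-to-Right letters (L), and a trailing comma is a Common Separator (CS). A mixed hex number can therefore reorder differently from a decimal one. A quick look via Python's `unicodedata` (a sketch):

```python
import unicodedata

# Bidi categories of a decimal digit, a hex letter, a comma, and Hebrew alef.
for ch in ("7", "A", ",", "\u05D0"):
    print(f"U+{ord(ch):04X} {ch!r}: bidi category {unicodedata.bidirectional(ch)}")

assert unicodedata.bidirectional("7") == "EN"       # European Number
assert unicodedata.bidirectional("A") == "L"        # Left-to-Right letter
assert unicodedata.bidirectional(",") == "CS"       # Common Separator
assert unicodedata.bidirectional("\u05D0") == "R"   # Right-to-Left (alef)
```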
From: António Martins-Tuválkin [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Saturday, August 16, 2003 10:22 PM
Subject: Re: Handwritten EURO sign (off topic?)