Digits (Was: What a difference a glyph makes...)

2000-07-26 Thread Valeriy E. Ushakov
On Wed, Jul 26, 2000 at 12:02:15 -0800, [EMAIL PROTECTED] wrote:

> This reminds me of "Are DIGIT SEVEN and DIGIT SEVEN WITH STROKE
> distinct characters?" Yeah, our decimal number system has at least
> thirteen digits:
> DIGIT ONE

Add another ONE here: digit one with bottom stroke: /| _|
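The thread's point can be checked against the Unicode character database: there are many distinct decimal-digit characters (roughly one DIGIT SEVEN per script), but no separately encoded "DIGIT SEVEN WITH STROKE"; the stroked seven is a glyph variant of U+0037 left to the font. A minimal sketch in Python using the standard `unicodedata` module (the particular sample characters are my own choice, not from the thread):

```python
import unicodedata

# Distinct DIGIT SEVEN characters, one per script -- but no separate
# "stroked" seven: that distinction lives in the font, not the encoding.
sevens = [
    "\u0037",  # DIGIT SEVEN (ASCII)
    "\u0667",  # ARABIC-INDIC DIGIT SEVEN
    "\u09ED",  # BENGALI DIGIT SEVEN
    "\u0E57",  # THAI DIGIT SEVEN
]

for ch in sevens:
    # Every one of these carries the decimal value 7 in the UCD.
    print(f"U+{ord(ch):04X}  {unicodedata.name(ch):30}  value={unicodedata.decimal(ch)}")
```

Each of these is a genuinely distinct character with its own code point, which is what makes "at least thirteen digits" an understatement rather than a joke.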

RE: Digits (Was: What a difference a glyph makes...)

2000-07-26 Thread Figge, Donald
. . . and still another digit one, non-tabular, for fine typography. And, of course, there are the old-style digits.

Don

RE: Digits (Was: What a difference a glyph makes...)

2000-07-27 Thread 11digitboy

RE: Digits (Was: What a difference a glyph makes...)

2000-07-27 Thread Peter_Constable
>2) Has Unicode code-points for bold, italic, etc. text?

No.

>Sometimes that is important to the meaning of a text.

So is language (what does "chat" mean? idle talk in English, "cat" in French). Colour, point size, typeface, layout on the page - lots of things can be important to the meaning of a text that are not encoded in Unicode.

Re: Digits (Was: What a difference a glyph makes...)

2000-07-27 Thread John Cowan
[EMAIL PROTECTED] wrote:

> The line gets drawn somewhere, and there's a very strong consensus
> that Unicode is right in not having abstract characters to denote
> things like bold and italic.

Except in math symbols, where Unicode will soon acquire them. In math, _sin_ would be the product of _s_, _i_, and _n_.
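In hindsight this is exactly what happened: Unicode 3.1 added the Mathematical Alphanumeric Symbols block (U+1D400..U+1D7FF), with bold, italic, script, and other styled letters and digits intended for mathematical use only. A short Python sketch of how those characters behave (written from today's Unicode, not from anything in the thread):

```python
import unicodedata

# Styled letters and digits exist only for mathematics, in the
# Mathematical Alphanumeric Symbols block (U+1D400..U+1D7FF).
bold_A = "\U0001D400"      # MATHEMATICAL BOLD CAPITAL A
bold_seven = "\U0001D7D5"  # MATHEMATICAL BOLD DIGIT SEVEN

print(unicodedata.name(bold_A))
print(unicodedata.name(bold_seven))

# The styling is compatibility information only: NFKC normalization
# folds the styled characters back down to the plain ones.
assert unicodedata.normalize("NFKC", bold_A) == "A"
assert unicodedata.normalize("NFKC", bold_seven) == "7"
```

The NFKC behaviour is the encoding-level expression of the distinction Cowan draws: for ordinary text the boldness is presentation and normalizes away, while in a mathematical formula the styled character is semantically distinct (bold **A** and italic *A* can name different objects).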

Re: Digits (Was: What a difference a glyph makes...)

2000-07-27 Thread Peter_Constable
On 07/27/2000 02:12:20 PM John Cowan wrote:

>> The line gets drawn somewhere, and there's a very strong consensus
>> that Unicode is right in not having abstract characters to denote
>> things like bold and italic.
>
>Except in math symbols, where Unicode will soon acquire them. In
>math, _sin_