On 12/7/2016 7:22 PM, Mikhail V wrote:
> On 8 December 2016 at 01:13, Nick Timkovich <prometheus...@gmail.com> wrote:
>> Out of curiosity, why do you prefer decimal values to refer to Unicode code
>> points? Most references, http://unicode.org/charts/PDF/U0400.pdf (official)
>> or https://en.wikibooks.org/wiki/Unicode/Character_reference/0000-0FFF ,
>> prefer to refer to them by hexadecimal as the planes and ranges are broken
>> up by hex values.

> Well, there was a huge discussion in October; see the subject name.
> I just didn't want it to go in that direction again.
> In short: hex notation is not so readable, and decimal is in any case
> the standard way to represent numbers. I treat a string as a number array
> when I am processing it, so hex is simply redundant and not needed for me.
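[Editor's note: the "string as a number array" workflow described above can be sketched as follows; the sample string and variable names are illustrative, not from the thread.]

```python
# Viewing a string as an array of code points, in decimal and in hex.
s = "abc"
decimal_view = [ord(c) for c in s]       # decimal code points
hex_view = [hex(ord(c)) for c in s]      # the same values in hex notation
print(decimal_view)   # [97, 98, 99]
print(hex_view)       # ['0x61', '0x62', '0x63']
```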

I sympathize with your preference, but ... Perhaps the hex numbers would bother you less if you thought of them as 'serial numbers'. It is standard for serial numbers to include letters, and it is common for digit-letter serial numbers to have meaningful fields, as do the hex versions of Unicode serial numbers. The decimal versions are meaningless except as strict sequencers.
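[Editor's note: a small illustration of the "meaningful fields" point; the character choices are mine. In hex, the Unicode block a character belongs to is visible in the leading digits (Cyrillic is U+0400..U+04FF), while the decimal form hides it.]

```python
# Hex code points expose block structure; decimal values give no such hint.
for ch in "ДЖя":
    cp = ord(ch)
    print(f"{ch}  decimal={cp}  hex=U+{cp:04X}")
# prints:
# Д  decimal=1044  hex=U+0414
# Ж  decimal=1046  hex=U+0416
# я  decimal=1103  hex=U+044F
# All three hex values share the 04xx prefix marking the Cyrillic block;
# the decimal values 1044, 1046, 1103 show no common field.
```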

--
Terry Jan Reedy


_______________________________________________
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/
