On Fri, Dec 9, 2016 at 5:37 AM, Mikhail V <mikhail...@gmail.com> wrote:
>> You have to show
>> that decimal isn't just marginally better than hex; you have to show
>> that there are situations where the value of decimal character
>> literals is so great that it's worth forcing everyone to learn two
>> systems. And I'm not convinced you've even hit the first point.
>
> Frankly I don't fully understand your point here.

Let me clarify. When you construct a string, you can already use
escapes to represent characters:

"n\u0303" --> n followed by combining tilde

In order to be consistent with other languages, Python *has* to
support hexadecimal. Plus, Python has _already_ supported hex for some
time. To establish decimal as an alternative, you have to demonstrate
that it is worth having ANOTHER way to do this.
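To make the existing situation concrete, here is a small sketch of the hex-based escape forms Python already accepts (plus the name-based form); all of these denote code points in hexadecimal or by Unicode name, and none in decimal:

```python
# Escape forms Python already supports in string literals:
s1 = "n\u0303"       # \uXXXX: 4 hex digits -> n + U+0303 COMBINING TILDE
s2 = "\x41"          # \xXX: 2 hex digits -> "A"
s3 = "\U0001F600"    # \UXXXXXXXX: 8 hex digits -> an emoji, one code point
s4 = "\N{LATIN SMALL LETTER N}\N{COMBINING TILDE}"  # \N{...}: by Unicode name

print(s1 == s4)      # the hex and named forms build the same string
```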

With completely green-field topics, you can debate the merits of one
notation against another, and the overall best one will win. But when
there's a well-established existing notation, you have to justify the
proliferation of notations. You have to show that your new format is
*so much* better than the existing one that it's worth adding it in
parallel. That's quite a high bar - not impossible, obviously, but you
need some very strong justification. At the moment, you're showing
minor advantages to decimal, and other people are showing minor
advantages to hex; but IMO nothing yet has been strong enough to
justify the implementation of a completely new way to do things -
remember, people have to understand *both* in order to read code.
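For illustration: there is no decimal escape in string literals today, so the closest a reader can get to "decimal notation" is going through chr(). The round trip below shows that the same code point already has to be recognized in both bases, which is exactly the two-systems cost being described (the specific code point here is just an example, not anything from the proposal):

```python
# One code point, two notations a reader would have to correlate:
hex_form = "\u0303"   # hex escape, as Python writes it today
dec_form = chr(771)   # decimal route: 0x0303 == 771

print(hex_form == dec_form)   # same character either way
print(ord(hex_form))          # 771 -- the decimal value
print(hex(ord(dec_form)))     # 0x303 -- the hex value
```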

ChrisA
_______________________________________________
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/