On Sun, Nov 28, 2010 at 7:04 PM, Antoine Pitrou <[email protected]> wrote:
> On Sun, 28 Nov 2010 15:58:33 -0500
> Alexander Belopolsky <[email protected]> wrote:
>
>> On Sun, Nov 28, 2010 at 3:43 PM, Antoine Pitrou <[email protected]> wrote:
>> ..
>> >> For example,
>> >> I don't think that supporting
>> >>
>> >> >>> float('١٢٣٤.٥٦')
>> >> 1234.56
>> >>
>> >> is more important than to assure users that once their program
>> >> accepted some text as a number, they can assume that the text is
>> >> ASCII.
>> >
>> > Why would they assume the text is ASCII?
>>
>> def deposit(self, amountstr):
>>     self.balance += float(amountstr)
>>     audit_log("Deposited: " + amountstr)
>>
>> Auditor:
>>
>> $ cat numbered-account.log
>> Deposited: ?????.??
>
>
> I'm not sure that's how banking applications are written :)
>
+1 for this being bogus - I see no reason whatsoever to require that
numbers inside Unicode text be "ASCII" once we have cleared all the
technical barriers that used to force that behavior. ASCII is an
oversimplification of human communication, needed only by computing
devices not capable enough to represent it fully.

Let novice C programmers in English-speaking countries deal with the
fact that one character is no longer one byte. We are past that point.
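For what it's worth, a program that wants ASCII-only audit logs doesn't need float() itself to reject non-ASCII digits; it can normalize the input first. A minimal sketch (the helper name `to_ascii_amount` is made up for illustration) using the stdlib `unicodedata` module, which maps any Unicode decimal digit to its 0-9 value:

```python
import unicodedata

def to_ascii_amount(amountstr: str) -> str:
    """Return an ASCII rendering of a numeric string.

    float() happily parses non-ASCII decimal digits (e.g. the
    Arabic-Indic '١٢٣٤.٥٦' from earlier in the thread), so code that
    wants ASCII-only logs must normalize or reject such input itself.
    """
    out = []
    for ch in amountstr:
        if ch.isascii():
            out.append(ch)
        else:
            # unicodedata.decimal() returns 0-9 for any Unicode decimal
            # digit, and raises ValueError for anything else.
            out.append(str(unicodedata.decimal(ch)))
    return "".join(out)

print(float('١٢٣٤.٥٦'))           # 1234.56 -- float() accepts the digits
print(to_ascii_amount('١٢٣٤.٥٦'))  # '1234.56' -- safe to write to the log
```

So the audit-log concern is addressable at the application level, without restricting what float() accepts.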

  js
  -><-



> Antoine.
> _______________________________________________
> Python-Dev mailing list
> [email protected]
> http://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: 
> http://mail.python.org/mailman/options/python-dev/jsbueno%40python.org.br
>
