Lada Kugis wrote:
[snip]
Normal integers are up to 10 digits, after which they become long
integers, right?

But if integers can be exactly represented, then why do we need two
types of integers (long and ... uhmm, let's say, normal)? I mean,
their error will always be zero, no matter which kind they are.

In Python 2, 'int' is limited to the machine word size (typically 32
bits) but is faster; 'long' is slower but has arbitrary precision,
limited only by available memory. The decision was made that, with the
speed of modern CPUs, things could be simplified by dropping the
fixed-size 'int' and using just 'long', though renamed to 'int', in
Python 3.x.
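
For illustration, here's roughly what that looks like at the
interactive prompt (the Python 2 half assumes a 32-bit build, where
the boundary falls at 2**31 -- i.e. just past 10 decimal digits,
which is where the "10 digits" impression comes from):

    # Python 2 (32-bit build): ints silently promote to longs on overflow
    >>> type(2 ** 31 - 1)
    <type 'int'>
    >>> type(2 ** 31)                # one past the fixed-size int limit
    <type 'long'>

    # Python 3: a single arbitrary-precision type, always called 'int'
    >>> type(2 ** 31)
    <class 'int'>
    >>> 10 ** 100 + 1 - 10 ** 100    # exact at any size; the error is zero
    1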