On 26/03/2018 03:35, Richard Damon wrote:
> On 3/25/18 9:37 PM, bartc wrote:

>> So the overhead /can/ be substantial, and /can/ be significant compared with doing bignum calculations.

>> Of course, once initialised, C might be used a hundred times, so the overhead becomes less significant. But it is not small enough to simply dismiss.

> And my point is that writing a program just to add or multiply two FIXED big long numbers (since they are in the source code, they seem fixed) a million times seems unlikely (and even then the cost isn't that bad, since that sounds like a run-once program).
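
To put rough numbers on that amortisation argument, here is a minimal timeit sketch (the digit count and repeat counts are arbitrary illustrations, and absolute timings will vary by machine):

    import timeit

    # Separate the one-off string->int parse cost from the per-use
    # arithmetic cost on a bignum.
    digits = "7" * 1000                  # a 1,000-digit decimal string

    parse = timeit.timeit("int(digits)", globals=globals(), number=10_000)
    C = int(digits)                      # parse once...
    add = timeit.timeit("C + C", globals=globals(), number=10_000)  # ...use many times

    print(f"parse x10,000: {parse:.4f}s")
    print(f"add   x10,000: {add:.4f}s")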

Similar overheads occur when you use string-to-int conversion even on small numbers:

This code:

    C = int("12345")
    D = C+C      # or C*C; about the same results

takes 5 times as long (using my CPython 3.6.x on Windows) as:

    C = 12345
    D = C+C
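
For anyone who wants to reproduce that measurement, a sketch along these lines should do (exact ratios will vary by machine and CPython version):

    import timeit

    # Each statement string is compiled once, then executed a million times.
    with_int = timeit.timeit('C = int("12345"); D = C + C', number=1_000_000)
    literal = timeit.timeit('C = 12345; D = C + C', number=1_000_000)

    print(f"with int(): {with_int:.3f}s")
    print(f"literal:    {literal:.3f}s")
    print(f"ratio:      {with_int / literal:.1f}x")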

Your arguments that this doesn't really matter would equally apply here.

Yet you don't see Python code full of 'int("43")' instead of just '43', justified on the basis that the extra overhead is insignificant because the program might only be run once.

A slightly worrying attitude in a language that has issues with performance, but no longer a surprising one.

--
bartc
