On 26/03/2018 00:27, Richard Damon wrote:
On 3/25/18 8:32 AM, bartc wrote:

Using CPython on my machine, doing a string-to-int conversion of that specific number took 200 times as long as doing a normal assignment.

That conversion took 4 microseconds.

Not significant if it's only done once. But it might be executed a million times.
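
A quick way to see that kind of ratio is with timeit (the number below is just an illustrative stand-in, not the one under discussion, and exact figures will depend on the machine):

import timeit

# Illustrative ~40-digit stand-in; not the exact number discussed above.
num_str = "2887148238050771212671429597261817569929"

# Literal: the digits are converted to an int once, when the code is compiled,
# so the timed statement is just a constant load plus a store.
t_literal = timeit.timeit(f"C = {num_str}", number=1_000_000)

# int(): the string has to be parsed on every execution.
t_convert = timeit.timeit(f"C = int('{num_str}')", number=1_000_000)

print(f"literal: {t_literal:.3f}s   int(str): {t_convert:.3f}s")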


The other half of that thought is how the 4 microseconds to create the constant compare to the operations USING that number. My guess is that for most things the usage will swamp the initialization, even if that is somewhat inefficient.

Calling a function that sets up C using 'C = 288714...' on one line, and then calculates D=C+C, takes 0.12 seconds for 1,000,000 calls.

Doing D=C*C instead takes 2.2 seconds (I've subtracted the function-call overhead of 0.25 seconds, since in real code there might not be any function call).

If I instead initialise C using 'C = int("288712...")', then timings increase as follows:

D=C+C:  0.12 seconds  =>  3.7 seconds
D=C*C:  2.2 seconds   =>  5.9 seconds
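
Something along these lines with timeit should reproduce the shape of those figures (a minimal sketch; the digits are an illustrative stand-in for the elided number, and absolute times will differ by machine):

import timeit

# Illustrative stand-in for the (elided) constant in the post.
NUM = "2887148238050771212671429597261817569929"

setup = f"""
def add_literal():
    C = {NUM}            # converted once, at compile time
    return C + C

def add_converted():
    C = int("{NUM}")     # parsed from the string on every call
    return C + C

def mul_literal():
    C = {NUM}
    return C * C

def mul_converted():
    C = int("{NUM}")
    return C * C
"""

for call in ("add_literal()", "add_converted()",
             "mul_literal()", "mul_converted()"):
    print(call, timeit.timeit(call, setup=setup, number=1_000_000))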

So the overhead /can/ be substantial, and /can/ be significant compared with doing bignum calculations.

Of course, once initialised, C might be used a hundred times, in which case the overhead is less significant. But it is not small enough to just dismiss.

--
bartc