Brian Quinlan wrote:

> On Dec 11, 2010, at 6:44 AM, Thomas Nagy wrote:
>
>> I have also observed a minor performance degradation with the executor
>> replacement (3 seconds for 5000 work items). The number of work items
>> processed per unit of time does not seem to be a straight line:
>> http://www.freehackers.org/~tnagy/runtime_futures_2.png
>
> That looks pretty linear to me.

Close to, but not quite. The graph seems to be slightly curved, with the amount of work done per second decreasing for large amounts of work. Assuming that this performance degradation is real, and not an artifact of the measurement technique, it seems to be quite small. I'd be happy to describe it as "linear" in the same way we describe dictionary lookups as constant-time, even though technically that's not strictly true. (They're linear in the number of items with a matching hash, and there are probably other complications as well.)
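
For anyone who wants to check whether the effect is real or a measurement artifact, a minimal timing sketch along these lines might do (this assumes a trivial no-op task and a ThreadPoolExecutor with five workers; Thomas's actual harness may well differ):

import time
from concurrent.futures import ThreadPoolExecutor

def task():
    # Trivial no-op work item; substitute the real workload here.
    return None

def measure(n_items, workers=5):
    # Wall-clock time to submit and wait on n_items futures.
    with ThreadPoolExecutor(max_workers=workers) as executor:
        start = time.time()
        futures = [executor.submit(task) for _ in range(n_items)]
        for f in futures:
            f.result()
        return time.time() - start

if __name__ == "__main__":
    for n in (1000, 2000, 3000, 4000, 5000):
        print(n, measure(n))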

As drawn, the curve seems to fall away like a log graph, which might suggest to the casual viewer that this is a good thing. It may be better to reverse the axes, that is, to put the independent variable (number of tasks) on the horizontal axis and the dependent variable (cost, i.e. time taken) on the vertical axis. That will make it clear that the incremental cost of doing one extra task increases (slightly) as the number of tasks goes up.
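
Something like this would put the axes the way I have in mind (the numbers are made-up placeholders, purely to show the layout; substitute the real measurements):

import matplotlib.pyplot as plt

# Made-up placeholder measurements -- replace with the real data.
tasks = [1000, 2000, 3000, 4000, 5000]
seconds = [0.6, 1.2, 1.9, 2.5, 3.2]

plt.plot(tasks, seconds, marker="o")
plt.xlabel("number of tasks (independent variable)")
plt.ylabel("time taken in seconds (dependent variable)")
plt.title("Cost vs. number of tasks")
plt.savefig("runtime_futures_reversed_axes.png")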



--
Steven
