On 08.03.10 01:18, Paweł Banyś wrote:
Hello,

I have already read about Python and the multiprocessing module, which allows
using many processors. The idea is to split a program into separate tasks and
run each of them on a separate processor. However, I want to run a Python
program doing a single simple task on many processors, so that their
cumulative power is available to the program as if there were one huge
CPU instead of many separate ones. Is it possible? How can it be achieved?


That's impossible to answer without knowing anything about your actual task. Not everything is parallelizable, and some algorithms suffer performance penalties if parallelization is overdone.

So in essence, what you've read already covers it: if your "simple task" can be divided into several independent sub-tasks that don't have to be serialized against each other, multiprocessing is your friend.
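
For example, assuming the task can be expressed as applying one function independently to many inputs (the function and inputs below are only placeholders), a minimal sketch with multiprocessing.Pool looks like this:

import multiprocessing

def work(n):
    # stand-in for the real per-item computation; it must be a module-level
    # function so it can be pickled and sent to the worker processes
    return sum(i * i for i in range(n))

if __name__ == '__main__':
    inputs = [1000000] * 16
    pool = multiprocessing.Pool()       # one worker process per CPU core by default
    results = pool.map(work, inputs)    # sub-tasks run in parallel, results come back in order
    pool.close()
    pool.join()
    print(sum(results))

Each worker is a separate process with its own interpreter, so CPU-bound work really does spread across cores, but the sub-tasks must not depend on each other's intermediate results.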

Diez
