On Sat, Dec 6, 2008 at 9:43 PM, Damon Timm <[EMAIL PROTECTED]> wrote:
> The last piece of my puzzle though, I am having trouble wrapping my
> head around ... I will have a list of files
> ["file1.flac","file2.flac","file3.flac","etc"] and I want the program
> to tackle compressing two at a time ... but not more than two at a
> time (or four, or eight, or whatever) because that's not going to help
> me at all (I have dual cores right now) ... I am having trouble
> thinking how I can create the algorithm that would do this for me ...

A simple way to do this would be to use poll() instead of wait(). Then
you can check both processes for completion in a loop and start a new
process when one of the current ones ends. You could keep the active
processes in a list. Make sure you put a sleep() in the polling loop,
otherwise the loop will consume your CPU! (There is a rough sketch of
this at the end of this message.)

Another approach is to use a thread pool with one worker thread for each
process. Each thread calls wait() on its child process; when the child
finishes, the thread takes a new task off the queue. (Also sketched at
the end.) There are several thread pool recipes in the Python cookbook,
for example:

http://code.activestate.com/recipes/203871/
http://code.activestate.com/recipes/576576/ (this one has many links to
other pool implementations)

Kent
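
Here is a rough, untested sketch of the poll() idea, assuming the stock
"flac" command-line encoder and a made-up file list; swap in whatever
command and files you are actually using:

import subprocess
import time

files = ["file1.flac", "file2.flac", "file3.flac"]   # files still to do
max_procs = 2                                        # one per core
active = []                                          # running Popen objects

while files or active:
    # poll() returns None while the child is still running
    active = [p for p in active if p.poll() is None]

    # start new children until we reach the limit
    while files and len(active) < max_procs:
        name = files.pop(0)
        # placeholder command; replace with your real compression step
        active.append(subprocess.Popen(["flac", "--best", "-f", name]))

    time.sleep(0.5)   # don't spin at 100% CPU while waiting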
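
And a rough sketch of the thread pool idea, using only threading and
queue from the standard library (written for Python 3; the queue module
is named Queue in Python 2). Again the flac command is just a
placeholder:

import subprocess
import threading
import queue

def worker(tasks):
    while True:
        try:
            name = tasks.get_nowait()
        except queue.Empty:
            return                     # no more files, thread exits
        # placeholder command; replace with your real compression step
        p = subprocess.Popen(["flac", "--best", "-f", name])
        p.wait()                       # blocks this thread only
        tasks.task_done()

tasks = queue.Queue()
for name in ["file1.flac", "file2.flac", "file3.flac"]:
    tasks.put(name)

workers = [threading.Thread(target=worker, args=(tasks,)) for _ in range(2)]
for t in workers:
    t.start()
for t in workers:
    t.join()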