It was nice to meet people at the pub yesterday and play Set!

This isn't really Perl-specific, it's Unix specific, but I plan to
implement the solution in Perl and you seem like a good crowd to ask.

What's the cleanest way to make sure that at most N processes are doing
X at once, and that anyone else wishing to do X blocks until one of
those N has finished?

Context: I'm writing a new version of the TrustFlow trust metric for
LiveJournal ( http://www.gothboffs.co.uk/trustflow/trustflow.pl ).
The CGI behind it all forks off a sub-process to do the actual work of
calculating your list.  However, to prevent the machine from getting
overloaded, it refuses to do so if five such processes are already running.

The existing way of doing this is a hack: I have a directory with five
lock files, and the CGI tries a non-blocking lock on each of the five
in turn before proceeding.  If it can't lock any of them, it returns an
error to the user ("this machine is overloaded, sorry!").
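For reference, a minimal sketch of that hack (the lock directory and
file names here are my assumptions, not the real paths):

    use Fcntl qw(:flock);

    # Try a non-blocking exclusive lock on each of five lock files in turn.
    # If none can be locked, five workers are already running.
    my $lockdir = '/tmp/trustflow-locks';     # assumed path
    my $slot_fh;
    my $got_slot = 0;
    for my $i (0 .. 4) {
        open my $fh, '>', "$lockdir/slot$i" or next;
        if (flock $fh, LOCK_EX | LOCK_NB) {
            $slot_fh  = $fh;                  # keep the handle open to hold the lock
            $got_slot = 1;
            last;
        }
        close $fh;
    }
    die "this machine is overloaded, sorry!\n" unless $got_slot;

    # ... do the work; the lock is released when $slot_fh is closed
    # or the process exits.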

This is a bit hacky but basically works.  However, I now want to break
part of the work of these calculation processes into sub-processes that
can run in parallel, and I want to do something similar for those
sub-processes, with one change: if there are already N sub-processes
running, I don't want to abort as before - I want to block until one of
them finishes and I can start another.

This is hard.  I could choose a lock at random and wait for that, but
this is suboptimal - I want to wait until *any* of the processes
finishes, not until one particular one does.
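To make that concrete, the random-lock idea would look something like
this (same assumed lock files as above); the problem is that the chosen
slot may stay busy long after some other slot has freed up:

    use Fcntl qw(:flock);

    # Suboptimal: pick one lock file at random and block on it.
    # We might sit waiting on a busy slot while another slot is already free.
    my $lockdir = '/tmp/trustflow-locks';     # assumed path
    my $i = int rand 5;
    open my $fh, '>', "$lockdir/slot$i" or die "can't open lock file: $!";
    flock $fh, LOCK_EX or die "flock: $!";    # blocks until that particular slot frees
    # ... start the next sub-process, keeping $fh open to hold the slot ...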

Any ideas?
-- 
  __  Paul Crowley
\/ o\ [EMAIL PROTECTED]
/\__/ http://www.ciphergoth.org/
