Jeff Westman wrote:
I'm posed with a problem and looking for suggestions for a possible
resolution. I have a script with many steps in it, including telnet & ftp
sessions, database unloads, and other routines. The script runs on one
server, accessing a remote server, and this works fine. I will likely have
several dozen (maybe as many as 100) iterations of this script running
simultaneously. The problem is that there is a "bottleneck" toward the end
of the script -- I have to call a 3rd-party process that is single-threaded.
This means that even with ~100 copies of my script running, only one at a
time can execute the 3rd-party software. It is very likely that multiple
copies will arrive at this bottleneck at the same time, and if more than one
calls the third-party program, one will run and the other will lose and die.


So I am looking for suggestions on how I might attack this problem.  I've
thought about building some sort of external queue (like a simple hash file).
The servers have names like server_01, server_02, etc.  When an iteration
of the script reaches this point, it writes its server name to the file,
pauses, then checks if any other iteration is running the third-party
software.  If one is running, it waits, with its server name at the top of
the file queue.  A problem, again, is that two or more copies may want to
update this queue file at the same time, so I thought maybe a random wait
period before writing to the file queue.
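A random wait only makes collisions less likely; it can never rule them out. One way to make the queue-file update itself safe is to take an exclusive flock() on the file before appending, so the operating system serializes the writers. A minimal sketch (the file path and server name here are made up, not from the post):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Hypothetical queue file and server name -- adjust for your setup.
my $queue_file  = '/tmp/thirdparty.queue';
my $server_name = 'server_01';

# Open for append, then take an exclusive lock before writing.
# flock() blocks until the lock is granted, so two writers can
# never interleave -- no random sleeps needed.
open my $fh, '>>', $queue_file or die "Cannot open $queue_file: $!";
flock($fh, LOCK_EX)            or die "Cannot lock $queue_file: $!";
print {$fh} "$server_name\n";
close $fh;    # closing the handle releases the lock
```

Note that flock() coordinates processes on the same machine; if your ~100 iterations live on different hosts, the queue file has to sit somewhere they can all lock it reliably (NFS locking is famously unreliable for this).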

I'm open to other ideas.  (Please don't suggest we rename or copy the
third-party software; it just isn't possible.)  I'm not looking for code,
per se, but ideas I can implement that will guarantee I will only ever have
one copy of the external third-party software running (including pre-checks,
queues, etc.).
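One common way to get that guarantee without hand-rolling a queue file at all is to let the operating system do the queueing: every copy of the script tries to take an exclusive flock() on an agreed-upon lock file just before the third-party step, and blocks until it gets the lock. A sketch, assuming all copies run on the same machine (the paths and command below are placeholders):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Placeholder lock-file path -- not from the original post.
my $lock_file = '/var/tmp/thirdparty.lock';

open my $lock, '>', $lock_file or die "Cannot open $lock_file: $!";

# LOCK_EX blocks here until no other process holds the lock, so
# exactly one copy of the script proceeds at a time.  If a holder
# crashes, the kernel releases its lock automatically -- no stale
# queue entries to clean up.
flock($lock, LOCK_EX) or die "Cannot lock $lock_file: $!";

# ... run the single-threaded third-party program here, e.g.:
# system('/path/to/thirdparty') == 0 or warn "third party failed: $?";

close $lock;    # releases the lock for the next waiter
```

The main caveat is that flock() only serializes processes on one host; if the iterations are spread across servers, the lock (or a small wrapper script holding it) needs to live on the machine where the third-party tool actually runs.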


Currently I am implementing a system with similar features.  Initially we
developed a set of 3 queues: a pre-processor that handles many elements
simultaneously; a middle queue (which, incidentally, handles external
encryptions/decryptions) that is "very" slow (seconds rather than milli-
or microseconds); and a final queue that handles sending of files via
FTP/SMTP, which can be "very very" slow (hours, depending on FTP timeout
limits... grrr, I know...).

For this we were looking for essentially an event-based state machine
concept, which (thank god) led my searching to POE (since I keep mentioning
it, this is why): http://poe.perl.org  After getting over the POE learning
curve, developing my queues was a snap.  Because of business decisions we
have since moved to a 9-queue system (inbound/outbound sets, plus a
post-processing queue, plus a reroute queue (don't ask)).

Essentially a similar setup would work for you, where your "middle" queue
would have a threshold of 1 (aka only one process at a time).  In our case
all of the stages are allowed to have multiple copies running, but we limit
the number of simultaneous encryption processes because of load rather than
correctness.  You may also want to have a look at the Event module on CPAN;
it provides similar but lower-level functionality.
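To give a feel for what a POE stage with a threshold of 1 might look like, here is a rough sketch (this is not the production code described above; the job names and handler names are invented). A session keeps pending work in its heap and only ever dispatches the next job after the current one finishes:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POE;

POE::Session->create(
    inline_states => {
        _start => sub {
            my ($kernel, $heap) = @_[ KERNEL, HEAP ];
            # Pending work; in a real system jobs would arrive
            # from the earlier queue stages.
            $heap->{jobs} = [ 'job_a', 'job_b', 'job_c' ];
            $kernel->yield('next_job');
        },
        next_job => sub {
            my ($kernel, $heap) = @_[ KERNEL, HEAP ];
            my $job = shift @{ $heap->{jobs} } or return;  # queue drained
            # Threshold of 1: the next job is only scheduled
            # after this handler is done with the current one.
            print "processing $job\n";
            # ... run the single-threaded external step here ...
            $kernel->yield('next_job');
        },
    },
);

POE::Kernel->run();
```

For a real external program you would want POE::Wheel::Run rather than a blocking call inside the handler, so the session stays responsive while the child works.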


I can provide more details about the implementation of our system and the development of our queues if you wish, but much to my dismay I cannot provide source.  Hopefully this will get you started in any case; be sure to check out the POE examples, particularly the multi-tasking process example.

http://danconia.org




