Greetings,

        I'm in the process of writing a large network monitoring system in
Perl, and I want to be sure I'm headed in the right direction.

        I have a large MySQL database containing all of the items that need
monitoring.  Each row holds exactly one monitoring type (although I
would love to be able to combine these efficiently).
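For concreteness, here's roughly how I picture pulling the rows for one
monitoring type with DBI.  The database, table, and column names below
are invented for illustration:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect( 'DBI:mysql:database=netmon;host=localhost',
                        'monitor', 'secret', { RaiseError => 1 } );

# One row per item to monitor; each row carries exactly one
# monitoring type.
my $rows = $dbh->selectall_arrayref(
    'SELECT host, check_args FROM monitors WHERE check_type = ?',
    { Slice => {} },    # return an array of hash refs
    'ping',
);

print "$_->{host}: $_->{check_args}\n" for @$rows;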

        One of the tables will contain the individual monitoring types and
the name of the program that processes each of them.  I'd like a
centralized system that spawns those processes and monitors them to
ensure they're running correctly.  I want to hand each process the
information it needs to work on, rather than having it contact the
database and retrieve the data on its own.  This is where I'm stuck.
The data a process needs can be fairly large, and I'd rather not fall
back to dumping it into an external text file.  Is there a way to push
a block of memory from one process to another, or some other efficient
way to give the new process the data it needs?  Part of the main
program will be a throttling system that breaks the data down into
bite-size chunks based on processor usage, running time, and memory
usage.  So, to throttle the processes properly, I need to be able to
pass some command line parameters in addition to the data chunk...
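
One approach I've been considering is the list form of pipe-open, which
forks and execs the handler with its STDIN tied to a pipe: the
throttling knobs ride on the command line while the data chunk streams
over the pipe, so nothing ever touches the disk.  (For a structured
Perl data set, Storable's freeze/thaw could serialize it over the same
pipe, and true shared memory exists via IPC::SysV or IPC::Shareable.)
All of the names below are invented for illustration:

#!/usr/bin/perl
use strict;
use warnings;

# Spawn one handler for one bite-size chunk.  $program would come from
# the monitor-types table, @throttle_args are the command line knobs,
# and the chunk itself streams to the child's STDIN.
sub spawn_worker {
    my ( $program, $chunk, @throttle_args ) = @_;

    # List-form pipe open: forks, then execs $program with
    # @throttle_args, STDIN tied to $to_child.  No shell involved,
    # no temporary file created.
    open( my $to_child, '|-', $program, @throttle_args )
        or die "Can't spawn $program: $!";

    # Stream the work unit down the pipe, one record per line.
    print {$to_child} "$_\n" for @$chunk;

    close $to_child
        or warn "$program exited with status $?\n";
}

spawn_worker(
    '/usr/local/bin/ping_monitor',                 # handler program
    [ 'host1.example.com', 'host2.example.com' ],  # data chunk
    '--max-runtime=30', '--nice=10',               # throttling knobs
);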

        Has anyone attempted anything like this?  Do I have a snowball's
chance?  :)

-- 
---------------------------
Jason 'XenoPhage' Frisvold
Senior ATM Engineer
Penteledata Engineering
[EMAIL PROTECTED]
RedHat Certified - RHCE # 807302349405893
---------------------------
"Something mysterious is formed, born in the silent void. Waiting alone
and unmoving, it is at once still and yet in constant motion. It is the
source of all programs. I do not know its name, so I will call it the
Tao of Programming."

