Hi Magen,

Brief design points for a similar home-grown product that has been in 
production for 10 years:

a) define a DB2 "workload" table where each batch process will post the DS-name 
of a dataset containing the "new work"; in our case this table is populated by 
the last step of the NDM transmission JCL [or ANY other job that wants a 
sequential file to be "processed"], so whenever a new batch file is created, 
its name is inserted into the DB2 table
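A minimal sketch of point a), using Python with sqlite3 as a self-contained 
stand-in for DB2 (the table and dataset names here are illustrative, not the 
real product's):

```python
import sqlite3

# Hypothetical stand-in for the DB2 "workload" table; in the real product
# this table is written by the last step of the NDM transmission JCL (or
# any other job that wants a sequential file processed).
def create_workload_table(conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS workload ("
        " dsname TEXT PRIMARY KEY,"       # DS-name containing the "new work"
        " status TEXT DEFAULT 'NEW')"     # NEW -> IN_PROGRESS -> DONE
    )

def post_work(conn, dsname):
    # What the final job step does: announce a freshly created batch file.
    conn.execute("INSERT INTO workload (dsname) VALUES (?)", (dsname,))
    conn.commit()

conn = sqlite3.connect(":memory:")
create_workload_table(conn)
post_work(conn, "PROD.NDM.INBOUND.G0001V00")   # illustrative DS-name
```

The status column is the key design choice: it lets the scanner claim a row 
(NEW to IN_PROGRESS) so the same file is never processed twice.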

b) develop a long-running multi-tasking job with a single task that 
periodically scans the above DB2 table looking for "new work" and, once some 
is found, attaches yet another subtask to "process" that particular "work"; in 
our case this subtask merely copies the original sequential file to an MQ 
queue drained by multiple CICS draining tasks, while the whole job stays up 
for a full week before getting recycled; one more subtask is designed to 
process operator commands that pause, resume, stop transmission, etc. 
[actually, there are 2 such subtasks; the second one "listens" on an MQ queue 
designed to be populated from a CICS screen]
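The scanner/worker shape of point b) can be sketched with Python threads 
standing in for z/OS subtasks, an in-memory list for the DB2 workload table, 
and queue.Queue for the MQ queue; every name below is illustrative:

```python
import queue
import threading
import time

def process_work(dsname, mq):
    # Worker subtask: "copy" the file's records to the MQ queue.
    for record in ("REC1", "REC2"):        # placeholder for reading the dataset
        mq.put((dsname, record))

def scanner(workload, mq, stop):
    # Scanner subtask: periodically look for new work, attach a worker per item.
    workers = []
    while not stop.wait(timeout=0.05):     # "periodically scan"
        while workload:
            dsname = workload.pop(0)
            t = threading.Thread(target=process_work, args=(dsname, mq))
            t.start()
            workers.append(t)
    for t in workers:                      # drain workers before recycling
        t.join()

workload = ["PROD.FILE.A", "PROD.FILE.B"]  # simulated workload table rows
mq = queue.Queue()
stop = threading.Event()                   # stands in for an operator STOP command
scan_task = threading.Thread(target=scanner, args=(workload, mq, stop))
scan_task.start()
time.sleep(0.2)                            # let the scanner find the work
stop.set()                                 # operator issues "stop"
scan_task.join()
```

In the real product the stop Event would instead be a subtask fielding 
operator commands (pause, resume, stop), plus a second one listening on an MQ 
queue fed from a CICS screen.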

c) you can also add much more granular control to the above by having some 
external "rules" [in our case DB2 tables] that would dictate exactly how to 
recognize and manage a particular data type [how to validate data headers, what 
MQ queue(s) to choose, how to recover after a failure, etc.]
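Point c) amounts to a rules lookup keyed by data type. A hedged sketch, with 
a Python dict standing in for the DB2 rules tables and all rule values 
invented for illustration:

```python
# Hypothetical externalized "rules", keyed by data type; in the real product
# these live in DB2 tables, so behavior changes need no code change.
RULES = {
    "CLAIMS": {
        "header_prefix": "HDRCLM",        # how to validate the data header
        "queue": "PROD.CLAIMS.INPUT",     # which MQ queue to choose
        "on_failure": "requeue",          # how to recover after a failure
    },
    "PAYMENTS": {
        "header_prefix": "HDRPAY",
        "queue": "PROD.PAYMENTS.INPUT",
        "on_failure": "alert",
    },
}

def route(data_type, header):
    rule = RULES[data_type]
    if not header.startswith(rule["header_prefix"]):
        return rule["on_failure"]         # bad header: apply the recovery rule
    return rule["queue"]                  # good header: name the target queue

print(route("CLAIMS", "HDRCLM20240101"))  # -> PROD.CLAIMS.INPUT
```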

Please e-mail me offline if you'd like to clarify anything.
     
HTH,
-Victor-

==============================================================================
In order to make our systems more "online", we are looking for a way to 
run the process for each record throughout the day instead of in a daily 
run, and to do so with as few application changes as possible.

....

Magen

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to lists...@bama.ua.edu with the message: INFO IBM-MAIN
