> I have a program written in COBOL which runs monthly on a z/OS system.
> The program reads input records and writes them to a database (DB2) and to
> different output files. It consumes nearly the same number of input
> records every month. I wonder why there are such great differences in the
> consumed CPU time. For example: last month the program consumed,
> according to the job log, 11.25 (whatever that means) and this month it
> consumed 16.33.
If the COBOL program simply looped, it would use a certain amount of CPU time, and that amount would be more or less the same every time you ran it on a given machine. However, your COBOL program does useful work: it reads and writes data, it consumes system services while doing that, and those system services scale with the amount of data being processed. So if the amount of data being processed is "about the same," you would expect the amount of system resources consumed (including CPU) to be about the same.

You are seeing roughly a 5-second increase in CPU time, and (16.33 - 11.25) / 11.25 is about 45%, which is far outside what you might expect from random variation. Are you sure your program is processing the same amount of data?

If the amount of data really is about the same, then you have to look at how the data is organized. For your input files, the block size may have changed, which could increase CPU time, but it is unlikely to account for such a big change. For your output, there are lots of minor changes that can have major impacts on DB2 performance (a changed access path after a rebind, a disorganized table space or index, stale statistics, and so on). That pretty much leaves DB2 as the most likely culprit, and I would look there first.

CC
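P.S. If you want to put a number on the data-volume question, a quick back-of-the-envelope check is to divide the reported CPU seconds by the input record count for each run. If CPU per record jumped while the volume stayed flat, the extra time is going into something other than your own processing loop. A rough sketch in Python follows; the record counts are placeholders, not taken from your job, so plug in the actual counts from your own job output.

    # Rough sketch: normalize the job-log CPU figures by record count.
    # The record counts below are placeholders; substitute the real
    # input counts from each month's run.
    runs = {
        "last month": {"cpu_seconds": 11.25, "records": 1_000_000},
        "this month": {"cpu_seconds": 16.33, "records": 1_000_000},
    }

    for name, r in runs.items():
        per_record_us = r["cpu_seconds"] / r["records"] * 1_000_000
        print(f"{name}: {r['cpu_seconds']:.2f} CPU seconds, "
              f"{per_record_us:.2f} microseconds per record")

    increase = (16.33 - 11.25) / 11.25
    print(f"month-over-month increase: {increase:.0%}")  # roughly 45%

If the per-record figure for this month is noticeably higher at the same volume, that points back at the DB2 access side rather than the sequential file I/O.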