>I don't want to offend anyone, but if you're worried about CPU microseconds 
>and coding in high-level languages, I would suggest there is a fundamental 
>disconnect and it makes me think you're not really serious.

Hear! Hear!
I cannot count the number of times I have overseen a processor upgrade, only 
to have management complain that throughput, or response time, had not improved.

Most of the time, I had already told them that the real problem was an I/O 
bottleneck: tape-drive contention, scheduling, and so on.
But they told me to upgrade the processor anyway.
Yes, we usually needed the processor, but addressing the other issues would 
usually have given us a bigger improvement.

Empirically, I have seen the I/O content of most processing (online or 
batch) increase since the early 1980s. This is especially true if you are 
using DB2.

I have had a lot of one-hour-plus jobs that use less than 2 minutes of CPU. We 
call these I/O-bound jobs.
So even if I double the speed of the CPU, I have improved these long-running 
jobs by, at most, one minute out of sixty.
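
To put numbers on that, here is a back-of-the-envelope sketch in Python (the 
60-minute and 2-minute figures are the illustrative ones above, not 
measurements); it is just Amdahl's law applied to the CPU portion only:

    # Elapsed time after speeding up only the CPU portion of a job.
    # The I/O portion (waits, tape, DASD, contention) is untouched.
    def elapsed_after_cpu_speedup(elapsed_min, cpu_min, speedup):
        io_min = elapsed_min - cpu_min       # everything that is not CPU
        return io_min + cpu_min / speedup    # only the CPU part shrinks

    before = 60.0                            # a one-hour job
    cpu = 2.0                                # using 2 minutes of CPU
    after = elapsed_after_cpu_speedup(before, cpu, speedup=2.0)
    print(f"{before:.0f} min -> {after:.0f} min, saved {before - after:.0f} min")
    # prints: 60 min -> 59 min, saved 1 min

Doubling the CPU speed yet again would save only another 30 seconds.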

Microseconds don't count any more.

-
Too busy driving to stop for gas!
