I have gotten dragged into a CPU performance question, a field I know little
about.

 

I run a test on a 2094-722, which is rated at 19778 SU/second. The job
consumes 0.146 CPU seconds total.

 

I run the same job on a 2064-2C3, which is rated at 13378 SU/second. All
other things being roughly equal, should I expect the job to consume about
1.48 times (19778/13378) as much CPU time, i.e., roughly 0.216 CPU seconds?
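For clarity, here is the back-of-the-envelope arithmetic I'm doing, written
as a small Python sketch (the variable names are just mine; the only
assumption is that CPU time scales inversely with the rated SU/second,
everything else held equal):

    # Assumption: CPU time scales inversely with the SU/second rating.
    su_2094_722 = 19778      # SU/second rating of the 2094-722
    su_2064_2c3 = 13378      # SU/second rating of the 2064-2C3
    cpu_on_2094 = 0.146      # measured CPU seconds on the 2094-722

    ratio = su_2094_722 / su_2064_2c3        # about 1.48
    cpu_on_2064_est = cpu_on_2094 * ratio    # about 0.216 CPU seconds
    print(ratio, cpu_on_2064_est)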

 

Is my logic right, or am I off somewhere? I'm not worried about a
millisecond or two, just the broad strokes.

 

Thanks,

Charles 

