> You're not going to have that overhead for only one day. 0.5% of a $10
> Million computer is $50,000. Of course, that ignores software costs and
> the other things that have been mentioned. To think only of CPU seconds
> trivializes it.

Unfortunately, these kinds of calculations are not only misleading, they are wrong. The example above only holds if the computer costs $10,000,000 every day. Projected over a year instead, that $50,000 works out to about $0.0015 per second. And consuming 0.5% of the available computing power over a year also means consuming 0.5% of all the available time in that year, or 43.8 hours of CPU time.
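
To put numbers on that, here is a minimal Python sketch. It assumes the $10,000,000 is an annual machine cost and a 365-day year; neither assumption was spelled out in the original posting:

SECONDS_PER_YEAR = 365 * 24 * 3600           # 31,536,000 seconds
HOURS_PER_YEAR   = 365 * 24                  # 8,760 hours

machine_cost = 10_000_000                    # the $10M figure, taken as annual (assumption)
overhead_fraction = 0.005                    # the quoted 0.5%

overhead_cost = overhead_fraction * machine_cost
print(overhead_cost)                         # 50000.0 -> the $50,000 figure

print(overhead_cost / SECONDS_PER_YEAR)      # ~0.00159 -> "about $0.0015 per second"
print(overhead_fraction * HOURS_PER_YEAR)    # 43.8 -> hours of CPU time in 0.5% of a year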

One of the other examples uses $0.22 per second, which works out to nearly $7,000,000 per engine per year. From there it's not hard to extrapolate: a 20-engine configuration would recover about $140,000,000, and so on up to the 54 engines of a z9 processor.
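
The same arithmetic, sketched in Python. The $0.22/second rate and the 54-engine z9 maximum come from the example; treating the rate as per engine is an assumption, though it is the only reading under which the 20-engine figure adds up:

SECONDS_PER_YEAR = 365 * 24 * 3600

rate = 0.22                                  # $ per CPU-second, per engine (assumed)
engine_year = rate * SECONDS_PER_YEAR
print(f"${engine_year:,.0f}")                # $6,937,920 -> "nearly $7,000,000"

for engines in (20, 54):                     # 54 engines is the z9 maximum
    print(engines, f"${engines * engine_year:,.0f}")
# 20 -> $138,758,400, the "about $140,000,000"
# 54 -> $374,647,680, following the same logic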

These numbers and the calculations behind them are ridiculous. The notion of a single-number metric for chargeback is every bit as ludicrous as the notion of a single-number performance value.


Adam
