My company is developing a chargeback system that will charge for servers based on CPU utilization. We currently normalize CPU times to a standard value using the CPU's megahertz rating as a surrogate for instruction processing rate. However, this is probably not very accurate, and it is difficult to explain to the users. We would like
to use an industry standard charging unit, if there is such a thing. Is anybody else out there charging by CPU utilization? How do
you calculate it and how do you report it? What units do you use?
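For concreteness, here is a minimal sketch of the MHz-based normalization described above. The reference rating, function name, and numbers are all hypothetical, chosen only to illustrate the arithmetic:

```python
# Hypothetical illustration of MHz-based CPU-time normalization.
# The reference rating is an arbitrary assumption, not a standard.

REFERENCE_MHZ = 1000.0  # assumed "standard" CPU rating


def normalized_cpu_seconds(cpu_seconds: float, cpu_mhz: float) -> float:
    """Scale raw CPU seconds by the server's clock rate relative to the
    reference, using MHz as a surrogate for instruction processing rate."""
    return cpu_seconds * (cpu_mhz / REFERENCE_MHZ)


# A job consuming 100 CPU-seconds on a 3000 MHz server is billed as
# 300 normalized CPU-seconds on the 1000 MHz reference machine.
print(normalized_cpu_seconds(100.0, 3000.0))  # → 300.0
```

As the question notes, this treats clock rate as a proxy for throughput, which ignores differences in architecture, cache, and instructions per cycle between CPU models.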
May 10 2011, 12:44 PM