"Mark R. Lindsey" <[email protected]> writes: > Koomey [1] reported recently that a a mid-range server uses 424 > watts, on average, in the US. He also suggested that power and loss > due to electrical distribution is roughly equal to actual server > consumption, so that the total electrical power to run this server is > 2.0*424 = 848 Watts/server.
You are missing something pretty big here: every watt of power your server eats gets turned into heat, and the data center spends another 2-4 watts pumping that heat out. This is usually priced into co-lo power (and is why co-lo power usually costs quite a bit more than regular grid power). Even so, people are usually still more expensive than power until you become very efficient.

For my company, though, people still cost more than power. Hell, I'm paying myself a salary of $2500/month ($3.5K if you include the full load: health insurance, tax, and accounting overhead. Still, that's way below market for a west-coast SysAdmin; I made about that much right out of high school.) I've got two contractors who are paid hourly, both less experienced than I am and underpaid. (One is my brother.)

I'm paying for 74 amps or so of usable power, and that's setting me back, if we include rack cost, $2600 or so a month. So yeah, even in my case, even counting only the puny stipend I give myself, and even though for much of my power I'm paying sucker rates 'cause it's quite difficult for me to move out of my older, smaller digs into newer, larger, less expensive cages, people are more expensive than power.

On the other hand, with the resources I have, I could probably support 10x the hardware I currently have without automating much more, and at that point, even paying above-market SysAdmin wages, power would be the big cost.
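For anyone who wants to play with the numbers, here's a rough back-of-envelope sketch in Python using the figures above. The server wattage and distribution factor are from the quote and the salary is mine; the $/kWh rate and the cooling midpoint are my own assumptions, so treat the output as illustrative only.

# Back-of-envelope: facility power per server, and how many servers
# it takes before power costs rival one admin's loaded cost.

SERVER_WATTS = 424           # Koomey's mid-range average (from the quote)
DISTRIBUTION_FACTOR = 2.0    # distribution losses ~= server draw (from the quote)
COOLING_WATTS_PER_WATT = 3   # assumed midpoint of the 2-4 W cooling overhead above

# Total watts the facility burns per server: draw plus distribution
# losses, plus cooling overhead on every watt the server eats.
total_watts = (SERVER_WATTS * DISTRIBUTION_FACTOR
               + SERVER_WATTS * COOLING_WATTS_PER_WATT)

RATE_PER_KWH = 0.10          # assumed grid rate, $/kWh; co-lo rates run higher
HOURS_PER_MONTH = 730

monthly_power_cost = total_watts / 1000 * HOURS_PER_MONTH * RATE_PER_KWH

SYSADMIN_MONTHLY = 3500      # the loaded below-market cost I quote above

print(f"facility watts per server:  {total_watts:.0f}")
print(f"power cost per server:      ${monthly_power_cost:.2f}/month")
print(f"servers to match one admin: {SYSADMIN_MONTHLY / monthly_power_cost:.0f}")

With those assumptions it comes out to roughly 2.1 kW and about $155/month per server, so on the order of two dozen servers before power matches even my below-market loaded cost, which squares with the point above: scale the hardware 10x and power becomes the big line item.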
