Based on what we have been discussing, I assume my 400 watt power supply is drawing much less than its rating under actual usage. My computer might only be using 60 watts or so, which would make the cost lower.
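
Here is a rough sketch of that case in Python, assuming the $0.1085/kWh rate quoted below and a box that runs around the clock (both are just my assumptions):

    # Rough monthly cost if the machine really only draws 60 W, 24/7.
    # The rate is assumed from the figure used later in this thread.
    watts = 60
    rate = 0.1085                        # dollars per kWh (assumed)
    kwh_per_month = watts * 24 * 30 / 1000
    print(kwh_per_month)                 # 43.2 kWh
    print(kwh_per_month * rate)          # roughly $4.69 a month

So if it really is closer to 60 W, the bill would be well under five dollars a month at that rate.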

Your thoughts.



On 2021-07-22 21:39, Mike Bushroe via PLUG-discuss wrote:
I usually use a mental rule of thumb that every watt of 24/7/365
power consumption costs about $1 per year. Obviously this is slipping
as electric rates keep going up. So to a first order of magnitude a 100
watt server would cost around $100 a year, but if the server were using
the whole 400 watts it would cost more like $400 a year.
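
A quick check of that rule of thumb, assuming a rate of roughly $0.11/kWh:

    # One watt running 24/7/365 is 8.76 kWh over a year;
    # at about $0.11/kWh that works out to close to a dollar.
    hours_per_year = 24 * 365            # 8760 hours
    kwh_per_watt_year = 1 * hours_per_year / 1000
    print(kwh_per_watt_year)             # 8.76 kWh
    print(kwh_per_watt_year * 0.11)      # about $0.96 per watt per year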

If my home web server is using 100 watts around the clock, that means 100
watts * 30 days * 24 hours, or 72 kWh a month.

I'm thinking 72 kWh * $0.1085 per kWh = $7.81 a month.
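
The same arithmetic as a small helper, in case anyone wants to plug in their own wattage; monthly_cost and the 0.1085 default are just illustrative, not anything standard:

    def monthly_cost(watts, rate_per_kwh=0.1085, hours=24 * 30):
        """Rough monthly cost for a load that runs around the clock."""
        kwh = watts * hours / 1000
        return kwh * rate_per_kwh

    print(monthly_cost(100))   # 72 kWh -> about $7.81
    print(monthly_cost(60))    # 43.2 kWh -> about $4.69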

               KINDNESS

is most VALUABLE when it is GIVEN AWAY for

                   FREE
---------------------------------------------------
PLUG-discuss mailing list - PLUG-discuss@lists.phxlinux.org
To subscribe, unsubscribe, or to change your mail settings:
https://lists.phxlinux.org/mailman/listinfo/plug-discuss
---------------------------------------------------