On Jan 27, 2009, at 2:35 PM, Hal Finney wrote:

> John Gilmore writes:
>> The last thing we need is to deploy a system designed to burn all
>> available cycles, consuming electricity and generating carbon dioxide,
>> all over the Internet, in order to produce small amounts of bitbux to
>> get emails or spams through.
>
> It's interesting to consider the ultimate technological resolution to
> this issue. Will a global-scale proof-of-work based system inherently
> consume substantial amounts of energy? Or are there ways of doing
> computing which would allow such a system to use only moderate energy
> consumption? ...

[Proposals to use reversible computation, which in principle consumes no energy, elided.]

There's a contradiction here between the computer science and economic parts of the problem being discussed. What gives a digital coin value is exactly that there is some real-world expense in creating it. We talk about "proof of work", but in fact "work" done by a computer doesn't, in and of itself, have any value. It gets a value only when it's a limited resource *which might have been used for something else* - i.e., the value of the spare cycles that might be thrown at doing the computations comes from the opportunity cost incurred. If this were not so, anyone could just create as many coins as they wanted at no cost to themselves. In fact, this is what lies behind the cost model of 'bot herders using other people's machines. But ultimately that only works for the 'bot herders because there is no significant loss to the owners of those machines either!

Now, if instead we used algorithms not based on some abstract notion of "work", but on the equivalent power that had to be dissipated to do the computation, then the value of a digital token would truly be grounded in the real world. Spare cycles would no longer be "free" - they would show up on your power bill. Sure, the 'bot herders wouldn't have to pay - but if the owners of the "pwned" machines saw a real cost, they would have an incentive to do something about it (which they basically don't, today).
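
To make that concrete, here's a back-of-the-envelope sketch in Python. All the numbers are purely illustrative assumptions on my part - the wattage, compute time, and electricity price are stand-ins, not measurements:

WATTS = 100               # assumed draw of a busy desktop CPU
SECONDS_PER_TOKEN = 60    # assumed compute time per token
PRICE_PER_KWH = 0.10      # assumed residential electricity price, $/kWh

energy_kwh = WATTS * SECONDS_PER_TOKEN / 3_600_000   # watt-seconds -> kWh
cost_per_token = energy_kwh * PRICE_PER_KWH
print(f"{energy_kwh:.5f} kWh, roughly ${cost_per_token:.5f} per token")
# ~0.00167 kWh, about $0.00017 per token - small, but nonzero, and it
# lands on the machine owner's power bill rather than being "free".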

Eliminating the power cost puts you back to amortizing the fixed cost of the CPU and memory doing the computation - a cost that's dropping all the time. I don't see how you get to an economically viable mechanism that way.
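
The same kind of toy arithmetic shows the problem - again with made-up but plausible numbers for hardware price and lifetime:

CPU_PRICE = 300.0         # assumed cost of the CPU + memory, dollars
LIFETIME_YEARS = 3        # assumed useful life before replacement
SECONDS_PER_TOKEN = 60    # assumed compute time per token, as above

tokens = LIFETIME_YEARS * 365 * 24 * 3600 / SECONDS_PER_TOKEN
print(f"${CPU_PRICE / tokens:.6f} of hardware amortized per token")
# ~$0.00019 today - and the same box costs less next year, so the floor
# under the token's value keeps dropping.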

So, how do you tie the cost of a token to power? Curiously, something of the sort has already been proposed. It's been pointed out - I'm afraid I don't have the reference - that CPUs keep getting faster and more parallel at a high rate, but memories, while they are getting enormously bigger, aren't getting much faster. So the paper I read proposed hash functions that are expensive, not in CPU seconds, but in memory reads and writes. Memory writes are inherently non-reversible and so inherently cost power; a high-memory-write algorithm is thus also one that uses power.
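
I don't have that paper handy, so the following is only a toy sketch of the general idea, not the paper's actual construction - the table size, step count, and use of SHA-256 are arbitrary choices for illustration. The point is simply that the dominant cost is data-dependent reads and writes to a table too big to cache:

import hashlib

def memory_bound_pow(challenge: bytes,
                     mem_words: int = 1 << 22,   # 16 MB table, larger than most CPU caches
                     steps: int = 1 << 18) -> bytes:
    # Fill the table pseudorandomly from the challenge so its contents
    # can't be precomputed independently of the challenge.
    buf = bytearray(mem_words * 4)
    seed = hashlib.sha256(challenge).digest()
    for i in range(0, len(buf), 32):
        seed = hashlib.sha256(seed).digest()
        buf[i:i + 32] = seed

    # Walk the table in a data-dependent order, reading a word and writing
    # one back at every step, so the accesses can't be reordered into a
    # cache-friendly pattern and the writes can't be skipped.
    acc = hashlib.sha256(seed)
    idx = int.from_bytes(seed[:8], "big") % mem_words
    for _ in range(steps):
        acc.update(buf[idx * 4:idx * 4 + 4])        # memory read
        d = acc.digest()
        buf[idx * 4:idx * 4 + 4] = d[:4]            # memory write
        idx = int.from_bytes(d[4:12], "big") % mem_words
    return acc.digest()                             # the token / proof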

(BTW, a number of years back, a VC friend ran a proposal by me to buy the spare cycles on people's set-top boxes - which have pretty hefty chips in them - and rent out the resulting "distributed compute server". The claim was that you wouldn't have to pay people much of anything for the use of their boxes - you'd only use them when they were otherwise unoccupied, so the owners should be happy to get even very small payments. I pointed out the cost they had neglected: increased power use. Sure, individuals probably wouldn't notice - but at some point some consumer organization would. The resulting bad publicity would kill the business. We did a bit of calculation to add that cost in to what would be paid to the box owners, and the whole enterprise started looking less interesting from a purely economic point of view - not that it didn't have plenty of other problems.)
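
A toy version of that kind of calculation, with purely illustrative numbers - these are not the figures we actually used:

EXTRA_WATTS = 30          # assumed extra draw when the box computes flat out
HOURS_PER_DAY = 8         # assumed idle hours rented out per day
PRICE_PER_KWH = 0.10      # assumed residential electricity price, $/kWh

monthly_kwh = EXTRA_WATTS * HOURS_PER_DAY * 30 / 1000
print(f"{monthly_kwh:.1f} kWh/month, about ${monthly_kwh * PRICE_PER_KWH:.2f} per box")
# ~7.2 kWh and ~$0.72 a month per box - which has to come out of whatever
# you planned to pay the owner before the numbers look honest.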

                                                        -- Jerry

