Re: [Bitcoin-development] A way to create a fee market even without a block size limit (2013)

2015-05-10 Thread Sergio Lerner
El 10/05/2015 06:07 p.m., Gregory Maxwell escribió:
> On Sun, May 10, 2015 at 8:45 PM, Sergio Lerner
>  wrote:
>> Can the system be gamed?
> Users can pay fees or a portion of fees out of band to miner(s); this
> is undetectable to the network.
Then this is exactly what is needed. Let me explain.

I know of 5 methods for a user to pay fees to a miner. I will explain
each method and why these methods do not prevent the fee market from
being created:

1) By transaction fees

This is the standard, which would be limited by the CoVar algorithm, and
would create the fee market, if it were the only way to pay fees.

2) By creating multiple transactions, each adding an output that pays to
each miner (to a known miner address) the fees. User does not
pre-negotiate anything with miners.

This requires a transaction to have an additional output and requires
sending through the p2p network a different transaction to each miner,
each having an output with a "known" address of that miner. But the
network does not propagate double-spends, so those transactions would
need to be sent directly to the top miners, and to all at the same time.
The IP addresses of the top miners are not generally publicly available,
and they may not accept new incoming connections. Also, having an
additional output means the transactions would be larger, so they will
score lower by any metric the miner uses to choose transactions. Last,
miners must be programmed to automatically interpret payments to their
addresses as fees. The resulting protocol is very difficult to do
reliably, and expensive: any delay would make one miner receive the
transaction from another miner and reject the double-spend being sent
directly to it, increasing the average confirmation time.

3) By adding an anyone-can-spend output for fees, so the miner can spend
that output in the same block.  User does not pre-negotiate anything
with miners.

We can hard-fork not to allow spending outputs created in the same
block. This is a drawback, unless we reduce the block rate, which is my
proposal. However, spending in the same block also requires storing an
additional input in the block, which consumes at least 40 bytes more,
and the transaction containing the input cannot be relayed to the
network in advance. So a block that uses this method to collect fees
from many transactions will propagate slower, and the miner may end up
losing money. The anyone-can-spend output would take approximately 10
bytes. So if transmitting 10+40=50 bytes costs more than the fees
earned, then miners have no incentive to game the system. It has been
estimated that "each kilobyte costs an additional 80ms delay until a
majority knows about the block" (Information Propagation in the
Bitcoin Network). So 50 bytes cost 3.9 ms in propagation time, which
with a 25 BTC subsidy is roughly equivalent to 0.2 mBTC. Currently
this is more than what transactions pay in fees (about 0.1 mBTC), so
this should not be a problem for at least 5 years. And again, we could
just prevent spending outputs in the same block they are created.
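The back-of-the-envelope cost estimate above can be checked in a few lines (a sketch; the 80 ms/KB figure is the one cited from the propagation paper, and converting delay into expected orphan cost assumes a 10-minute block interval, as in the text):

```python
# Propagation cost of collecting fees via an anyone-can-spend output,
# following the estimate in the text (assumed figures, not measured).
EXTRA_BYTES = 10 + 40            # ~10 B output + ~40 B input spending it
MS_PER_KB = 80.0                 # delay per extra kilobyte (cited paper)
BLOCK_INTERVAL_MS = 600 * 1000   # 10-minute target
SUBSIDY_BTC = 25.0

delay_ms = EXTRA_BYTES * MS_PER_KB / 1024        # ≈ 3.9 ms
# Expected orphan cost: chance a competing block appears in that window,
# times the subsidy at stake.
orphan_cost_btc = SUBSIDY_BTC * delay_ms / BLOCK_INTERVAL_MS
print(round(delay_ms, 1), round(orphan_cost_btc * 1000, 2))  # → 3.9 0.16
```

The result (~0.16 mBTC) is in line with the "roughly 0.2 mBTC" figure above.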

4) Using a transaction having a single input having exactly the desired
output amount plus fees and signing the input with SIGHASH_SINGLE |
SIGHASH_ANYONECANPAY and adding to the transaction a single output with
the desired amount. The miner will be able to join many of these
transactions and finally add an output to collect all fees together,
without using standard transaction fees.

This is unreliable and cannot be systematically repeated without
creating a pre-transaction just to prepare the single input having
exactly the amount plus fees. The pre-transaction would need to pay
fees, so the problem is not avoided, just moved around.

5) By negotiating out of band with the miner previously. Anything could
be agreed by the user and the miner.

This actually creates a parallel out-of-band market for fees, which is
exactly what we want. If a user-to-miner pre-negotiation will take
place, then the miner can establish whatever price policy he wants to
compete and stay in business, as block data propagation costs money. So
there will be two fee markets, the "out-of-band" market, and the
"in-band" market, and both should converge.

My conclusion is that fee markets will be created, and any alternate
fee-paying methods (without a pre-negotiation) are neither reliable nor
cost-saving options. The full proposal would be to use the CoVar method,
reduce the block rate to 1 minute, and disallow spending outputs in the
same block they are created.

Best regards,
 Sergio.



--
One dashboard for servers and applications across Physical-Virtual-Cloud 
Widest out-of-the-box monitoring support with 50+ applications
Performance metrics, stats and reports that give you Actionable Insights
Deep dive visibility with transaction tracing using APM Insight.
http://ad.doubleclick.net/ddm/clk/290420510;11756

Re: [Bitcoin-development] A way to create a fee market even without a block size limit (2013)

2015-05-10 Thread Stephen
Why do so many tie the block size debate to creating "a fee market", as if one 
didn't already exist? Yes, today we frequently see many low-priority 
transactions included in the next block, but that does not mean there is no 
marketplace for block space. It just means miners are not being sufficiently 
tough to create a *competitive* marketplace. 

But who are we to say that the marketplace should be more competitive, and to 
go further and try to force it by altering consensus rules like the block size 
limit? If miners want to see more competitive fees, then they need only to 
alter their block creation protocol. 

There are many arguments for and against changing the consensus limit on block 
size. I'm simply saying that "to force a marketplace for fees/block space" 
should not be one of them. Let the market develop on its own. 

- Stephen



> On May 10, 2015, at 4:45 PM, Sergio Lerner  wrote:
> 
> Two years ago I presented a new way to create a fee market that does not 
> depend on the block chain limit.
> 
> This proposal has not been formally analyzed in any paper since then, but I 
> think it holds a good promise to untangle the current problem regarding 
> increasing the tps and creating the fee market. BTW, I think the maximum tps 
> should be increased, but not by increasing the block size, but by increasing 
> the block rate (I'll expose why in my next e-mail).
> 
> The original post is here (I was overly optimistic back then): 
> https://bitcointalk.org/index.php?topic=147124.msg1561612#msg1561612
> 
> I'll summarize it here again, with a little editing and a few more questions 
> at the end:
> 
> The idea is simple and requires a hardfork, but it has minimal impact on the 
> code and on the economics.
> 
> Solution: Require that the set of fees collected in a block has a dispersion 
> below a threshold. Use, for example, the Coefficient of Variation 
> (http://en.wikipedia.org/wiki/Coefficient_of_variation). If the CoVar is 
> higher than a fixed threshold, the block is considered invalid.
> 
> The Coefficient of variation is computed as the standard deviation over the 
> mean value, so it's very easy to compute. (if the mean is zero, we assume 
> CoVar=0). Note that the CoVar function does not depend on the scale, so it is 
> just what a coin with a floating price requires.
> 
> This means that if there are many transactions containing high fees in a 
> block, then free transactions cannot be included.
> The core devs should tweak the transaction selection algorithm to take into 
> account this maximum bound.
> 
> Example
> 
> If the transaction fee set is: 0,0,0,0,5,5,6,7,8,7
> The CoVar is 0.85
> Suppose we limit the CoVar to a maximum of 1.
> 
> Suppose the transaction fee set is: 0,0,0,0,0,0,0,0,0,10
> Then the CoVar is 3.0
> 
> In this case the miner should have to either drop the "10" from the fee set 
> or drop the zeros. Obviously the miner will drop some zeros, and choose the 
> set: 0,10, that has a CoVar of 1.
> 
> Why it reduces the Tx spamming Problem?
> 
> Using this little modification, spamming users would be required to use higher 
> fees if the remaining users in the community raise their fees. And 
> miners won't be able to include an enormous amount of spamming txs.
> 
> Why it helps solving the tragedy-of-the-commons fee "problem"?
> 
> As miners are forced to keep the CoVar below the threshold, if people raise 
> the fees to confirm faster than spamming txs, spamming txs automatically 
> become less likely to appear in blocks, and fee-estimators will automatically 
> increase future fees, creating the desired feedback loop.
> 
> Why it helps solving the block size problem?
> 
> Because if we increase the block size, miners that do not care about the fee 
> market won't be able to fill the block with spamming txs and destroy the 
> market that is being created. This is not a solution against an 
> attacker-miner, which can always fill the block with transactions.
> 
> Can the system by gamed? Can it be attacked?
> 
> I don't think so. An attacker would need to spend a high amount in fees to 
> prevent transactions with low fees to be included in a block. 
> However, a formal analysis would be required. Miller, Gun Sirer, Eyal.. Want 
> to give it a try?
> 
> Can it create positive feedback that pushes fees to the top or to 
> the bottom?
> 
> Again, I don't think so. This depends on the dynamics between each node's 
> fee estimator and the transaction backlog. MIT guys? 
> 
> Doesn't it force miners to run more complex algorithms (such as linear 
> programming) to find the optimum tx subset ?
> 
> Yes, but I don't see it as a drawback, but as a positive stimulus for 
> researchers to develop better tx selection algorithms. Anyway, the greedy 
> algorithm of picking the transactions with the highest fees would be good 
> enough. 
> 
> 
> PLEASE don't confuse the acronym CoVar I used here with co-variance.
> 
> Best regards,
>   Sergio.
> 
> 
> 

Re: [Bitcoin-development] Proposed alternatives to the 20MB step function

2015-05-10 Thread Rob Golding
> How much will that cost me?
> The network is hashing at 310PetaHash/sec right now.
> Takes 600 seconds to find a block, so 186,000PH per block
> 186,000 * 0.00038 = 70 extra PH
> 
> If it takes 186,000 PH to find a block, and a block is worth 25.13 BTC
> (reward plus fees), that 70 PH costs:
> (25.13 BTC/block / 186,000 PH/block) * 70 PH = 0.00945 BTC
> or at $240 / BTC:  $2.27
> 
> ... so average transaction fee will have to be about ten cents ($2.27
> spread across 23 average-sized transactions) for miners to decide to
> stay at 600K blocks

Surely that's an *extra* $2.27, as you've already included 0.13 BTC 
($31.20) in fees in the calculation?

Rob

___
Bitcoin-development mailing list
Bitcoin-development@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bitcoin-development


Re: [Bitcoin-development] Proposed alternatives to the 20MB step function

2015-05-10 Thread Thomas Voegtlin


Le 11/05/2015 00:31, Mark Friedenbach a écrit :
> I'm on my phone today so I'm somewhat constrained in my reply, but the key
> takeaway is that the proposal is a mechanism for miners to trade subsidy
> for the increased fees of a larger block. Necessarily it only makes sense
> to do so when the marginal fee per KB exceeds the subsidy fee per KB. It
> correspondingly makes sense to use a smaller block size if fees are less
> than subsidy, but note that fees are not uniform and as the block shrinks
> the marginal fee rate goes up..
> 

Oh I see, you expect the sign of dE/dx to change depending on
whether fees exceed the subsidy. This is possible, but instead of the
linear identity, you have to increase the block size twice as fast as
the difficulty. In that case we would get (using the notations of my
previous email):

D' = D(1+x)
F' = F(1+2x)

and thus:

E' - E = x/(1+x) * P * (F - S)

The presence of the (F-S) factor means that the sign reversal occurs
when fees exceed subsidy.
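That identity can be verified numerically (a sketch; the values for K, D, S, F are arbitrary illustrative choices, not real network figures):

```python
# Check E' - E = x/(1+x) * P * (F - S) when D' = D(1+x) and F' = F(1+2x),
# i.e. block size (and fees) growing twice as fast as difficulty.
K, D, S, F = 1.0, 2.0, 5.0, 8.0   # arbitrary illustrative values
P = K / D                          # block-finding probability factor
E = P * (S + F)                    # expected reward at the baseline

for x in (-0.2, 0.1, 0.5):
    E_prime = (K / (D * (1 + x))) * (S + F * (1 + 2 * x))
    predicted = x / (1 + x) * P * (F - S)
    # The sign of the gain follows the sign of x * (F - S), as claimed.
    assert abs((E_prime - E) - predicted) < 1e-12
```

With F > S the miner gains from x > 0; with F < S the sign reverses, which is the stated sign-reversal condition.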


> Limits on both the relative and absolute amount a miner can trade subsidy
> for block size prevent incentive edge cases as well as prevent a sharp
> shock to the current fee-poor economy (by disallowing adjustment below 1MB).
> 
> Also the identity transform was used only for didactic purposes. I fully
> expect there to be other, more interesting functions to use.



Re: [Bitcoin-development] Proposed alternatives to the 20MB step function

2015-05-10 Thread Mark Friedenbach
I'm on my phone today so I'm somewhat constrained in my reply, but the key
takeaway is that the proposal is a mechanism for miners to trade subsidy
for the increased fees of a larger block. Necessarily it only makes sense
to do so when the marginal fee per KB exceeds the subsidy fee per KB. It
correspondingly makes sense to use a smaller block size if fees are less
than subsidy, but note that fees are not uniform and as the block shrinks
the marginal fee rate goes up..

Limits on both the relative and absolute amount a miner can trade subsidy
for block size prevent incentive edge cases as well as prevent a sharp
shock to the current fee-poor economy (by disallowing adjustment below 1MB).

Also the identity transform was used only for didactic purposes. I fully
expect there to be other, more interesting functions to use.
On May 10, 2015 3:03 PM, "Thomas Voegtlin"  wrote:

> Le 08/05/2015 22:33, Mark Friedenbach a écrit :
>
> >   * For each block, the miner is allowed to select a different difficulty
> > (nBits) within a certain range, e.g. +/- 25% of the expected difficulty,
> > and this miner-selected difficulty is used for the proof of work check.
> In
> > addition to adjusting the hashcash target, selecting a different
> difficulty
> > also raises or lowers the maximum block size for that block by a function
> > of the difference in difficulty. So increasing the difficulty of the
> block
> > by an additional 25% raises the block limit for that block from 100% of
> the
> > current limit to 125%, and lowering the difficulty by 10% would also
> lower
> > the maximum block size for that block from 100% to 90% of the current
> > limit. For simplicity I will assume a linear identity transform as the
> > function, but a quadratic or other function with compounding marginal
> cost
> > may be preferred.
> >
>
> Sorry but I fail to see how a linear identity transform between block
> size and difficulty would work.
>
> The miner's reward for finding a block is the sum of subsidy and fees:
>
>  R = S + F
>
> The probability that the miner will find a block over a time interval is
> inversely proportional to the difficulty D:
>
>  P = K / D
>
> where K is a constant that depends on the miner's hashrate. The expected
> reward of the miner is:
>
>  E = P * R
>
> Consider that the miner chooses a new difficulty:
>
>  D' = D(1 + x).
>
> With a linear identity transform between block size and difficulty, the
> miner will be allowed to collect fees from a block whose size is scaled
> by (1+x).
>
> In the best case, collected fees will be proportional to block size:
>
>  F' = F(1+x)
>
> Thus we get:
>
>  E' = P' * R' = K/(D(1+x)) * (S + F(1+x))
>
>  E' = E - x/(1+x) * S * K / D
>
> So with this linear identity transform, increasing block size never
> increases the miner's gain. As long as the subsidy exists, the best
> strategy for miners is to reduce block size (i.e. to choose x<0).
>
>


Re: [Bitcoin-development] Proposed alternatives to the 20MB step function

2015-05-10 Thread Thomas Voegtlin
Le 08/05/2015 22:33, Mark Friedenbach a écrit :

>   * For each block, the miner is allowed to select a different difficulty
> (nBits) within a certain range, e.g. +/- 25% of the expected difficulty,
> and this miner-selected difficulty is used for the proof of work check. In
> addition to adjusting the hashcash target, selecting a different difficulty
> also raises or lowers the maximum block size for that block by a function
> of the difference in difficulty. So increasing the difficulty of the block
> by an additional 25% raises the block limit for that block from 100% of the
> current limit to 125%, and lowering the difficulty by 10% would also lower
> the maximum block size for that block from 100% to 90% of the current
> limit. For simplicity I will assume a linear identity transform as the
> function, but a quadratic or other function with compounding marginal cost
> may be preferred.
> 

Sorry but I fail to see how a linear identity transform between block
size and difficulty would work.

The miner's reward for finding a block is the sum of subsidy and fees:

 R = S + F

The probability that the miner will find a block over a time interval is
inversely proportional to the difficulty D:

 P = K / D

where K is a constant that depends on the miner's hashrate. The expected
reward of the miner is:

 E = P * R

Consider that the miner chooses a new difficulty:

 D' = D(1 + x).

With a linear identity transform between block size and difficulty, the
miner will be allowed to collect fees from a block whose size is scaled
by (1+x).

In the best case, collected fees will be proportional to block size:

 F' = F(1+x)

Thus we get:

 E' = P' * R' = K/(D(1+x)) * (S + F(1+x))

 E' = E - x/(1+x) * S * K / D

So with this linear identity transform, increasing block size never
increases the miner's gain. As long as the subsidy exists, the best
strategy for miners is to reduce block size (i.e. to choose x<0).
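The conclusion can be checked numerically (a sketch; the values for K, D, S, F are arbitrary illustrative choices):

```python
# With D' = D(1+x) and F' = F(1+x), verify E' = E - x/(1+x) * S * K/D,
# so any x > 0 strictly reduces expected reward while the subsidy S > 0.
K, D, S, F = 1.0, 2.0, 25.0, 1.0   # arbitrary illustrative values
E = (K / D) * (S + F)              # baseline expected reward

for x in (0.1, 0.25, 1.0):
    E_prime = (K / (D * (1 + x))) * (S + F * (1 + x))
    assert abs(E_prime - (E - x / (1 + x) * S * K / D)) < 1e-12
    assert E_prime < E   # increasing size never pays under this transform
```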



Re: [Bitcoin-development] Proposed alternatives to the 20MB step function

2015-05-10 Thread Gregory Maxwell
On Sun, May 10, 2015 at 9:21 PM, Gavin Andresen  wrote:
> a while I think any algorithm that ties difficulty to block size is just a
> complicated way of dictating minimum fees.

That's not the long term effect or the motivation-- what you're seeing
is that the subsidy gets in the way here.  Consider how the procedure
behaves with subsidy being negligible compared to fees.   What it
accomplishes in that case is that it incentivizes increasing the size
until the marginal "value" to miners of the transaction-data being
left out is not enormously smaller than the "value" of the data in the
block on average.  Value in quotes because it's blind to the "fees"
the transaction claims.

With a large subsidy, the marginal value of the first byte in the
block is HUGE; and so that pushes up the average-- and creates the
"base fee effect" that you're looking at.  It's not that anyone is
picking a fee there, it's that someone picked the subsidy there.  :)
As the subsidy goes down the only thing fees are relative to is fees.

An earlier version of the proposal took subsidy out of the picture
completely by increasing it linearly with the increased difficulty;
but that creates additional complexity both to implement and to
explain to people (e.g. that the setup doesn't change the supply of
coins); ... I suppose without it that starting disadvantage parameter
(the offset that reduces the size if you're indifferent) needs to be
much smaller, unfortunately.



Re: [Bitcoin-development] Proposed alternatives to the 20MB step function

2015-05-10 Thread Gavin Andresen
Let me make sure I understand this proposal:

On Fri, May 8, 2015 at 11:36 PM, Gregory Maxwell  wrote:

> (*) I believe my currently favored formulation of general dynamic control
> idea is that each miner expresses in their coinbase a preferred size
> between some minimum (e.g. 500k) and the miner's effective-maximum;
> the actual block size can be up to the effective maximum even if the
> preference is lower (you're not forced to make a lower block because you
> stated you wished the limit were lower).  There is a computed maximum
> which is the 33-rd percentile of the last 2016 coinbase preferences
> minus computed_max/52 (rounding up to 1) bytes-- or 500k if thats
> larger. The effective maximum is X bytes more, where X on the range
> [0, computed_maximum] e.g. the miner can double the size of their
> block at most. If X > 0, then the miners must also reach a target
> F(x/computed_maximum) times the bits-difficulty; with F(x) = x^2+1  ---
> so the maximum penalty is 2, with a quadratic shape;  for a given mempool
> there will be some value that maximizes expected income.  (obviously all
> implemented with precise fixed point arithmetic).   The percentile is
> intended to give the preferences of the 33% least preferring miners a
> veto on increases (unless a majority chooses to soft-fork them out). The
> minus-comp_max/52 provides an incentive to slowly shrink the maximum
> if its too large-- x/52 would halve the size in one year if miners
> were doing the lowest difficulty mining. The parameters 500k/33rd,
> -computed_max/52 bytes, and f(x)  I have less strong opinions about;
> and would love to hear reasoned arguments for particular parameters.
>
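The quoted procedure can be sketched as follows (the percentile index, the rounding, and the function names are my interpretation of the prose, not a specification):

```python
def computed_maximum(coinbase_prefs, floor=500_000):
    """33rd-percentile of recent coinbase size preferences, shrunk by
    ~1/52 (at least 1 byte), never below the 500k floor."""
    p = sorted(coinbase_prefs)[len(coinbase_prefs) // 3]  # ~33rd percentile
    return max(p - max(p // 52, 1), floor)

def difficulty_multiplier(extra_bytes, comp_max):
    """Penalty F(x) = x^2 + 1 on x = extra/comp_max, so at most 2x the
    bits-difficulty when the miner doubles the block size."""
    x = extra_bytes / comp_max   # extra_bytes must lie in [0, comp_max]
    return x * x + 1

# If all 2016 miners prefer 600k, the computed maximum is 588,462 bytes.
comp = computed_maximum([600_000] * 2016)
```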

I'm going to try to figure out how much transaction fee a transaction would
have to pay to bribe a miner to include it. Greg, please let me know if
I've misinterpreted the proposed algorithm. And everybody, please let me
know if I'm making a bone-headed mistake in how I'm computing anything:

Lets say miners are expressing a desire for 600,000 byte blocks in their
coinbases.

computed_max = 600,000 - 600,000/52 = 588,462 bytes.
  --> this is about 23 average-size (500-byte) transactions less than
600,000.
effective_max = 1,176,923

Lets say I want to maintain status quo at 600,000 bytes; how much penalty
do I have?
((600,000-588,462)/588,462)^2 + 1 = 1.00038

How much will that cost me?
The network is hashing at 310PetaHash/sec right now.
Takes 600 seconds to find a block, so 186,000PH per block
186,000 * 0.00038 = 70 extra PH

If it takes 186,000 PH to find a block, and a block is worth 25.13 BTC
(reward plus fees), that 70 PH costs:
(25.13 BTC/block / 186,000 PH/block) * 70 PH = 0.00945 BTC
or at $240 / BTC:  $2.27

... so average transaction fee will have to be about ten cents ($2.27
spread across 23 average-sized transactions) for miners to decide to stay
at 600K blocks. If they fill up 588,462 bytes and don't have some
ten-cent-fee transactions left, they should express a desire to create a
588,462-byte-block and mine with no penalty.

Is that too much?  Not enough?  Average transaction fees today are about 3
cents per transaction.
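The arithmetic above can be reproduced in one place (a sketch using the same rounded inputs as the text, so small differences from the quoted figures are rounding):

```python
pref = 600_000
comp_max = pref - pref // 52                    # 588,462 bytes
penalty = ((pref - comp_max) / comp_max) ** 2   # ≈ 0.00038 extra difficulty
hashes_per_block = 310 * 600                    # 310 PH/s * 600 s = 186,000 PH
extra_ph = hashes_per_block * penalty           # ≈ 71 extra PH
block_value_btc = 25.13                         # reward plus fees
cost_btc = block_value_btc / hashes_per_block * extra_ph
cost_usd = cost_btc * 240                       # ≈ $2.3 at $240/BTC
fee_per_tx = cost_usd / 23                      # ≈ ten cents per extra tx
```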
I created a spreadsheet playing with the parameters:

https://docs.google.com/spreadsheets/d/1zYZfb44Uns8ai0KnoQ-LixDwdhqO5iTI3ZRcihQXlgk/edit?usp=sharing

"We" could tweak the constants or function to get a transaction fee we
think is reasonable... but we really shouldn't be deciding whether
transaction fees are too high, too low, or just right, and after thinking
about this for a while I think any algorithm that ties difficulty to block
size is just a complicated way of dictating minimum fees.

As for some other dynamic algorithm: OK with me. How do we get consensus on
what the best algorithm is? I'm ok with any "don't grow too quickly, give
some reasonable-percentage-minority of miners the ability to block further
increases."

Also relevant here:
"The curious task of economics is to demonstrate to men how little they
really know about what they imagine they can design." - Friedrich August
von Hayek

-- 
--
Gavin Andresen


Re: [Bitcoin-development] A way to create a fee market even without a block size limit (2013)

2015-05-10 Thread Gregory Maxwell
On Sun, May 10, 2015 at 8:45 PM, Sergio Lerner
 wrote:
> Can the system be gamed?

Users can pay fees or a portion of fees out of band to miner(s); this
is undetectable to the network.

It's also behavior that miners have engaged in since at least 2011 (in
two forms;  treating transactions that paid them directly via outputs
as having that much more in fees;  and taking contracts for fast
processing for identified transactions (e.g. address matching or via
an API) e.g. "I'll pay you x at the end of the month for each of my
transactions you process, you can poll this API". I'm aware of at
least two companies having had this arrangement with miners).

I think what you suggested then just further rewards this behavior as
it allows bypassing your controls -- I suspect that generally any
scheme that looks at the fee values has this property.



Re: [Bitcoin-development] A way to create a fee market even without a block size limit (2013)

2015-05-10 Thread Peter Todd
On Sun, May 10, 2015 at 05:45:32PM -0300, Sergio Lerner wrote:
> Two years ago I presented a new way to create a fee market that does not
> depend on the block chain limit.



> Solution: Require that the set of fees collected in a block has a
> dispersion below a threshold. Use, for example, the Coefficient of
> Variation (http://en.wikipedia.org/wiki/Coefficient_of_variation). If
> the CoVar is higher than a fixed threshold, the block is considered invalid.

It's not possible to create consensus rules enforcing anything about
fees because it's trivial to pay miners out of band.

For instance, you can pay transaction fees by including anyone-can-spend
outputs in your transactions. The miner creating the block then simply
adds a transaction at the end of their block collecting all the
anyone-can-spend outputs. Equally, if you try to prohibit that - e.g. by
preventing respending of funds in the same block - they can simply
publish fee addresses and have people put individual outputs for those
addresses in their transactions. (IIRC Eligius gave people the option to
pay fees that way for a while.)

-- 
'peter'[:-1]@petertodd.org
0fa57b40dc86a61d35aaf9241c86f047ef6f4bab8f13dfb7




[Bitcoin-development] A way to create a fee market even without a block size limit (2013)

2015-05-10 Thread Sergio Lerner
Two years ago I presented a new way to create a fee market that does not
depend on the block chain limit.

This proposal has not been formally analyzed in any paper since then,
but I think it holds a good promise to untangle the current problem
regarding increasing the tps and creating the fee market. BTW, I think the
maximum tps should be increased, but not by increasing the block size,
but by increasing the block rate (I'll expose why in my next e-mail).

The original post is here (I was overly optimistic back then):
https://bitcointalk.org/index.php?topic=147124.msg1561612#msg1561612

I'll summarize it here again, with a little editing and a few more
questions at the end:

The idea is simple and requires a hardfork, but it has minimal impact
on the code and on the economics.

Solution: Require that the set of fees collected in a block has a
dispersion below a threshold. Use, for example, the Coefficient of
Variation (http://en.wikipedia.org/wiki/Coefficient_of_variation). If
the CoVar is higher than a fixed threshold, the block is considered invalid.

The Coefficient of variation is computed as the standard deviation over
the mean value, so it's very easy to compute. (if the mean is zero, we
assume CoVar=0). Note that the CoVar function *does not depend on the
scale*, so it is just what a coin with a floating price requires.

This means that if there are many transactions containing high fees in a
block, then free transactions cannot be included.
The core devs should tweak the transaction selection algorithm to take
into account this maximum bound.

*Example*

If the transaction fee set is: 0,0,0,0,5,5,6,7,8,7
The CoVar is 0.85
Suppose we limit the CoVar to a maximum of 1.

Suppose the transaction fee set is: 0,0,0,0,0,0,0,0,0,10
Then the CoVar is 3.0

In this case the miner should have to either drop the "10" from the fee
set or drop the zeros. Obviously the miner will drop some zeros, and
choose the set: 0,10, that has a CoVar of 1.
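The CoVar values in the example can be reproduced with a few lines (a sketch; it uses the population standard deviation, which matches the figures above):

```python
import math

def covar(fees):
    """Coefficient of variation: population std dev over mean (0 if mean is 0)."""
    mean = sum(fees) / len(fees)
    if mean == 0:
        return 0.0
    std = math.sqrt(sum((f - mean) ** 2 for f in fees) / len(fees))
    return std / mean

print(round(covar([0, 0, 0, 0, 5, 5, 6, 7, 8, 7]), 2))  # → 0.85
print(round(covar([0] * 9 + [10]), 2))                  # → 3.0
print(round(covar([0, 10]), 2))                         # → 1.0
```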

*Why it reduces the Tx spamming Problem?*

Using this little modification, spamming users are forced to pay
higher fees whenever the remaining users in the community raise their
fees. And miners won't be able to include enormous amounts of
spam txs.

*Why does it help solve the tragedy-of-the-commons fee "problem"?*

As miners are forced to keep the CoVar below the threshold, if people
raise their fees to confirm faster than spam txs, spam txs
automatically become less likely to appear in blocks, and fee estimators
will automatically increase future fees, creating the desired feedback
loop.

*Why does it help solve the block size problem?*

Because if we increase the block size, miners that do not care about the
fee market won't be able to fill blocks with spam txs and destroy
the market that is being created. This is not a defense against an
attacker-miner, who can always fill the block with his own transactions.

*Can the system be gamed? Can it be attacked?*

I don't think so. An attacker would need to spend a large amount in fees
to prevent transactions with low fees from being included in a block.
However, a formal analysis would be required. Miller, Gun Sirer, Eyal...
want to give it a try?
*Can it create a positive feedback loop that pushes fees to the top or
down to the bottom?*

Again, I don't think so. This depends on the dynamics between each
node's fee estimator and the transaction backlog. MIT guys?

*Doesn't it force miners to run more complex algorithms (such as linear
programming) to find the optimum tx subset?*

Yes, but I don't see it as a drawback; rather, it's a positive stimulus
for researchers to develop better tx selection algorithms. Anyway, the
greedy algorithm of picking the transactions with the highest fees
would be good enough.
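Such a greedy selection might look roughly like this (an illustrative
sketch only, not a proposed implementation; the transaction ids and the
cap value of 1.0 are assumptions): add candidates in descending fee
order, skipping any whose inclusion would push the fee set's CoVar over
the threshold.

```python
from statistics import mean, pstdev

def covar(fees):
    """CoVar as defined in the proposal: pstdev/mean, 0 for mean 0."""
    m = mean(fees)
    return 0.0 if m == 0 else pstdev(fees) / m

def greedy_select(candidates, max_covar=1.0):
    """candidates: list of (txid, fee) pairs. Returns the subset built
    greedily by descending fee while keeping CoVar under the cap."""
    chosen, fees = [], []
    for txid, fee in sorted(candidates, key=lambda c: c[1], reverse=True):
        if covar(fees + [fee]) <= max_covar:
            chosen.append((txid, fee))
            fees.append(fee)
    return chosen

# The post's second example: nine free txs and one paying 10.
txs = [(f"t{i}", 0) for i in range(9)] + [("t9", 10)]
print(greedy_select(txs))  # [('t9', 10), ('t0', 0)]
```

With a cap of 1, the sketch keeps the 10-fee transaction plus a single
zero-fee one, matching the {0, 10} outcome in the example above.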

*PLEASE don't confuse the acronym CoVar I used here with co-variance.*

Best regards,
  Sergio.



--
One dashboard for servers and applications across Physical-Virtual-Cloud 
Widest out-of-the-box monitoring support with 50+ applications
Performance metrics, stats and reports that give you Actionable Insights
Deep dive visibility with transaction tracing using APM Insight.
http://ad.doubleclick.net/ddm/clk/290420510;117567292;y___
Bitcoin-development mailing list
Bitcoin-development@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bitcoin-development


Re: [Bitcoin-development] Proposed alternatives to the 20MB step function

2015-05-10 Thread Mark Friedenbach
Micropayment channels are not pie in the sky proposals. They work today on
Bitcoin as it is deployed without any changes. People just need to start
using them.
On May 10, 2015 11:03, "Owen Gunden"  wrote:

> On 05/08/2015 11:36 PM, Gregory Maxwell wrote:
> > Another related point which has been tendered before but seems to have
> > been ignored is that changing how the size limit is computed can help
> > better align incentives and thus reduce risk.  E.g. a major cost to the
> > network is the UTXO impact of transactions, but since the limit is blind
> > to UTXO impact a miner would gain less income if substantially factoring
> > UTXO impact into its fee calculations; and without fee impact users have
> > little reason to optimize their UTXO behavior.
>
> Along the lines of aligning incentives with a diversity of costs to a
> variety of network participants, I am curious about reactions to Justus'
> general approach:
>
>
> http://bitcoinism.liberty.me/2015/02/09/economic-fallacies-and-the-block-size-limit-part-2-price-discovery/
>
> I realize it relies on pie-in-the-sky ideas like micropayment channels,
> but I wonder if it's a worthy long-term ideal direction for this stuff.
>
>


Re: [Bitcoin-development] Proposed alternatives to the 20MB step function

2015-05-10 Thread Owen Gunden
On 05/08/2015 11:36 PM, Gregory Maxwell wrote:
> Another related point which has been tendered before but seems to have
> been ignored is that changing how the size limit is computed can help
> better align incentives and thus reduce risk.  E.g. a major cost to the
> network is the UTXO impact of transactions, but since the limit is blind
> to UTXO impact a miner would gain less income if substantially factoring
> UTXO impact into its fee calculations; and without fee impact users have
> little reason to optimize their UTXO behavior.

Along the lines of aligning incentives with a diversity of costs to a 
variety of network participants, I am curious about reactions to Justus' 
general approach:

http://bitcoinism.liberty.me/2015/02/09/economic-fallacies-and-the-block-size-limit-part-2-price-discovery/

I realize it relies on pie-in-the-sky ideas like micropayment channels, 
but I wonder if it's a worthy long-term ideal direction for this stuff.



Re: [Bitcoin-development] A suggestion for reducing the size of the UTXO database

2015-05-10 Thread Bob McElrath
That's a lot of work, a lot of extra UTXOs, and a lot of blockchain spam, just
so I can do a convoluted form of arithmetic on my balance.

If a tx contained an explicit miner fee and a change address, but did not
compute the change, letting the network compute it (and therefore merge
transactions spending the same utxo), could one add some form of ring signature
a la Dash to alleviate the worsened privacy implications?

Jeff Garzik [jgar...@bitpay.com] wrote:
> This has been frequently explored on IRC.
> 
> My general conclusion is "dollar bills" - pick highly common denominations of
> bitcoins.  Aggregate to obtain these denominations, but do not aggregate
> further.
> 
> This permits merge avoidance (privacy++), easy coinjoin where many hide in the
> noise (privacy++), wallet dust de-fragmentation, while avoiding the
> over-aggregation problem where you have consolidated down to one output.
> 
> Thus a wallet would have several consolidation targets.
> 
> Another strategy is simply doubling outputs.  Say you pay 0.1 BTC to
> Starbucks.  Add another 0.1 BTC output to yourself, and a final change 
> output. 
> Who can say which output goes to Starbucks?
> 
> There are many iterations and trade-offs between fragmentation and privacy.



--
Cheers, Bob McElrath

"The individual has always had to struggle to keep from being overwhelmed by
the tribe.  If you try it, you will be lonely often, and sometimes frightened.
But no price is too high to pay for the privilege of owning yourself." 
-- Friedrich Nietzsche



Re: [Bitcoin-development] A suggestion for reducing the size of the UTXO database

2015-05-10 Thread Jeff Garzik
This has been frequently explored on IRC.

My general conclusion is "dollar bills" - pick highly common denominations
of bitcoins.  Aggregate to obtain these denominations, but do not aggregate
further.

This permits merge avoidance (privacy++), easy coinjoin where many hide in
the noise (privacy++), wallet dust de-fragmentation, while avoiding the
over-aggregation problem where you have consolidated down to one output.

Thus a wallet would have several consolidation targets.

Another strategy is simply doubling outputs.  Say you pay 0.1 BTC to
Starbucks.  Add another 0.1 BTC output to yourself, and a final change
output.  Who can say which output goes to Starbucks?

There are many iterations and trade-offs between fragmentation and privacy.
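The "dollar bills" strategy could be sketched as follows (a toy
illustration under assumed denominations, working in satoshis; not how
any real wallet implements this): fragments are batched until each
batch reaches a common denomination, already-denominated outputs are
left alone for merge avoidance, and leftover dust stays untouched.

```python
COIN = 100_000_000  # satoshis per BTC

def plan_consolidation(utxos, denominations=(COIN, COIN // 10, COIN // 100)):
    """utxos: list of output values in satoshis. Returns (batches, keep):
    batches of fragments to merge into denominated outputs, and values
    to leave as-is (already denominated, or dust too small to batch)."""
    denoms = set(denominations)
    keep = [v for v in utxos if v in denoms]             # already "dollar bills"
    fragments = sorted((v for v in utxos if v not in denoms), reverse=True)
    smallest = min(denominations)
    batches, current = [], []
    for v in fragments:
        current.append(v)
        if sum(current) >= smallest:                     # enough for a denomination
            batches.append(current)
            current = []
    keep.extend(current)                                 # leftover dust
    return batches, keep
```

For instance, a wallet holding one 0.1 BTC output plus assorted
fragments would merge the fragments into outputs of at least 0.01 BTC
and leave the 0.1 BTC output alone.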

On Sun, May 10, 2015 at 9:35 AM, Bob McElrath 
wrote:

> This is my biggest headache with practical bitcoin usage. I'd love to hear
> it if
> anyone has any clever solutions to the wallet/utxo locked problem. Spending
> unconfirmed outputs really requires a different security model on the part
> of
> the receiver than #confirmations, but isn't inherently bad if the receiver
> has a
> better security model and knows how to compute the probability that an
> unconfirmed-spend will get confirmed. Of course the bigger problem is
> wallet
> software that refuses to spend unconfirmed outputs.
>
> I've thought a bit about a fork/merge design: if the change were computed
> by the
> network instead of the submitter, two transactions having the same change
> address and a common input could be straightforwardly merged or split (in a
> reorg), where with bitcoin currently it would be considered a
> double-spend.  Of
> course that has big privacy implications since it directly exposes the
> change
> address, and is a hard fork, but is much closer to what people expect of a
> debit-based "account" in traditional banking.
>
> The fact of the matter is that having numerous sequential debits on an
> account
> is an extremely common use case, and bitcoin is obtuse in this respect.
>
> On May 9, 2015 1:09:32 PM EDT, Jim Phillips  wrote:
> >Forgive me if this idea has been suggested before, but I made this
> >suggestion on reddit and I got some feedback recommending I also bring
> >it
> >to this list -- so here goes.
> >
> >I wonder if there isn't perhaps a simpler way of dealing with UTXO
> >growth.
> >What if, rather than deal with the issue at the protocol level, we deal
> >with it at the source of the problem -- the wallets. Right now, the
> >typical
> >wallet selects only the minimum number of unspent outputs when building
> >a
> >transaction. The goal is to keep the transaction size to a minimum so
> >that
> >the fee stays low. Consequently, lots of unspent outputs just don't get
> >used, and are left lying around until some point in the future.
> >
> >What if we started designing wallets to consolidate unspent outputs?
> >When
> >selecting unspent outputs for a transaction, rather than choosing just
> >the
> >minimum number from a particular address, why not select them ALL? Take
> >all
> >of the UTXOs from a particular address or wallet, send however much
> >needs
> >to be spent to the payee, and send the rest back to the same address or
> >a
> >change address as a single output? Through this method, we should wind
> >up
> >shrinking the UTXO database over time rather than growing it with each
> >transaction. Obviously, as Bitcoin gains wider adoption, the UTXO
> >database
> >will grow, simply because there are 7 billion people in the world, and
> >eventually a good percentage of them will have one or more wallets with
> >spendable bitcoin. But this idea could limit the growth at least.
> >
> >The vast majority of users are running one of a handful of different
> >wallet
> >apps: Core, Electrum; Armory; Mycelium; Breadwallet; Coinbase; Circle;
> >Blockchain.info; and maybe a few others. The developers of all these
> >wallets have a vested interest in the continued usefulness of Bitcoin,
> >and
> >so should not be opposed to changing their UTXO selection algorithms to
> >one
> >that reduces the UTXO database instead of growing it.
> >
> >From the miners perspective, even though these types of transactions
> >would
> >be larger, the fee could stay low. Miners actually benefit from them in
> >that it reduces the amount of storage they need to dedicate to holding
> >the
> >UTXO. So miners are incentivized to mine these types of transactions
> >with a
> >higher priority despite a low fee.
> >
> >Relays could also get in on the action and enforce this type of
> >behavior by
> >refusing to relay or deprioritizing the relay of transactions that
> >don't
> >use all of the available UTXOs from the addresses used as inputs.
> >Relays
> >are not only the ones who benefit the most from a reduction of the UTXO
> >database, they're also in the best position to promote good behavior.
> >
> >--
> >*James G. Phillips IV*
> >
> >
> >

Re: [Bitcoin-development] A suggestion for reducing the size of the UTXO database

2015-05-10 Thread Bob McElrath
This is my biggest headache with practical bitcoin usage. I'd love to hear it if
anyone has any clever solutions to the wallet/utxo locked problem. Spending
unconfirmed outputs really requires a different security model on the part of
the receiver than #confirmations, but isn't inherently bad if the receiver has a
better security model and knows how to compute the probability that an
unconfirmed-spend will get confirmed. Of course the bigger problem is wallet
software that refuses to spend unconfirmed outputs.

I've thought a bit about a fork/merge design: if the change were computed by the
network instead of the submitter, two transactions having the same change
address and a common input could be straightforwardly merged or split (in a
reorg), where with bitcoin currently it would be considered a double-spend.  Of
course that has big privacy implications since it directly exposes the change
address, and is a hard fork, but is much closer to what people expect of a
debit-based "account" in traditional banking.

The fact of the matter is that having numerous sequential debits on an account
is an extremely common use case, and bitcoin is obtuse in this respect.

On May 9, 2015 1:09:32 PM EDT, Jim Phillips  wrote:
>Forgive me if this idea has been suggested before, but I made this
>suggestion on reddit and I got some feedback recommending I also bring
>it
>to this list -- so here goes.
>
>I wonder if there isn't perhaps a simpler way of dealing with UTXO
>growth.
>What if, rather than deal with the issue at the protocol level, we deal
>with it at the source of the problem -- the wallets. Right now, the
>typical
>wallet selects only the minimum number of unspent outputs when building
>a
>transaction. The goal is to keep the transaction size to a minimum so
>that
>the fee stays low. Consequently, lots of unspent outputs just don't get
>used, and are left lying around until some point in the future.
>
>What if we started designing wallets to consolidate unspent outputs?
>When
>selecting unspent outputs for a transaction, rather than choosing just
>the
>minimum number from a particular address, why not select them ALL? Take
>all
>of the UTXOs from a particular address or wallet, send however much
>needs
>to be spent to the payee, and send the rest back to the same address or
>a
>change address as a single output? Through this method, we should wind
>up
>shrinking the UTXO database over time rather than growing it with each
>transaction. Obviously, as Bitcoin gains wider adoption, the UTXO
>database
>will grow, simply because there are 7 billion people in the world, and
>eventually a good percentage of them will have one or more wallets with
>spendable bitcoin. But this idea could limit the growth at least.
>
>The vast majority of users are running one of a handful of different
>wallet
>apps: Core, Electrum; Armory; Mycelium; Breadwallet; Coinbase; Circle;
>Blockchain.info; and maybe a few others. The developers of all these
>wallets have a vested interest in the continued usefulness of Bitcoin,
>and
>so should not be opposed to changing their UTXO selection algorithms to
>one
>that reduces the UTXO database instead of growing it.
>
>From the miners perspective, even though these types of transactions
>would
>be larger, the fee could stay low. Miners actually benefit from them in
>that it reduces the amount of storage they need to dedicate to holding
>the
>UTXO. So miners are incentivized to mine these types of transactions
>with a
>higher priority despite a low fee.
>
>Relays could also get in on the action and enforce this type of
>behavior by
>refusing to relay or deprioritizing the relay of transactions that
>don't
>use all of the available UTXOs from the addresses used as inputs.
>Relays
>are not only the ones who benefit the most from a reduction of the UTXO
>database, they're also in the best position to promote good behavior.
>
>--
>*James G. Phillips IV*
>
>
>*"Don't bunt. Aim out of the ball park. Aim for the company of
>immortals."
>-- David Ogilvy*
>
>*This message was created with 100% recycled electrons. Please think
>twice
>before printing.*
>
>

Re: [Bitcoin-development] A suggestion for reducing the size of the UTXO database

2015-05-10 Thread Jim Phillips
I feel your pain. I've had the same thing happen to me in the past. And I
agree it's more likely to occur with my proposed scheme, but I think with HD
wallets there will still be UTXOs left unspent after most transactions
since, for privacy's sake, the wallet looks for the smallest set of
addresses that can be linked.
On May 9, 2015 9:11 PM, "Matt Whitlock"  wrote:

> Minimizing the number of UTXOs in a wallet is sometimes not in the best
> interests of the user. In fact, quite often I've wished for a configuration
> option like "Try to maintain _[number]_ UTXOs in the wallet." This is
> because I often want to make multiple spends from my wallet within one
> block, but spends of unconfirmed inputs are less reliable than spends of
> confirmed inputs, and some wallets (e.g., Andreas Schildbach's wallet)
> don't even allow it - you can only spend confirmed UTXOs. I can't tell you
> how aggravating it is to have to tell a friend, "Oh, oops, I can't pay you
> yet. I have to wait for the last transaction I did to confirm first." All
> the more aggravating because I know, if I have multiple UTXOs in my wallet,
> I can make multiple spends within the same block.
>
>
> On Saturday, 9 May 2015, at 12:09 pm, Jim Phillips wrote:
> > Forgive me if this idea has been suggested before, but I made this
> > suggestion on reddit and I got some feedback recommending I also bring it
> > to this list -- so here goes.
> >
> > I wonder if there isn't perhaps a simpler way of dealing with UTXO
> growth.
> > What if, rather than deal with the issue at the protocol level, we deal
> > with it at the source of the problem -- the wallets. Right now, the
> typical
> > wallet selects only the minimum number of unspent outputs when building a
> > transaction. The goal is to keep the transaction size to a minimum so
> that
> > the fee stays low. Consequently, lots of unspent outputs just don't get
> > used, and are left lying around until some point in the future.
> >
> > What if we started designing wallets to consolidate unspent outputs? When
> > selecting unspent outputs for a transaction, rather than choosing just
> the
> > minimum number from a particular address, why not select them ALL? Take
> all
> > of the UTXOs from a particular address or wallet, send however much needs
> > to be spent to the payee, and send the rest back to the same address or a
> > change address as a single output? Through this method, we should wind up
> > shrinking the UTXO database over time rather than growing it with each
> > transaction. Obviously, as Bitcoin gains wider adoption, the UTXO
> database
> > will grow, simply because there are 7 billion people in the world, and
> > eventually a good percentage of them will have one or more wallets with
> > spendable bitcoin. But this idea could limit the growth at least.
> >
> > The vast majority of users are running one of a handful of different
> wallet
> > apps: Core, Electrum; Armory; Mycelium; Breadwallet; Coinbase; Circle;
> > Blockchain.info; and maybe a few others. The developers of all these
> > wallets have a vested interest in the continued usefulness of Bitcoin,
> and
> > so should not be opposed to changing their UTXO selection algorithms to
> one
> > that reduces the UTXO database instead of growing it.
> >
> > From the miners perspective, even though these types of transactions
> would
> > be larger, the fee could stay low. Miners actually benefit from them in
> > that it reduces the amount of storage they need to dedicate to holding
> the
> > UTXO. So miners are incentivized to mine these types of transactions
> with a
> > higher priority despite a low fee.
> >
> > Relays could also get in on the action and enforce this type of behavior
> by
> > refusing to relay or deprioritizing the relay of transactions that don't
> > use all of the available UTXOs from the addresses used as inputs. Relays
> > are not only the ones who benefit the most from a reduction of the UTXO
> > database, they're also in the best position to promote good behavior.
> >
> > --
> > *James G. Phillips IV*
> > 
> >
> > *"Don't bunt. Aim out of the ball park. Aim for the company of
> immortals."
> > -- David Ogilvy*
> >
> >  *This message was created with 100% recycled electrons. Please think
> twice
> > before printing.*
>