Re: [bitcoin-dev] Block size following technological growth

2015-08-06 Thread Jorge Timón via bitcoin-dev
Really, thanks again for replying and not getting mad when I get your
thoughts wrong.
I believe that I've learned more about your position on the subject
today than in months of discussion and blog posts (that's not a critique
of your posts; it's just that they didn't answer some questions that I
personally needed answered).

On Thu, Aug 6, 2015 at 9:42 PM, Gavin Andresen gavinandre...@gmail.com wrote:
 On Thu, Aug 6, 2015 at 1:15 PM, Jorge Timón jti...@jtimon.cc wrote:

 So I reformulate the question:

 1) If not now, when will it be a good time to let the market
 minimum fee for miners to mine a transaction rise above zero?


 Two answers:

 1. If you are willing to wait an infinite amount of time, I think the
 minimum fee will always be zero or very close to zero, so I think it's a
 silly question.

I'm very happy to have asked the stupid question, then. It has revealed
another big difference in the fundamental assumptions we're using.

My assumption is that for any reasonable size, free transactions will
eventually disappear (assuming Bitcoin doesn't fail for some other
reason).
Maybe I'm being too optimistic about the demand side of the market in
the long term.

In contrast, your assumption seems to be (and please correct me on
anything I get wrong) that...

"The limit will always be big enough so that free transactions are
mined forever. Therefore fees just allow users to prioritize their
urgent transactions, and allow relay policies to protect nodes against
DoS attacks.
Well, obviously, they also serve to pay for mining in a low-subsidy
future; but even with the presence of free transactions, fees will be
enough to cover mining costs, or new mechanisms will be developed to
make a low-total-reward blockchain safe, or expensive proof of work
will be replaced or complemented with something else that's cheaper.
The main point is that fees are not a mechanism to decide what gets
priced out of the blockchain, because advancements in technology will
always give us enough room for free transactions."
- jtimon putting words in Gavin's mouth, with the sole intention of
understanding him better.

I'm saying "free transactions" even though you said "zero or very close
to zero".
To you, zero and very close to zero may be the same thing, but to me
they are like... different galaxies.
To me, entering the "very close to zero" galaxy is a huge step in the
development of the fee market.
I had always assumed that moving from zero to 1 satoshi was precisely
what big block advocates wanted to avoid.
That's what they meant by "Bitcoin is going to become a high-value-only
network" and similar statements.
Knowing that for big block advocates zero and "very close to zero" are
equally acceptable changes things.

 2. The market minimum fee should be determined by the market. It should
 not be up to us to decide when is a good time.

I completely agree, but the block size limit is a consensus rule that
doesn't adapt to the market. The market will adapt to whatever limit
is chosen by the consensus rules.

 2) Do you have any criterion (automatic or not) that can result in you
 saying "no, this is too much" for any proposed size?


 Sure, if keeping up with transaction volume requires a cluster of computers
 or more than pretty good broadband bandwidth, I think that's too far.
 That's where the original 20MB limit comes from; otherwise I'd have proposed a
 much higher limit.

 Would you agree that blocksize increase proposals should have such a
 criterion/test?


 Although I've been very clear with my criterion, no, I don't think all
 blocksize increase proposals should have to justify why this size or why
 this rate of increase.

I would really like a more formal criterion, ideally an automatic one
(like any other test, its parameters could be modified as technology
advances).
But fair enough: even though I find your criterion too vague and not
future-proof enough, I guess it is still a criterion.
It seems this is a matter of disagreement about the ideal way of doing
things rather than a disagreement on fundamental assumptions.
So it seems this question wasn't so interesting after all.

 Part of my frustration with this whole debate is
 we're talking about a sanity-check upper-limit; as long as it doesn't open
 up some terrible new DoS possibility I don't think it really matters much
 what the exact number is.

That's what you think you are discussing, but I (and probably some
other people) think we are discussing something entirely different,
because we have a fundamentally different assumption about what the
block size limit is for.
I really hope that identifying these fundamental assumption
discrepancies (FADs from now on) will help us avoid circular
discussions, so that everything is less frustrating and more productive
for everyone.

 Regardless of the history of the consensus rule (which I couldn't care
 less about), I believe the only function that the maximum block size
 rule currently serves is limiting centralization.
 Since you deny that function, do you think the (artificial) consensus
 rule is currently serving any other purpose that I'm missing?

Re: [bitcoin-dev] Block size following technological growth

2015-08-06 Thread Pieter Wuille via bitcoin-dev
On Aug 6, 2015 9:42 PM, Gavin Andresen via bitcoin-dev 
bitcoin-dev@lists.linuxfoundation.org wrote:
2. The market minimum fee should be determined by the market. It should
not be up to us to decide when is a good time.

I partially agree. The community should decide what risks it is willing to
take, and set limits accordingly. Let the market decide how that space is
best used.



 Would you agree that blocksize increase proposals should have such a
 criterion/test?


 Although I've been very clear with my criterion, no, I don't think all
blocksize increase proposals should have to justify why this size or why
this rate of increase. Part of my frustration with this whole debate is
we're talking about a sanity-check upper-limit; as long as it doesn't open
up some terrible new DoS possibility I don't think it really matters much
what the exact number is.

It is only a DoS protection limit if you want to rely on trusting miners. I
prefer a system where I don't have to do that.

But I agree the numbers don't matter much, for a different reason: the
market will fill up whatever space is available, and we'll have the same
discussion when the new limit doesn't seem enough anymore.

-- 
Pieter


Re: [bitcoin-dev] Block size following technological growth

2015-08-06 Thread Gavin Andresen via bitcoin-dev
On Thu, Aug 6, 2015 at 1:15 PM, Jorge Timón jti...@jtimon.cc wrote:

 So I reformulate the question:

 1) If not now, when will it be a good time to let the market
 minimum fee for miners to mine a transaction rise above zero?


Two answers:

1. If you are willing to wait an infinite amount of time, I think the
minimum fee will always be zero or very close to zero, so I think it's a
silly question.

2. The market minimum fee should be determined by the market. It should
not be up to us to decide when is a good time.


 2) Do you have any criterion (automatic or not) that can result in you
 saying "no, this is too much" for any proposed size?


Sure, if keeping up with transaction volume requires a cluster of computers
or more than pretty good broadband bandwidth, I think that's too far.
That's where the original 20MB limit comes from; otherwise I'd have proposed a
much higher limit.


 Would you agree that blocksize increase proposals should have such a
 criterion/test?


Although I've been very clear with my criterion, no, I don't think all
blocksize increase proposals should have to justify why this size or why
this rate of increase. Part of my frustration with this whole debate is
we're talking about a sanity-check upper-limit; as long as it doesn't open
up some terrible new DoS possibility I don't think it really matters much
what the exact number is.



 Regardless of the history of the consensus rule (which I couldn't care
 less about), I believe the only function that the maximum block size
 rule currently serves is limiting centralization.
 Since you deny that function, do you think the (artificial) consensus
 rule is currently serving any other purpose that I'm missing?


It prevents trivial denial-of-service attacks (e.g. I promise to send you a
1 Terabyte block, then fill up your memory or disk...).
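
To make the sanity-check framing concrete: a node that knows the consensus
maximum can reject an oversized block announcement before allocating
anything for it. A minimal sketch of the idea in Python (illustrative only;
this is not Bitcoin Core's actual networking code, and the constant is just
the current 1 MB limit):

MAX_BLOCK_SIZE = 1_000_000  # consensus limit, in bytes

def accept_block_message(announced_size: int) -> bool:
    # Drop any peer message claiming a block larger than the limit,
    # before downloading or buffering its payload.
    return announced_size <= MAX_BLOCK_SIZE

assert not accept_block_message(10**12)  # the promised 1 TB block is dropped
assert accept_block_message(999_999)     # a valid-sized block is considered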

And please read what I wrote: I said that the block limit has LITTLE effect
on MINING centralization.  Not no effect on any type of centralization.

If the limit was removed entirely, it is certainly possible we'd end up
with very few organizations (and perhaps zero individuals) running full
nodes.

-- 
--
Gavin Andresen


Re: [bitcoin-dev] Block size following technological growth

2015-08-06 Thread Peter Todd via bitcoin-dev

On 6 August 2015 10:21:54 GMT-04:00, Gavin Andresen via bitcoin-dev 
bitcoin-dev@lists.linuxfoundation.org wrote:
On Thu, Aug 6, 2015 at 10:06 AM, Pieter Wuille pieter.wui...@gmail.com wrote:

 But you seem to consider that a bad thing. Maybe saying that you're
 claiming that this equals Bitcoin failing is an exaggeration, but you do
 believe that evolving towards an ecosystem where there is competition for
 block space is a bad thing, right?


No, competition for block space is good.

What is bad is artificially limiting or centrally controlling the supply of
that space.

Incidentally, why is that competition good? What specific design goal is that 
competition achieving?




Re: [bitcoin-dev] Block size following technological growth

2015-08-06 Thread Elliot Olds via bitcoin-dev
On Wed, Aug 5, 2015 at 6:26 PM, Jorge Timón jti...@jtimon.cc wrote:


 Given that for any non-absurdly-big size some transactions will
 eventually be priced out, and that the consensus rule serves for
 limiting mining centralization (and more indirectly centralization in
 general) and not about trying to set a given average transaction fee,
 I think the current level of mining centralization will always be more
 relevant than the current fee level when discussing any change to the
 consensus rule to limit centralization (at any point in time).
 In other words, the question "can we change this without important
 risks of destroying the decentralized properties of the system in the
 short or long run?" should always be more important than "is there a
 concerning rise in fees to motivate this change at all?".


I agree with you that decentralization is the most important feature of
Bitcoin, but I also think we need to think probabilistically and concretely
about when risks to decentralization are worthwhile.

Decentralization is not infinitely valuable in relation to low fees, just
like being alive is not infinitely valuable in relation to having money.
For instance, people will generally not accept a 100% probability of death
in exchange for any amount of money. However anyone who understands
probability and has the preferences of a normal person would play a game
where they accept a one in a billion chance of instant death to win one
billion dollars if they don't die.

Similarly we shouldn't accept a 100% probability of Bitcoin being
controlled by a single entity for any guarantee of cheap tx fees no matter
how low they are, but there should be some minuscule risk to
decentralization that we'd be willing to accept (like raising the block
size to 1.01 MB) if it somehow allowed us to dramatically increase
usability. (Imagine that something like the Lightning Network, but even
better, were developed, but that it could only work with 1.01 MB blocks.)



  Jorge, if a fee equilibrium developed at 1MB of $5/tx, and you somehow knew
  with certainty that increasing to 4MB would result in a 20 cent/tx
  equilibrium that would last for a year (otherwise fees would stay around $5
  for that year), would you be in favor of an increase to 4MB?

 As said, I would always consider the centralization risks first: I'd
 rather have a $5/tx decentralized Bitcoin than a Bitcoin with free
 transactions but effectively validated (when they validate blocks they
 mine on top of) by around 10 miners, especially if only 3 of them could
 easily collude to censor transactions [orphaning any block that
 doesn't censor in the same manner]. Sadly I have no choice: the latter
 is what we have right now. And reducing the block size can't guarantee
 that the situation will get better or even that fees could rise to
 $5/tx (we just don't have that demand, not that it is a goal for
 anyone). All I know is that increasing the block size *could*
 (conditional, not necessarily, I don't know in which cases, I don't
 think anybody does) make things even worse.


I agree that we don't have good data about what exactly a 4 MB increase
would do. It sounds like you think the risks are too great / uncertain to
move from 1 MB to 4 MB blocks in the situation I described. I'm not clear
though on which specific risks you'd be most worried about at 4 MB, and if
there are any risks that you think don't matter at 4 MB but that you would
be worried about at higher block size levels. I also don't know if we have
similar ideas about the benefits of low tx fees. If we discussed exactly
how we were evaluating this scenario, maybe we'd discover that something I
thought was a huge benefit of low tx fees is actually not that compelling,
or maybe we'd discover that our entire disagreement boiled down to our
estimate of one specific risk.

For the record, I think these are the main harms of $5 tx fees, along with
the main risks I see from moving to 4 MB:

Fees of $5/tx would:
(a) Prevent a lot of people who could otherwise benefit from Bitcoin's
decentralization from having an opportunity to reap those benefits.
Especially people in poor countries with corrupt governments who could get
immediate benefit from it now.
(b) Prevent developers from experimenting with new Bitcoin use-cases, which
might eventually lead to helpful services.
(c) Prevent regular people from using Bitcoin and experimenting with it and
getting involved, because they think it's unusable for txns under many
hundreds of dollars in value, so it doesn't interest them. Not having the
public on our side could make us more vulnerable to regulators.

Changing the block size to 4 MB would:
(1) Probably reduce the number of full nodes by around 5%. Most of the drop
in full nodes over the past few years is probably due to Bitcoin Core being
used by fewer regular users for convenience reasons, but the extra HD space
required and extra bandwidth would probably make some existing people
running full nodes stop.
(2) Not hinder 

Re: [bitcoin-dev] Block size following technological growth

2015-08-06 Thread Jorge Timón via bitcoin-dev
First of all, thank you very much for answering the questions, and
apologies for not having formulated them properly (fortunately that's
not an irreparable mistake).

On Thu, Aug 6, 2015 at 6:03 PM, Gavin Andresen gavinandre...@gmail.com wrote:
 On Thu, Aug 6, 2015 at 11:25 AM, Jorge Timón jti...@jtimon.cc wrote:

 1) If not now when will it be a good time to let fees rise above zero?


 Fees are already above zero. See
 http://gavinandresen.ninja/the-myth-of-not-full-blocks

When we talk about "fees" we're talking about different things; I
should have been more specific.
Average fees are greatly influenced by wallet and policy defaults, and
they also include the extra amounts paid for fast confirmation.
I'm not talking about fast-confirmation transactions, but about
non-urgent transactions.

What is the market minimum fee for miners to mine a transaction?

That's currently zero.
If you don't want to directly look at what blocks contain, we can also
use a fee estimator and define a non-urgent period, say one week's worth
of blocks (1008 blocks).
The chart in your link doesn't include a 1008-block line, but the
15-block (about 2.5 hours) line seems to already show zero fees:

http://img.svbtle.com/p4x3s7fn52sz9a.png
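
For what it's worth, the "market minimum fee" can also be measured directly,
by finding the lowest fee rate a miner actually accepted in a recent block.
A rough sketch in Python, assuming a local Bitcoin Core node with -txindex
and standard JSON-RPC credentials (those details are assumptions for
illustration, not something from this thread):

import requests

URL = "http://user:pass@127.0.0.1:8332/"  # assumed RPC credentials

def rpc(method, *params):
    r = requests.post(URL, json={"method": method, "params": list(params), "id": 1})
    return r.json()["result"]

def min_feerate_in_block(height):
    block = rpc("getblock", rpc("getblockhash", height), 2)  # verbosity 2: full txs
    lowest = None
    for tx in block["tx"][1:]:  # skip the coinbase, which pays no fee
        paid = sum(rpc("getrawtransaction", vin["txid"], True)["vout"][vin["vout"]]["value"]
                   for vin in tx["vin"])
        fee = paid - sum(out["value"] for out in tx["vout"])
        lowest = min(lowest, fee / tx["size"]) if lowest is not None else fee / tx["size"]
    return lowest  # BTC per byte; zero means free transactions are still being mined

A result of zero for blocks in the non-urgent window is exactly the
situation described above.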

So I reformulate the question:

1) If not now, when will it be a good time to let the market
minimum fee for miners to mine a transaction rise above zero?

 2) When will you consider a size to be too dangerous for centralization?
 In other words, why 20 GB would have been safe but 21 GB wouldn't have
 been (or the respective maximums and respective +1 for each block
 increase proposal)?


 http://gavinandresen.ninja/does-more-transactions-necessarily-mean-more-centralized

This just shows where the 20 GB come from, not why you would reject 21 GB.
Let me rephrase.

2) Do you have any criterion (automatic or not) that can result in you
saying "no, this is too much" for any proposed size?
Since you don't think the consensus block size maximum limits mining
centralization (as you later say), it must be based on something else.
In any case, if you lack a criterion that's fine as well: it's never
too late to have one.
Would you agree that blocksize increase proposals should have such a
criterion/test?

 3) Does this mean that you would be in favor of completely removing
 the consensus rule that limits mining centralization by imposing an
 artificial (like any other consensus rule) block size maximum?


 I don't believe that the maximum block size has much at all to do with
 mining centralization, so I don't accept the premise of the question.

Ok, this is an enormous step forward in the discussion, thank you.
In my opinion all discussion will be sterile as long as we can't even
agree on what the positive effects of the consensus rule that
supposedly needs to be changed actually are.

It's not that you don't care about centralization, it's that you don't
believe that a consensus block size maximum limits centralization at
all.
This means that if I can convince you that the consensus block size
maximum does in fact limit centralization in any way, you may change
your views about the whole blocksize consensus rule change; you may
even take back or change your own proposal.
But let's leave that aside for now.

Regardless of the history of the consensus rule (which I couldn't care
less about), I believe the only function that the maximum block size
rule currently serves is limiting centralization.
Since you deny that function, do you think the (artificial) consensus
rule is currently serving any other purpose that I'm missing?

If the answer is something along the lines of "not really, it's just
technical debt", then I think you should be honest and consistent, and
directly advocate for the complete removal of the consensus rule.

I really think conversations can't advance until we clarify the
different positions about the current purpose of the consensus rule
under discussion.


Re: [bitcoin-dev] Block size following technological growth

2015-08-06 Thread Pieter Wuille via bitcoin-dev
On Thu, Aug 6, 2015 at 3:40 PM, Gavin Andresen via bitcoin-dev 
bitcoin-dev@lists.linuxfoundation.org wrote:

 On Wed, Aug 5, 2015 at 9:26 PM, Jorge Timón 
 bitcoin-dev@lists.linuxfoundation.org wrote:

 This is a much more reasonable position. I wish this had been the starting
 point of this discussion instead of "the block size limit must be
 increased as soon as possible or bitcoin will fail."


 It REALLY doesn't help the debate when you say patently false statements
 like that.

 My first blog post on this issue is here:
   http://gavinandresen.ninja/why-increasing-the-max-block-size-is-urgent

 ... and I NEVER say Bitcoin will fail.  I say:

 If the number of transactions waiting gets large enough, the end result
 will be an over-saturated network, busy doing nothing productive. I don’t
 think that is likely– it is more likely people just stop using Bitcoin
 because transaction confirmation becomes increasingly unreliable.


But you seem to consider that a bad thing. Maybe saying that you're
claiming that this equals Bitcoin failing is an exaggeration, but you do
believe that evolving towards an ecosystem where there is competition for
block space is a bad thing, right?

I don't agree that "not everyone is able to use the block chain for every
use case" is the same thing as "people stop using Bitcoin". People are
already not using it for every use case.

Here is what my proposed BIP says: "No hard forking change that relaxes the
block size limit can be guaranteed to provide enough space for every
possible demand - or even any particular demand - unless strong
centralization of the mining ecosystem is expected. Because of that, the
development of a fee market and the evolution towards an ecosystem that is
able to cope with block space competition should be considered healthy."

-- 
Pieter


Re: [bitcoin-dev] Block size following technological growth

2015-08-06 Thread Gavin Andresen via bitcoin-dev
On Wed, Aug 5, 2015 at 9:26 PM, Jorge Timón 
bitcoin-dev@lists.linuxfoundation.org wrote:

 This is a much more reasonable position. I wish this had been the starting
 point of this discussion instead of "the block size limit must be
 increased as soon as possible or bitcoin will fail."


It REALLY doesn't help the debate when you say patently false statements
like that.

My first blog post on this issue is here:
  http://gavinandresen.ninja/why-increasing-the-max-block-size-is-urgent

... and I NEVER say Bitcoin will fail.  I say:

If the number of transactions waiting gets large enough, the end result
will be an over-saturated network, busy doing nothing productive. I don’t
think that is likely– it is more likely people just stop using Bitcoin
because transaction confirmation becomes increasingly unreliable.

Mike sketched out the worst-case here:
  https://medium.com/@octskyward/crash-landing-f5cc19908e32

... and concludes:

I believe there are no situations in which Bitcoin can enter an overload
situation and come out with its reputation and user base intact. Both would
suffer heavily and as Bitcoin is the founder of the cryptocurrency concept,
the idea itself would inevitably suffer some kind of negative
repercussions.




So please stop with the over-the-top claims about what the other side
believes; there are enough of those (on both sides of the debate) on reddit.
I'd really like to focus on how to move forward, and how best to resolve
difficult questions like this in the future.

-- 
--
Gavin Andresen


Re: [bitcoin-dev] Block size following technological growth

2015-08-06 Thread Gavin Andresen via bitcoin-dev
On Thu, Aug 6, 2015 at 10:06 AM, Pieter Wuille pieter.wui...@gmail.com
wrote:

 But you seem to consider that a bad thing. Maybe saying that you're
 claiming that this equals Bitcoin failing is an exaggeration, but you do
 believe that evolving towards an ecosystem where there is competition for
 block space is a bad thing, right?


No, competition for block space is good.

What is bad is artificially limiting or centrally controlling the supply of
that space.

-- 
--
Gavin Andresen


Re: [bitcoin-dev] Block size following technological growth

2015-08-06 Thread Pieter Wuille via bitcoin-dev
On Thu, Aug 6, 2015 at 4:21 PM, Gavin Andresen gavinandre...@gmail.com
wrote:

 On Thu, Aug 6, 2015 at 10:06 AM, Pieter Wuille pieter.wui...@gmail.com
 wrote:

 But you seem to consider that a bad thing. Maybe saying that you're
 claiming that this equals Bitcoin failing is an exaggeration, but you do
 believe that evolving towards an ecosystem where there is competition for
 block space is a bad thing, right?


 No, competition for block space is good.


So if we would have 8 MB blocks, and there is a sudden influx of users (or
settlement systems, who serve much more users) who want to pay high fees
(let's say 20 transactions per second) making the block chain inaccessible
for low fee transactions, and unreliable for medium fee transactions (for
any value of low, medium, and high), would you be ok with that? If so, why
is 8 MB good but 1 MB not? To me, they're a small constant factor that does
not fundamentally improve the scale of the system. I dislike the outlook of
being forever locked at the same scale while technology evolves, so my
proposal tries to address that part. It intentionally does not try to
improve a small factor, because I don't think it is valuable.
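
The "small constant factor" point is easy to see numerically. A sketch of
the raw throughput implied by a given block size, assuming an average
transaction size of ~250 bytes (my assumption for illustration, not a
figure from this thread):

def max_tps(block_size_bytes, avg_tx_bytes=250, block_interval_s=600):
    # transactions per second a chain of this block size can sustain
    return block_size_bytes / avg_tx_bytes / block_interval_s

for mb in (1, 8):
    print(f"{mb} MB blocks: ~{max_tps(mb * 1_000_000):.1f} tx/s")
# 1 MB blocks: ~6.7 tx/s
# 8 MB blocks: ~53.3 tx/s

Either number is dwarfed by the demand a large settlement system could
generate, which is the point being made here.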

 What is bad is artificially limiting or centrally controlling the supply of
 that space.


It's exactly as centrally limited as the finite supply of BTC - by
consensus. You and I may agree that a finite supply is a good thing, and
may disagree about whether a consensus rule about the block size is a good
idea (and if so, at what level), but it's a choice we make as a community
about the rules of the system we want to use.

-- 
Pieter


Re: [bitcoin-dev] Block size following technological growth

2015-08-06 Thread Mike Hearn via bitcoin-dev
Whilst 1MB to 8MB might seem irrelevant from a pure computer science
perspective, payment demand is not really infinite, at least not if by
"payment" we mean something resembling how current Bitcoin users use the
network.

If we define "payment" to mean the kind of thing that Bitcoin users and
enthusiasts have been doing up until now, then suddenly 1MB to 8MB makes a
ton of sense and doesn't really seem that small: we'd have to increase
usage by nearly an order of magnitude before it becomes an issue again!

If we think of Bitcoin as a business that serves customers, growing our
user base by an order of magnitude would be a great and celebration-worthy
achievement! Not at all a small constant factor :)

And keeping the current user base happy and buying things is extremely
interesting, both to me and Gavin. Without users Bitcoin is nothing at all.
Not a settlement network, not anything.

It's actually going to be quite hard to grow that much. As the white paper
says, the system "works well enough for most transactions". And despite a
lot of effort by many people, killer apps that use Bitcoin's unique
features are still hit and miss. Perhaps Streamium, Lighthouse, ChangeTip,
some distributed exchange or something else will stimulate huge new demand
for transactions in the future... but if so, we're not there yet.


Re: [bitcoin-dev] Block size following technological growth

2015-08-06 Thread Tom Harding via bitcoin-dev
On 8/6/2015 7:53 AM, Pieter Wuille via bitcoin-dev wrote:

 So if we would have 8 MB blocks, and there is a sudden influx of users
 (or settlement systems, who serve much more users) who want to pay
 high fees (let's say 20 transactions per second) making the block
 chain inaccessible for low fee transactions, and unreliable for medium
 fee transactions (for any value of low, medium, and high), would you
 be ok with that?

Gavin has answered this question in the clearest way possible -- in
tested C++ code, which increases capacity only on a precise schedule for
20 years, then stops increasing.
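
For readers unfamiliar with that schedule: it is the BIP 101 approach. A
simplified Python sketch of the idea (my paraphrase, not Gavin's actual
C++, which interpolates between doublings by block timestamp; the
activation timestamp here is assumed):

INITIAL_LIMIT = 8_000_000        # bytes
START = 1_452_470_400            # assumed activation time (early 2016)
TWO_YEARS = 2 * 365 * 24 * 3600  # doubling period, in seconds
MAX_DOUBLINGS = 10               # 20 years of growth, then a hard stop

def max_block_size(block_timestamp: int) -> int:
    if block_timestamp <= START:
        return INITIAL_LIMIT
    doublings = min((block_timestamp - START) // TWO_YEARS, MAX_DOUBLINGS)
    return INITIAL_LIMIT << doublings  # 8 MB doubling up to 8 GB, then constant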




Re: [bitcoin-dev] Block size following technological growth

2015-08-06 Thread Jorge Timón via bitcoin-dev
On Thu, Aug 6, 2015 at 3:40 PM, Gavin Andresen gavinandre...@gmail.com wrote:
 On Wed, Aug 5, 2015 at 9:26 PM, Jorge Timón
 bitcoin-dev@lists.linuxfoundation.org wrote:

 This is a much more reasonable position. I wish this had been the starting
 point of this discussion instead of "the block size limit must be
 increased as soon as possible or bitcoin will fail."


 It REALLY doesn't help the debate when you say patently false statements
 like that.

I'm pretty sure that I can quote Mike Hearn saying a sentence extremely
similar to that in this forum or in some of his blog posts (though not in
https://medium.com/@octskyward/crash-landing-f5cc19908e32 at first
glance...).
But yeah, what people said in the past is not very important: people
change their minds (they even acknowledge their mistakes sometimes).
What interests me more is what people think now.

I don't want to put words in your mouth and you are more than welcome
to correct what I think you think with what you really think.
All I'm trying to do is frame your fears properly.
If I say "all fears related to not raising the block size limit in the
short term can be summarized as a fear of fees rising in the short
term", am I correct? Am I missing some other argument?
Of course, problems that need to be solved regardless of the block
size (like an unbounded mempool) should not be considered for this
discussion.

 My first blog post on this issue is here:
   http://gavinandresen.ninja/why-increasing-the-max-block-size-is-urgent

 ... and I NEVER say Bitcoin will fail.  I say:

 If the number of transactions waiting gets large enough, the end result
 will be an over-saturated network, busy doing nothing productive. I don’t
 think that is likely– it is more likely people just stop using Bitcoin
 because transaction confirmation becomes increasingly unreliable.

If you pay high enough fees, your transactions will likely be mined in
the next block.
So this seems to be reducible to the fees-rising concern, unless I am
missing something.

 So please stop with the over-the-top claims about what the other side
 believe, there are enough of those (on both sides of the debate) on reddit.
 I'd really like to focus on how to move forward, and how best to resolve
 difficult questions like this in the future.

I think I would have a much better understanding of what the other
side thinks if I ever got an answer to a couple of very simple
questions I have been repeating ad nauseam:

1) If not now when will it be a good time to let fees rise above zero?

2) When will you consider a size to be too dangerous for centralization?
In other words, why 20 GB would have been safe but 21 GB wouldn't have
been (or the respective maximums and respective +1 for each block
increase proposal)?

On Thu, Aug 6, 2015 at 4:21 PM, Gavin Andresen gavinandre...@gmail.com wrote:
 What is bad is artificially limiting or centrally controlling the supply of
 that space.

3) Does this mean that you would be in favor of completely removing
the consensus rule that limits mining centralization by imposing an
artificial (like any other consensus rule) block size maximum?

I've been insistently repeating this question too.
Admittedly, it would be a great disappointment if your answer to this
question is yes: that can only mean that either you don't understand
how the consensus rule limits mining centralization or that you don't
care about mining centralization at all.

If you really want things to move forward, please, prove it by
answering these questions so that we don't have to imagine what the
answers are (because what we imagine is probably much worse than your
actual answers).
I'm more than willing to stop trying to imagine what big block
advocates think, but I need the answers to come from the big block
advocates themselves.

Asking repeatedly doesn't seem to be effective, so I will answer the
questions myself in the worst possible way I think a big block
advocate could answer them.
Feel free to replace my stupid answers with your own:

--- (FICTION ANSWERS [given the lack of real answers])

3) Does this mean that you would be in favor of completely removing
the consensus rule that limits mining centralization by imposing an
artificial (like any other consensus rule) block size maximum?

Yes, I would remove the rule because I don't care about mining centralization.

2) When will you consider a size to be too dangerous for centralization?
In other words, why 20 GB would have been safe but 21 GB wouldn't have
been (or the respective maximums and respective +1 for each block
increase proposal)?

Never, as said I don't care about mining centralization.
I thought users and Bitcoin companies would agree with a 20 GB limit
hardfork with proper lobbying, but I certainly prefer 21 GB.
From 1 MB to infinity, the bigger the better, always.

1) If not now when will it be a good time to let fees rise above zero?

Never. Fees are just an excuse, the real goal is making Bitcoin centralized.


Re: [bitcoin-dev] Block size following technological growth

2015-08-06 Thread Gavin Andresen via bitcoin-dev
On Thu, Aug 6, 2015 at 11:25 AM, Jorge Timón jti...@jtimon.cc wrote:

 1) If not now when will it be a good time to let fees rise above zero?


Fees are already above zero. See
http://gavinandresen.ninja/the-myth-of-not-full-blocks


 2) When will you consider a size to be too dangerous for centralization?
 In other words, why 20 GB would have been safe but 21 GB wouldn't have
 been (or the respective maximums and respective +1 for each block
 increase proposal)?


http://gavinandresen.ninja/does-more-transactions-necessarily-mean-more-centralized

 3) Does this mean that you would be in favor of completely removing
 the consensus rule that limits mining centralization by imposing an
 artificial (like any other consensus rule) block size maximum?


I don't believe that the maximum block size has much at all to do with
mining centralization, so I don't accept the premise of the question.

-- 
--
Gavin Andresen


Re: [bitcoin-dev] Block size following technological growth

2015-08-05 Thread Elliot Olds via bitcoin-dev
On Tue, Aug 4, 2015 at 4:59 AM, Jorge Timón 
bitcoin-dev@lists.linuxfoundation.org wrote:

 Also I don't think hitting the limit must be necessarily harmful and
 if it is, I don't understand why hitting it at 1MB will be more
 harmful than hitting it at 2MB, 8MB or 8GB.


I don't think merely hitting the limit is bad. The level of tx fees in
equilibrium gives some clue as to the level of harm being done. If fees are
at $5/tx at 1MB, it's about as bad as if fees are at $5/tx at 4MB.

 There is NO criterion based on mining centralization to decide between
 2 sizes in favor of the small one.
 It seems like the rationale is always "the bigger the better" and
 the only limitation is what a few people concerned with mining
 centralization (while they still have time to discuss this) are
 willing to accept. If that's the case, then there won't be effectively
 any limit in the long term and Bitcoin will probably fail in its
 decentralization goals.
 I think it's the proponents of a blocksize change who should propose
 such a criterion, and now they have the tools to simulate different
 block sizes.


In the absence of harder data, it might be interesting to use these
simulations to graph centralization pressure as a function of bandwidth
cost over time (or other historical variables that affect centralization).
For instance, look at how high centralization pressure was in 2009
according to the simulations, given how cheap/available bandwidth was then,
compared to 2010, 2011, etc. Then we could figure out: given today's
bandwidth situation, what size blocks right now would give us the same
centralization pressure that we had in 2011, 2012, 2013, etc?

Of course this doesn't mean we should blindly assume that the level of
centralization pressure in 2012 was acceptable, and therefore any block
size increase that results in the same amount of pressure now should be
acceptable. But it might lead to a more productive discussion.
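
Even a toy model makes the simulation idea concrete: bigger blocks take
longer to propagate, a block is orphaned roughly when a rival block is
found during that delay, and larger miners suffer less because they never
race their own blocks. A sketch (all parameters are illustrative
assumptions, not measurements):

import math

BLOCK_INTERVAL = 600.0  # seconds

def orphan_rate(block_bytes, hashrate_share, bytes_per_second=1_000_000):
    delay = block_bytes / bytes_per_second          # naive propagation time
    p_race = 1 - math.exp(-delay / BLOCK_INTERVAL)  # rival found during delay
    return p_race * (1 - hashrate_share)            # own hashrate never races itself

for size_mb in (1, 8, 20):
    small = orphan_rate(size_mb * 1_000_000, 0.01)
    big = orphan_rate(size_mb * 1_000_000, 0.30)
    print(f"{size_mb:>2} MB: 1% miner loses {small:.2%}, 30% miner loses {big:.2%}")

Plugging in historical bandwidth figures for each year would give the kind
of centralization-pressure curve described above.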


 I want us to simulate many blocksizes before rushing into a decision
 (specially because I disagree that taking a decision there is urgent
 in the first place).


IMO it is not urgent if the core devs are committed to reacting to a huge
spike in tx fees with a modest block size increase in a relatively short
time frame, if they judge the centralization risks of that increase to be
small. Greg Maxwell posted on reddit a while back something to the effect
of "the big block advocates are overstating the urgency of the block size
increase, because if there was actually a situation that required us to
increase block size, we could make the increase when it was actually
needed." I found that somewhat persuasive, but I am concerned that I
haven't seen any discussion of what the "let's wait for now" camp would
consider a valid reason to increase block size in the short term, and how
they'd make the tradeoff with tx fees or whatever else was necessitating
the increase.

Jorge, if a fee equilibrium developed at 1MB of $5/tx, and you somehow knew
with certainty that increasing to 4MB would result in a 20 cent/tx
equilibrium that would last for a year (otherwise fees would stay around $5
for that year), would you be in favor of an increase to 4MB?

For those familiar with the distinction between near/far mode thinking
popularized by Robin Hanson: focusing on concrete examples like this
encourages problem solving and consensus, and focusing on abstract
principles (like decentralization vs. usability in general) leads people
toward using argument to signal their alliances and reduce the status of
their opponents. I think it'd be very helpful if more 1MB advocates
described what exactly would make them say "OK, in this situation a block
size increase is needed, we should do one quickly!", and also if
Gavin/Mike/Jeff described what hypothetical scenarios and/or test results
would make them want to stick with 1MB blocks for now.


Re: [bitcoin-dev] Block size following technological growth

2015-08-05 Thread Gareth Williams via bitcoin-dev

On 4 August 2015 11:12:36 PM AEST, Gavin Andresen via bitcoin-dev 
bitcoin-dev@lists.linuxfoundation.org wrote:
On Tue, Aug 4, 2015 at 7:27 AM, Pieter Wuille via bitcoin-dev 
bitcoin-dev@lists.linuxfoundation.org wrote:

 I would say that things already demonstrably got terrible. The mining
 landscape is very centralized, with apparently a majority depending on
 agreements to trust each other's announced blocks without validation.

And that is a problem... why?

With all due respect Gavin, large-block advocates appear to hold the position 
that:
* pushing individual economic actors away from running full nodes is a natural 
and unproblematic consequence of block size increase, as they're expected to 
rely on SPV
You now also appear to hold the position that:
* pushing miners to SPV mining is unproblematic

Have I misunderstood? Is one of these not an expected outcome of large blocks? 
I can understand the validity of either argument alone -- the assertion that we 
can trust miners to validate and just trust most-POW ourselves, /or/ the 
assertion that lack of miner validation is safe under certain circumstances. 
But together?

Who do you expect to actually validate large blocks if not miners, and what do 
you expect their incentive to do so to be?




Re: [bitcoin-dev] Block size following technological growth

2015-08-05 Thread Jorge Timón via bitcoin-dev
On Wed, Aug 5, 2015 at 9:29 AM, Elliot Olds elliot.o...@gmail.com wrote:
 On Tue, Aug 4, 2015 at 4:59 AM, Jorge Timón
 bitcoin-dev@lists.linuxfoundation.org wrote:

 Also I don't think hitting the limit must be necessarily harmful and
 if it is, I don't understand why hitting it at 1MB will be more
 harmful than hitting it at 2MB, 8MB or 8GB.


 I don't think merely hitting the limit is bad. The level of tx fees in
 equilibrium give some clue as to the level of harm being done. If fees are
 at $5/tx at 1MB, it's about as bad as if fees are at $5/tx at 4MB.

This is a much more reasonable position. I wish this had been the starting
point of this discussion instead of "the block size limit must be
increased as soon as possible or bitcoin will fail."
If the only fear about not increasing the block size fast enough is
that fees may rise, Pieter wouldn't have had to repeatedly explain that for
any reasonable block size some use case may fill the blocks rapidly.

 There is NO criterion based on mining centralization to decide between
 2 sizes in favor of the small one.
 It seems like the rationale is always "the bigger the better" and
 the only limitation is what a few people concerned with mining
 centralization (while they still have time to discuss this) are
 willing to accept. If that's the case, then there won't be effectively
 any limit in the long term and Bitcoin will probably fail in its
 decentralization goals.
 I think it's the proponents of a blocksize change who should propose
 such a criterion and now they have the tools to simulate different
 block sizes.


 In the absence of harder data, it might be interesting to use these
 simulations to graph centralization pressure as a function of bandwidth cost
 over time (or other historical variables that affect centralization). For
 instance, look at how high centralization pressure was in 2009 according to
 the simulations, given how cheap/available bandwidth was then, compared to
 2010, 2011, etc. Then we could figure out: given today's bandwidth
 situation, what size blocks right now would give us the same centralization
 pressure that we had in 2011, 2012, 2013, etc?

 Of course this doesn't mean we should blindly assume that the level of
 centralization pressure in 2012 was acceptable, and therefore any block size
 increase that results in the same amount of pressure now should be
 acceptable. But it might lead to a more productive discussion.

This sounds good overall but I'm afraid you are oversimplifying some things.
Centralization pressure doesn't come only from global average bandwidth
costs, and block propagation time is not the only concern.
Here's an extreme example: [1]
But anyway, yes, I agree: ANY metric would be better than nothing (the
current situation).

 I want us to simulate many blocksizes before rushing into a decision
 (specially because I disagree that taking a decision there is urgent
 in the first place).


 IMO it is not urgent if the core devs are committed to reacting to a huge
 spike in tx fees with a modest block size increase in a relatively short
 time frame, if they judge the centralization risks of that increase to be
 small. Greg Maxwell posted on reddit a while back something to the effect of
 the big block advocates are overstating the urgency of the block size
 increase, because if there was actually a situation that required us to
 increase block size, we could make the increase when it was actually
 needed. I found that somewhat persuasive, but I am concerned that I haven't
 seen any discussion of what the let's wait for now camp would consider a
 valid reason to increase block size in the short term, and how they'd make
 the tradeoff with tx fees or whatever else was necessitating the increase.

Given that for any non-absurdly-big size some transactions will
eventually be priced out, and that the consensus rule serves for
limiting mining centralization (and more indirectly centralization in
general) and not about trying to set a given average transaction fee,
I think the current level of mining centralization will always be more
relevant than the current fee level when discussing any change to the
consensus rule to limit centralization (at any point in time).
In other words, the question "can we change this without important
risks of destroying the decentralized properties of the system in the
short or long run?" should always be more important than "is there a
concerning rise in fees to motivate this change at all?".

 Jorge, if a fee equilibrium developed at 1MB of $5/tx, and you somehow knew
 with certainty that increasing to 4MB would result in a 20 cent/tx
 equilibrium that would last for a year (otherwise fees would stay around $5
 for that year), would you be in favor of an increase to 4MB?

As said, I would always consider the centralization risks first: I'd
rather have a $5/tx decentralized Bitcoin than a Bitcoin with free
transactions but effectively validated (when they validate blocks they
mine on top of) by around 10 miners.

Re: [bitcoin-dev] Block size following technological growth

2015-08-04 Thread Hector Chu via bitcoin-dev
Mike's position is that he wants the block size limit
to eventually be removed. That is of course an extreme view. Meanwhile,
your view that the block size should be artificially constrained below the
organic growth curve (in a way that will penalize a majority of existing
and future users) lies at the other extreme. The majority position lies
somewhere in between (i.e. a one-time increase to 8MB). This is the
position that ultimately matters.

If the block size is increased to 8MB and things get demonstrably a whole
lot worse, then you will have a solid leg to stand on. In that case we can
always do another hard fork later to reduce the block size back to
something smaller, and henceforth the block size will never be touched
again.

On 4 August 2015 at 11:35, Jorge Timón 
bitcoin-dev@lists.linuxfoundation.org wrote:

 On Fri, Jul 31, 2015 at 4:58 PM, Mike Hearn he...@vinumeris.com wrote:
  How more users or more nodes can bring more miners, or more importantly,
  improve mining decentralization?
 
 
  Because the bigger the ecosystem is the more interest there is in taking
  part?

 As explained by Venzen, this is a non-sequitur.

  I mean, I guess I don't know how to answer your question.

 I don't know the answer either, that's fine. It's the opposite
 question that I've been insistently repeating and you've been
 (consciously or not) consistently evading.
 But that's also fine because I believe you finally answer it a few lines
 below.

  When Bitcoin was
  new it had almost no users and almost no miners. Now there are millions
 of
  users and factories producing ASICs just for Bitcoin.

 The emergence of a btc price enabled the emergence of professional
 miners, which in turn enabled the emergence of sha256d-specialized
 hardware production companies.
 Nothing surprising there.
 By no means does it constitute an example of how bigger consensus sizes
 can cause less mining centralization.

  Surely the correlation is obvious?

 Correlation does not imply causation. I'd better leave it at that...

  I'm sorry, but until there's a simulation that I can run with different
  sizes' testchains (for example using #6382) to somehow compare them, I
 will
  consider any value arbitrary.
 
 
  Gavin did run simulations. 20mb isn't arbitrary, the process behind it
 was
  well documented here:
 
 
 http://gavinandresen.ninja/does-more-transactions-necessarily-mean-more-centralized
 
  I chose 20MB as a reasonable block size to target because 170 gigabytes
 per
  month comfortably fits into the typical 250-300 gigabytes per month data
  cap– so you can run a full node from home on a “pretty good” broadband
 plan.
 
  Did you think 20mb was picked randomly?

 No, I think 20 MB was chosen very optimistically, considering 3rd-party
 service rates (not the same service as self-hosting) in the
 so-called first world. And then 20 MB goes to 20 GB, again with
 optimistic and by no means scientific expectations.
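
 (For reference, the arithmetic behind the quoted 170 GB figure works out
 roughly as follows; the assumption that relaying to peers doubles the
 download cost is mine, not stated in the post:

 BLOCK_MB = 20
 BLOCKS_PER_DAY = 24 * 6  # one block every ten minutes
 download_gb = BLOCK_MB * BLOCKS_PER_DAY * 30 / 1000  # ~86 GB/month received
 total_gb = download_gb * 2                           # ~173 GB with relaying

 which lands just under the 250-300 GB caps Gavin cites.)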

 But where the number comes from is not really what I'm demanding;
 what I want is some criterion that can tell you that a given size
 would be too centralized but another one isn't.
 I haven't read any analysis on why 8GB is a better option than 7GB and
 9GB for a given criterion (nor one declaring 20 GB a winner over 19 GB
 or 21 GB).
 A simulation test passing 20 GB but not 21 GB would make it far less
 arbitrary.

  Agreed on the first sentence, I'm just saying that the influence of
  the blocksize in that function is monotonic: with bigger sizes, equal
  or worse mining centralization.
 
 
  I have a hard time agreeing with this because I've seen Bitcoin go from
  blocks that were often empty to blocks that are often full, and in this
 time
  the number of miners and hash power on the network has gone up a huge
 amount
  too.

 I'm of course talking about consensus maximum blocksize, not about
 actual blocksize.
 Yes, again, when mining becomes profitable, economic actors tend to
 appear and get those profits.
 But don't confuse total hashrate improvements with an increase in the
 number of miners or with mining decentralization.

  You can argue that a miner doesn't count if they pool mine. But if a
 miner
  mines on a pool that uses exactly the same software and settings as the
  miner would have done anyway, then it makes no difference. Miners can
 switch
  between pools to find one that works the way they like, so whilst less
  pooling or more decentralised pools would be nice (e.g.
 getblocktemplate),
  and I've written about how to push it forward before, I still say there
 are
  many more miners than in the past.
 
  If I had to pick between two changes to improve mining decentralisation:
 
  1) Lower block size

 Finally, I think you answered my repetitive question here.
 If I say "Mike Hearn understands that the consensus block size maximum
 rule is a tool for limiting mining centralization", I'm not putting
 words in your mouth, right?
 I think many users advocating for an increase in the 

Re: [bitcoin-dev] Block size following technological growth

2015-08-04 Thread Pieter Wuille via bitcoin-dev
On Tue, Aug 4, 2015 at 3:12 PM, Gavin Andresen gavinandre...@gmail.com
wrote:

 On Tue, Aug 4, 2015 at 7:27 AM, Pieter Wuille via bitcoin-dev 
 bitcoin-dev@lists.linuxfoundation.org wrote:

 I would say that things already demonstrably got terrible. The mining
 landscape is very centralized, with apparently a majority depending on
 agreements to trust each other's announced blocks without validation.

 And that is a problem... why?


If miners need to form alliances of trusting each other's blocks without
validation to overcome the inefficiencies of slow block propagation, I
think we have a system that is in direct conflict with the word
"permissionless" that you use later.


 As Bitcoin grows, pieces of the ecosystem will specialize. Satoshi's
 original code did everything: hashing, block assembly, wallet, consensus,
 network. That is changing, and that is OK.


Specialization is perfectly fine.

 I believe that if the above would have happened overnight, people would
 have cried wolf. But somehow it happened slow enough, and things kept
 working.

 I don't think that this is a good criterion. Bitcoin can work with
 gigabyte blocks today, if everyone uses the same few blockchain validation
 services, the same few online wallets, and mining is done by a cartel that
 only allows joining after signing a contract so they can sue you if you
 create an invalid block. Do you think people will then agree that things
 got demonstrably worse?

 Don't turn Bitcoin into something uninteresting, please.

Why is what you, personally, find interesting relevant?


I find it interesting to build a system that has potential to bring about
innovation.

 I understand you want to build an extremely decentralized system, where
 everybody participating trusts nothing except the genesis block hash.


That is not true, I'm sorry if that is the impression I gave.

I see centralization and scalability as a trade-off, and for better or for
worse, the block chain only offers one trade-off. I want to see technology
built on top that introduces lower levels of trust than typical fully
centralized systems, while offering increased convenience, speed,
reliability, and scale. I just don't think that all of that can happen on
the lowest layer without hurting everything built on top. We need different
trade-offs, and the blockchain is just one, but a very fundamental one.

 I think it is more interesting to build a system that works for hundreds of
 millions of people, with no central point of control and the opportunity
 for ANYBODY to participate at any level. Permission-less innovation is what
 I find interesting.


That sounds amazing, but do you think that Bitcoin, as it exists today, can
scale to hundreds of millions of users, while retaining any glimpse of
permission-lessness and decentralization? I think we need low-trust
off-chain systems and other innovations to make that happen.


 And I think the current demonstrably terrible Bitcoin system is still
 INCREDIBLY interesting.


I'm happy for you, then.

-- 
Pieter


Re: [bitcoin-dev] Block size following technological growth

2015-08-04 Thread Hector Chu via bitcoin-dev
Things apparently aren't bad enough to prevent the majority from clamoring
for larger blocks.

If the majority agreed that things had gotten worse up to this point, and that
this was to be blamed on the block size, they would be campaigning in the
other direction. Even you yourselves aren't asking for a reduction in the block
size, as you know full well that you would be laughed out.

On 4 August 2015 at 12:27, Pieter Wuille pieter.wui...@gmail.com wrote:

 I would say that things already demonstrably got terrible. The mining
 landscape is very centralized, with apparently a majority depending on
 agreements to trust each other's announced blocks without validation. Full
 node count is at its historically lowest value in years, and outsourcing of
 full validation keeps growing.

 I believe that if the above would have happened overnight, people would
 have cried wolf. But somehow it happened slow enough, and things kept
 working.

 I don't think that this is a good criterion. Bitcoin can work with
 gigabyte blocks today, if everyone uses the same few blockchain validation
 services, the same few online wallets, and mining is done by a cartel that
 only allows joining after signing a contract so they can sue you if you
 create an invalid block. Do you think people will then agree that things
 got demonstrably worse?

 Don't turn Bitcoin into something uninteresting, please.

 --
 Pieter
 On Aug 4, 2015 1:04 PM, Hector Chu via bitcoin-dev 
 bitcoin-dev@lists.linuxfoundation.org wrote:

 Mike's position is that he wants the block size limit
 to eventually be removed. That is of course an extreme view. Meanwhile,
 your view that the block size should be artificially constrained below the
 organic growth curve (in a way that will penalize a majority of existing
 and future users) lies at the other extreme. The majority position lies
 somewhere in between (i.e. a one-time increase to 8MB). This is the
 position that ultimately matters.

 If the block size is increased to 8MB and things get demonstrably a whole
 lot worse, then you will have a solid leg to stand on. In that case we can
 always do another hard fork later to reduce the block size back to
 something smaller, and henceforth the block size will never be touched
 again.

 On 4 August 2015 at 11:35, Jorge Timón 
 bitcoin-dev@lists.linuxfoundation.org wrote:

 On Fri, Jul 31, 2015 at 4:58 PM, Mike Hearn he...@vinumeris.com wrote:
  How can more users or more nodes bring more miners or, more importantly,
  improve mining decentralization?
 
 
  Because the bigger the ecosystem is the more interest there is in taking
  part?

As explained by Venzen, this is a non sequitur.

  I mean, I guess I don't know how to answer your question.

I don't know the answer either; that's fine. It's the opposite
question that I've been insistently repeating and you've been
(consciously or not) consistently evading.
But that's also fine, because I believe you finally answer it a few lines
below.

  When Bitcoin was
  new it had almost no users and almost no miners. Now there are
 millions of
  users and factories producing ASICs just for Bitcoin.

The emergence of a btc price enabled the emergence of professional
miners, which in turn enabled the emergence of sha256d-specialized
hardware production companies.
Nothing surprising there.
By no means does it constitute an example of how bigger consensus sizes
can cause less mining centralization.

  Surely the correlation is obvious?

Correlation does not imply causation. I'd better leave it at that...

  I'm sorry, but until there's a simulation that I can run with different
  sizes' testchains (for example using #6382) to somehow compare them, I will
  consider any value arbitrary.
 
 
  Gavin did run simulations. 20mb isn't arbitrary, the process behind it was
  well documented here:

  http://gavinandresen.ninja/does-more-transactions-necessarily-mean-more-centralized
 
  I chose 20MB as a reasonable block size to target because 170 gigabytes per
  month comfortably fits into the typical 250-300 gigabytes per month data
  cap– so you can run a full node from home on a “pretty good” broadband plan.
 
  Did you think 20mb was picked randomly?

No, I think 20 MB was chosen very optimistically, considering third-party
service rates (not the same service as self-hosting) in the
so-called first world. And then 20 MB goes to 20 GB, again with
optimistic and by no means scientific expectations.

But where the number comes from is not really what I'm demanding;
what I want is some criterion that can tell you that a given size
would be too centralized but another one isn't.
I haven't read any analysis on why 8 GB is a better option than 7 GB or
9 GB for a given criterion (nor one declaring 20 GB a winner over 19 GB
or 21 GB).
A simulation test passing 20 GB but not 21 GB would make it far less
arbitrary.
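
To make the kind of criterion being asked for concrete, here is a purely
illustrative toy model (not something proposed in this thread, and with
invented constants): score a block size by the orphan-rate gap it opens
between a well-connected and a poorly connected miner.

import math

BLOCK_INTERVAL = 600.0  # expected seconds between blocks

def orphan_rate(block_mb, seconds_per_mb):
    # P(a competing block is found while ours is still propagating),
    # assuming Poisson block discovery and propagation time linear in size.
    return 1.0 - math.exp(-(block_mb * seconds_per_mb) / BLOCK_INTERVAL)

for size in (1, 8, 20):
    well = orphan_rate(size, 2.0)    # hypothetical well-connected miner
    poor = orphan_rate(size, 15.0)   # hypothetical poorly connected miner
    print("%3d MB: well ~%.2f%%, poor ~%.2f%%, gap ~%.2f%%"
          % (size, 100 * well, 100 * poor, 100 * (poor - well)))

The gap widens monotonically with size, so a threshold on it is at least
the right shape for a criterion that could pass one size and fail the
next one up, however crude the model.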

  Agreed on the first sentence, I'm just saying that the influence of
  the blocksize in that 

Re: [bitcoin-dev] Block size following technological growth

2015-08-04 Thread Hector Chu via bitcoin-dev
On 4 August 2015 at 14:13, Jorge Timón jti...@jtimon.cc wrote:

 2) It doesn't matter who is to blame for the current centralization:
 the fact remains that the blocksize maximum is the only** consensus
 rule to limit mining centralization.


Repeating a claim ad nauseam doesn't make it necessarily true. A block size
limit won't prevent miners in the future from buying each other out.


Re: [bitcoin-dev] Block size following technological growth

2015-08-04 Thread Jorge Timón via bitcoin-dev
On Tue, Aug 4, 2015 at 2:19 PM, Hector Chu hector...@gmail.com wrote:
 On 4 August 2015 at 12:59, Jorge Timón jti...@jtimon.cc wrote:

 That is not my position. Again, I don't know what the right blocksize
 for the short term is (I don't think anybody does).

 You have no position (i.e. neutral). In other words, keeping the existing
 limit.

No, I think 1 MB is just as arbitrary as any other size proposed.
All I want is for consensus change proponents to try harder to
convince other users (including me).

 Therefore how the change can affect mining centralization must be the
 main concern, instead of (also artificial) projections about usage
 growth (no matter how organic their curves look).


 The degree of mining decentralization is only one of many concerns. Users'
 main concern is timely confirmation of low-fee transactions. Miners' concern
 is the amount of profit they make.

No, if the changed rule only serves to limit centralization, then how
the change affects that limit on centralization should be the first
thing to consider.
If miners' concern were only the amount of profit they make, they
wouldn't already be mining free transactions.
You cannot possibly know what all users are concerned about, so I will
just ignore any further claim in that direction.
Speak for yourself: your arguments won't be more reasonable just
because you claim that all users think like you do.

 Also I don't think hitting the limit must necessarily be harmful and
 if it is, I don't understand why hitting it at 1MB will be more
 harmful than hitting it at 2MB, 8MB or 8GB.


 The limit won't even get to be hit, because all the users that get thrown
 out of Bitcoin will have moved over to a system supporting a larger block
 size.

I disagree with this wild prediction as well.

 I don't know where you get your majority from or what it even means
 (majority of users, majority of the coins, of miners?)


 The majority which the miners are beholden to is the economic majority.
 https://en.bitcoin.it/wiki/Economic_majority

And I assume that vaguely defined economic majority communicates
with you through a crystal ball or something.

 But there's something I'm missing there... why doesn't my position
 matter if it's not a majority?


 Your position is only one of many and it does not carry excess weight over
 the others. Individually it won't matter, because you can't control the
 implementation that other people run.

No more, but no less either.
Nobody can control the implementation that I (or other people
concerned with centralization) run either.

 How is what the majority has been told is best an objective
 argument?


 Don't fight the market. The way the system is designed, the miners will
 follow along with what the economic majority have decided.

How is allowing fees to rise above zero fighting the market?
The system is currently designed with a 1 MB limit. I don't think
that's sacred or anything, but I really don't feel like I'm fighting
the market or the way the system is designed.
In any case, what do the market and the way the system is designed
have to do with what the majority have been told is best (which you
seem to think should be a source of truth for some reason I'm still
missing)?

 So if you say 8, I must ask, why not 9?
 Why is 9 MB not safe for mining centralization but 8 MB is?


 8MB has simply been the focal point for this debate. 9MB is also safe if 8MB
 is, but I suppose the opponents will be even less happy with 9 than with 8,
 and we don't want to unnecessarily increase the conflict.

Why is 9 MB safe but 10 MB isn't?
The conflict won't be resolved by evading hard questions...

 It seems like the rationale is always the bigger the better and
 the only limitation is what a few people concerned with mining
 centralization (while they still have time to discuss this) are
 willing to accept. If that's the case, then there won't effectively be
 any limit in the long term and Bitcoin will probably fail in its
 decentralization goals.


 A one-time increase to 8MB is safer than a dynamically growing limit over
 time for exactly this reason. Admittedly whenever the next debate to
 increase the block size over 8MB happens it will be even more painful and
 non-obvious, but that is the safety check to prevent unbounded block size
 increase.

Will there ever be a debate that concludes that further blocksize
increases at this point are very risky for mining centralization?
How will we tell then? Can't we use the same criteria now?


Re: [bitcoin-dev] Block size following technological growth

2015-08-04 Thread Venzen Khaosan via bitcoin-dev
On 08/04/2015 08:28 PM, Hector Chu via bitcoin-dev wrote:
 On 4 August 2015 at 14:13, Jorge Timón jti...@jtimon.cc wrote:
 
 2) It doesn't matter who is to blame for the current
 centralization: the fact remains that the blocksize maximum is the
 only** consensus rule to limit mining centralization.
 
 
 Repeating a claim ad nauseam doesn't make it necessarily true. A
 block size limit won't prevent miners in the future from buying
 each other out.
 

It plays both ways: the technical list requires a proof of concept and
simulation upon which to base judgment going forward, else we'll just
be talking out our backsides (like certain people) without making any
progress.

The tools for simulation exist: Jorge Timon created a variable
blocksize regtest PR here:
https://github.com/bitcoin/bitcoin/pull/6382

No-one needs to postulate what different blocksizes imply - anyone
can run a simulation and demonstrate what they're talking about.

 


Re: [bitcoin-dev] Block size following technological growth

2015-08-04 Thread Alex Morcos via bitcoin-dev
On Tue, Aug 4, 2015 at 9:12 AM, Gavin Andresen via bitcoin-dev 
bitcoin-dev@lists.linuxfoundation.org wrote:

 On Tue, Aug 4, 2015 at 7:27 AM, Pieter Wuille via bitcoin-dev 
 bitcoin-dev@lists.linuxfoundation.org wrote:

 I would say that things have already demonstrably become terrible. The mining
 landscape is very centralized, with apparently a majority depending on
 agreements to trust each other's announced blocks without validation.

 And that is a problem... why?

 As far as I can tell, nobody besides miners running old and/or buggy
 software lost money due to outsourced mining validation (please correct me
 if I'm wrong-- I'm looking forward to Greg's post-mortem). The operators of
 bitcoin.org seem to have freaked out and pushed the panic button (with
 dire warnings of not trusting transactions until 20 confirmations), but
 theymos was well known for using an old, patched version of Core for
 blockexplorer.com so maybe that's not surprising.


I'm also looking forward to Greg's post-mortem, because I had a completely
different takeaway from the BIP66 mini-forks.  My view is that despite the
extremely cautious and conservative planning for the completely
uncontentious fork, the damage could and would have been very significant
if it had not been for several core devs manually monitoring, intervening
and problem-solving for other network participants.  I don't believe that's
the way the system should work.  Participants in the Bitcoin community have
come to rely on the devs for just making sure everything works for them.
That's not sustainable.  The system needs to be made fundamentally more
secure if it's going to succeed, not depend on the good will of any
particular parties; otherwise it certainly will no longer be permissionless.

The BIP66 fork was urgently required to fix an undisclosed consensus bug,
unanimously agreed on and without technical objection, and it was still
fraught with problems.  That's the most clear-cut example of when we should
have a fork.  A change to a consensus limit that a significant proportion
of the community disagrees with, for economic or technical reasons or both,
should be raising a sea of red flags.


Re: [bitcoin-dev] Block size following technological growth

2015-08-04 Thread Venzen Khaosan via bitcoin-dev
On 08/04/2015 08:12 PM, Gavin Andresen via bitcoin-dev wrote:
 On Tue, Aug 4, 2015 at 7:27 AM, Pieter Wuille via bitcoin-dev 
 bitcoin-dev@lists.linuxfoundation.org 
 mailto:bitcoin-dev@lists.linuxfoundation.org wrote:
 
 I would say that things have already demonstrably become terrible. The
 mining landscape is very centralized, with apparently a majority
 depending on agreements to trust each other's announced blocks
 without validation.
 
 And that is a problem... why?
 
 As far as I can tell,

[snip]


It's a big problem. What are you dismissing it for?

With Bitcoin in a fledgling 0.x version state this is neither desirable
nor encouraging.

Development did not freeze at some time in the past, leaving us now to
see how the userbase reacts. Miners, btw, are arguably still a class of
user as long as they are guaranteed a coinbase reward.

When they start making and proving themselves useful in a free-floating
fee market absent the coinbase subsidy, we can revisit this topic, with
the benefit of hindsight.

 As Bitcoin grows, pieces of the ecosystem will specialize.
 Satoshi's original code did everything: hashing, block assembly,
 wallet, consensus, network. That is changing, and that is OK.

[snip]

 
 And I think the current demonstrably terrible Bitcoin system is
 still INCREDIBLY interesting.
 
Pieter never said it wasn't interesting, so this emphatic statement is
strange - like someone is trying to convince an audience - but
anyway... as you, a veritable spring-chicken by your actions and
words, said the other day: having graduated in '88 you're old and
speak from experience. Don't come with that jive, bossy man - bring
facts and testing to the technical list.

My finance readers, in one camp, and Bitcoin investors, in the other,
want to see the XT 8MB hard-fork testing data that you mentioned for
BIP100.

Being ignorant is not so much a shame, as being unwilling to learn.
- Benjamin Franklin

 -- Gavin Andresen
 

Venzen Khaosan


Re: [bitcoin-dev] Block size following technological growth

2015-08-04 Thread Gavin Andresen via bitcoin-dev
On Tue, Aug 4, 2015 at 7:27 AM, Pieter Wuille via bitcoin-dev 
bitcoin-dev@lists.linuxfoundation.org wrote:

 I would say that things have already demonstrably become terrible. The mining
 landscape is very centralized, with apparently a majority depending on
 agreements to trust each other's announced blocks without validation.

And that is a problem... why?

As far as I can tell, nobody besides miners running old and/or buggy
software lost money due to outsourced mining validation (please correct me
if I'm wrong-- I'm looking forward to Greg's post-mortem). The operators of
bitcoin.org seem to have freaked out and pushed the panic button (with dire
warnings of not trusting transactions until 20 confirmations), but theymos
was well known for using an old, patched version of Core for
blockexplorer.com so maybe that's not surprising.

As Bitcoin grows, pieces of the ecosystem will specialize. Satoshi's
original code did everything: hashing, block assembly, wallet, consensus,
network. That is changing, and that is OK.

I understand there are parts of the ecosystem you'd rather not see
specialized, like transaction selection / block assembly or validation. I
see it as a natural maturation. The only danger I see is if some unnatural
barriers to competition spring up.

 Full node count is at its lowest value in years, and
outsourcing of full validation keeps growing.

Both side effects of increasing specialization, in my opinion. Many
companies quite reasonably would rather hire somebody who specializes in
running nodes, keeping keys secure, etc rather than develop that expertise
themselves.

Again, not a problem UNLESS some unnatural barriers to competition spring
up.


 I believe that if the above had happened overnight, people would
 have cried wolf. But somehow it happened slowly enough, and things kept
 working.

 I don't think that this is a good criterion. Bitcoin can work with
 gigabyte blocks today, if everyone uses the same few blockchain validation
 services, the same few online wallets, and mining is done by a cartel that
 only allows joining after signing a contract so they can sue you if you
 create an invalid block. Do you think people will then agree that things
 got demonstrably worse?

 Don't turn Bitcoin into something uninteresting, please.

Why is what you, personally, find interesting relevant?

I understand you want to build an extremely decentralized system, where
everybody participating trusts nothing except the genesis block hash.

I think it is more interesting to build a system that works for hundreds of
millions of people, with no central point of control and the opportunity
for ANYBODY to participate at any level. Permission-less innovation is what
I find interesting.

And I think the current demonstrably terrible Bitcoin system is still
INCREDIBLY interesting.

--
Gavin Andresen


Re: [bitcoin-dev] Block size following technological growth

2015-08-04 Thread Jorge Timón via bitcoin-dev
On Tue, Aug 4, 2015 at 1:34 PM, Hector Chu hector...@gmail.com wrote:
 Things apparently aren't bad enough to prevent the majority from clamoring
 for larger blocks.

Nobody is preventing anyone from clamoring for anything. Some developers
are encouraging users to ask for bigger blocks.
Others don't want to impose consensus rule changes against the will of
the users (even if they're 10% of the users).
Still, "things apparently aren't bad enough" is just your opinion.

 If the majority agreed that things had got worse up to this point, and that
 this was to be blamed on the block size, they would be campaigning in the
 other direction. Even you yourselves aren't asking for a reduction in the block
 size, as you know full well that you would be laughed out.

1) I don't care what the so-called majority thinks: I don't want to
impose consensus rule changes against the will of a reasonable
minority.
2) It doesn't matter who is to blame for the current centralization:
the fact remains that the blocksize maximum is the only** consensus
rule to limit mining centralization.
3) In fact I think Luke Dashjr proposed reducing it to 400 KB, but I
would ask the same thing: please create a simulation in which the
change is better (or at least not much worse) than the current rules by
ANY metric.

Please read point 2 with special attention, because it's not the
first time I've said this in this thread.

** There's also the maximum block sigops consensus rule to limit
mining centralization.


Re: [bitcoin-dev] Block size following technological growth

2015-08-04 Thread Venzen Khaosan via bitcoin-dev
On 08/04/2015 06:34 PM, Hector Chu via bitcoin-dev wrote:
 Things apparently aren't bad enough to prevent the majority from 
 clamoring for larger blocks.
 
 If the majority agreed that things had got worse up to this point,
 and that this was to be blamed on the block size, they would be
 campaigning in the other direction. Even you yourselves aren't asking
 for a reduction in the block size, as you know full well that you
 would be laughed out.
 

Hector, if you could provide data that shows why 8MB is better
than 6.18MB or 1MB then we'd get out of the realm of opinion and
pointless rhetoric that threatens to keep this debate in a quagmire.
We'd have actual figures to work with and projections to go by.

But invoking majority agreement (from where?) does not cut it for
setting Bitcoin on a future path. If we go by that, then we'd soon be
giving coinbase rewards to users for being loyal supporters because,
as a majority, they think that's what they'd like to see.

If a proposal is demonstrably, and provably, a good idea - and a
developer consensus agrees - then it should go to testing, and
eventually, code. Other than that it's just conjecture and words
without a research paper and data.

In the final analysis, do we want Bitcoin to be steered by an
uninformed and fickle majority, or do we want to use this list as a
forum to present research proposals containing repeatable, verifiable
facts? A progressive process of convincing those most familiar with
Bitcoin's code and operation so they may implement Good Ideas during
the next century and after is surely preferable to Vote-my-code-Coin. :)


 On 4 August 2015 at 12:27, Pieter Wuille pieter.wui...@gmail.com wrote:
 
 I would say that things have already demonstrably become terrible. The
 mining landscape is very centralized, with apparently a majority
 depending on agreements to trust each other's announced blocks
 without validation. Full node count is at its lowest
 value in years, and outsourcing of full validation keeps growing.

 I believe that if the above had happened overnight, people
 would have cried wolf. But somehow it happened slowly enough, and
 things kept working.

 I don't think that this is a good criterion. Bitcoin can work
 with gigabyte blocks today, if everyone uses the same few
 blockchain validation services, the same few online wallets, and
 mining is done by a cartel that only allows joining after signing
 a contract so they can sue you if you create an invalid block. Do
 you think people will then agree that things got demonstrably
 worse?
 
 Don't turn Bitcoin into something uninteresting, please.
 
 -- Pieter
 
 On Aug 4, 2015 1:04 PM, Hector Chu via bitcoin-dev
 bitcoin-dev@lists.linuxfoundation.org wrote:
 
 Mike's position is that he wants the block size limit to
 eventually be removed. That is of course an extreme view.
 Meanwhile, your view that the block size should be artificially
 constrained below the organic growth curve (in a way that will
 penalize a majority of existing and future users) lies at the other
 extreme. The majority position lies somewhere in between (i.e. a
 one-time increase to 8MB). This is the position that ultimately
 matters.
 
 If the block size is increased to 8MB and things get demonstrably
 a whole lot worse, then you will have a solid leg to stand on. In 
 that case we can always do another hard fork later to reduce the 
 block size back to something smaller, and henceforth the block
 size will never be touched again.
 
 On 4 August 2015 at 11:35, Jorge Timón
 bitcoin-dev@lists.linuxfoundation.org wrote:
 
 On Fri, Jul 31, 2015 at 4:58 PM, Mike Hearn he...@vinumeris.com wrote:
 How can more users or more nodes bring more miners or, more
 importantly, improve mining decentralization?
 
 
 Because the bigger the ecosystem is the more interest there is
 in taking part?
 
 As explained by Venzen, this is a non sequitur.
 
 I mean, I guess I don't know how to answer your question.
 
 I don't know the answer either; that's fine. It's the opposite
 question that I've been insistently repeating and you've been
 (consciously or not) consistently evading. But that's also fine,
 because I believe you finally answer it a few lines below.
 
 When Bitcoin was new it had almost no users and almost no
 miners. Now there are millions of users and factories producing
 ASICs just for Bitcoin.
 
 The emergence of a btc price enabled the emergence of professional
 miners, which in turn enabled the emergence of sha256d-specialized
 hardware production companies. Nothing surprising there. By no
 means does it constitute an example of how bigger consensus sizes can
 cause less mining centralization.
 
 Surely the correlation is obvious?
 
 Correlation does not imply causation. I 

Re: [bitcoin-dev] Block size following technological growth

2015-08-04 Thread Hector Chu via bitcoin-dev
On 4 August 2015 at 12:59, Jorge Timón jti...@jtimon.cc wrote:

 That is not my position. Again, I don't know what the right blocksize
 for the short term is (I don't think anybody does).


You have no position (i.e. neutral). In other words, keeping the existing
limit.


 Therefore how the change can affect mining centralization must be the
 main concern, instead of (also artificial) projections about usage
 growth (no matter how organic their curves look).


The degree of mining decentralization is only one of many concerns. Users'
main concern is timely confirmation of low-fee transactions. Miners'
concern is the amount of profit they make.


 Also I don't think hitting the limit must necessarily be harmful and
  if it is, I don't understand why hitting it at 1MB will be more
  harmful than hitting it at 2MB, 8MB or 8GB.


The limit won't even get to be hit, because all the users that get thrown
out of Bitcoin will have moved over to a system supporting a larger block
size.

I don't know where you get your majority from or what it even means
 (majority of users, majority of the coins, of miners?)


The majority which the miners are beholden to is the economic majority.
https://en.bitcoin.it/wiki/Economic_majority


But there's something I'm missing there... why doesn't my position
 matter if it's not a majority?


Your position is only one of many and it does not carry excess weight over
the others. Individually it won't matter, because you can't control the
implementation that other people run.


 How is what the majority has been told is best an objective argument?


Don't fight the market. The way the system is designed, the miners will
follow along with what the economic majority have decided.

So if you say 8, I must ask, why not 9?
 Why is 9 MB not safe for mining centralization but 8 MB is?


8MB has simply been the focal point for this debate. 9MB is also safe if
8MB is, but I suppose the opponents will be even less happy with 9 than
with 8, and we don't want to unnecessarily increase the conflict.

It seems like the rationale is always the bigger the better and
 the only limitation is what a few people concerned with mining
 centralization (while they still have time to discuss this) are
 willing to accept. If that's the case, then there won't effectively be
 any limit in the long term and Bitcoin will probably fail in its
 decentralization goals.


A one-time increase to 8MB is safer than a dynamically growing limit over
time for exactly this reason. Admittedly whenever the next debate to
increase the block size over 8MB happens it will be even more painful and
non-obvious, but that is the safety check to prevent unbounded block size
increase.


Re: [bitcoin-dev] Block size following technological growth

2015-08-04 Thread Jorge Timón via bitcoin-dev
On Tue, Aug 4, 2015 at 3:28 PM, Hector Chu hector...@gmail.com wrote:
 On 4 August 2015 at 14:13, Jorge Timón jti...@jtimon.cc wrote:

 2) It doesn't matter who is to blame for the current centralization:
 the fact remains that the blocksize maximum is the only** consensus
 rule to limit mining centralization.


 Repeating a claim ad nauseam doesn't make it necessarily true. A block size
 limit won't prevent miners in the future from buying each other out.

But reading it 10 times may help you understand the claim; you will
never find out until you try.
Miners buying each other out is not the only way in which mining
centralization can get even worse.
A blocksize limit may not be able to prevent such a scenario, but it's
still the only consensus tool to limit mining centralization.
If you want to prove that claim wrong you need to find a
counter-example: another consensus rule that somehow limits mining
centralization.
You could also prove that this rule doesn't help with mining
centralization at all. But that's much more difficult, and if you just
claim it (and consequently advocate for the complete removal of the
consensus rule) we will have already advanced a lot.
But denying that the limit serves to limit mining centralization
while at the same time advocating for keeping a limit at all doesn't
seem very consistent.
If you don't want that rule to limit mining centralization pressures,
what do you want it for?


Re: [bitcoin-dev] Block size following technological growth

2015-08-02 Thread Anthony Towns via bitcoin-dev
On Thu, Jul 30, 2015 at 12:20:30PM -0400, Gavin Andresen wrote:
 On Thu, Jul 30, 2015 at 10:25 AM, Pieter Wuille via bitcoin-dev
  Some things are not included yet, such as a testnet whose size runs ahead
  of the main chain, and the inclusion of Gavin's more accurate sigop
  checking after the hard fork.
 First, THANK YOU for making a concrete proposal!
 So we'd get to 2MB blocks in the year 2021. I think that is much too
 conservative, and the most likely effect of being that conservative is that
 the main blockchain becomes a settlement network, affordable only for
 large-value transactions.

I haven't seen anyone do the (trivial) maths on this. Have I just
missed it? By my count:

 - blocks bring in 25 btc in rewards and about 0.5 btc in fees per block
 - at ~$300 USD per btc, that's ~$7,650 per block

 - current hashrate is ~400 PH/s; so ~240,000 PH/block works out
   to having to spend about 30PH per dollar earnt.
 - for comparison,
  https://products.butterflylabs.com/cloud-mining-contracts.html
quotes $1.99 per GH/s for 12 months, which by my count is
  60*60*24*365 / 1.99 GH/$ = 15.8 PH per dollar spent
 - hashrate growth has slowed from about x4/quarter to x2/year:
 sep '13: ~1PH/s
 dec '13: ~4PH/s
 mar '14: ~20PH/s
 jun '14: ~80PH/s
 sep '14: ~200PH/s
 aug '15: ~400PH/s

 - so, as far as I understand it, miners don't make absurd profits compared
   to capital investment and running costs
 - presumably, then, miners will stop mining bitcoin if the revenue/block
   drops significantly at some point
 - less miners means a lower hashrate; a lower hashrate makes
   50% attacks easier, and that's a bad thing (especially if there's lots
   of pre-loved ASIC mining hardware available cheap on ebay or alibaba)

 - in about a year, the block reward halves, cutting out 12.5 btc or
   ~$3750 USD per block. without an increase in fees per block, miners
   will just get ~$3900 USD per block
 - the last time the reward for mining a block was under $4000 per block
   was around oct '13, with a hashrate of ~2PH/s

 - 13 btc in fees per block compared to .5 btc in fees per block is a
   25x increase; which could be either an increase in fee/txn or
   txns/block
 - with ~500 bytes/transaction, that's ~2000 transactions per MB
 - 13 btc in fees ($3900) per block means per transaction fees of
   about
 $2 for 1MB blocks
 $1 for 2MB blocks
 25c for 8MB blocks
 10c for 20MB blocks
   (assuming full blocks, 500 byte txns)

 - comparing that to credit card or paypal fees at ~2.5% that's:
$2 - minimum transaction  $80
$1 - minimum transaction  $40
25c - minimum transaction $10
10c - minimum transaction  $4

 - those numbers only depend on the USD/BTC exchange rate in so far as
   the more USD for a BTC, the more likely the block reward will pay
   for hashrate without transaction fees, even with the reward reduced
   to 12.5 btc/block. otherwise it's just USD/txn paying for USD/hashrate

 - the reference implementation fee of 0.1mBTC/kB equates to about
   3c per transaction (since it rounds up). Even 10c/transaction is more
   than a 3x increase on that.

What the above says to me is that even assuming everyone starts paying
fees, the lightning network works great, and so do sidechains and whatever
else, you /still/ want to up the volume of bitcoin txns by something like
an order of magnitude above what's currently allowed within a year or so.
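
The arithmetic above is easy to re-run. A minimal sketch using only the
figures quoted in this email (the exchange rate, fee total and
transaction size are all assumptions from the text):

USD_PER_BTC = 300.0
TXN_BYTES = 500
FEES_BTC = 13.0      # fees needed to keep ~$7,650/block after the halving
CARD_FEE = 0.025     # ~2.5% credit card / paypal fee

txns_per_mb = 1000000 // TXN_BYTES       # ~2000 txns/MB
fees_usd = FEES_BTC * USD_PER_BTC        # ~$3,900 in fees per block

for size_mb in (1, 2, 8, 20):
    fee_per_txn = fees_usd / (txns_per_mb * size_mb)
    min_card_txn = fee_per_txn / CARD_FEE
    print("%2d MB: ~$%.2f/txn, beats 2.5%% card fees above ~$%.0f"
          % (size_mb, fee_per_txn, min_card_txn))

This reproduces, to rounding, the $2/$1/25c/10c fees and the
$80/$40/$10/$4 break-even transaction sizes listed above.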

Cheers,
aj



Re: [bitcoin-dev] Block size following technological growth

2015-07-31 Thread Jorge Timón via bitcoin-dev
On Fri, Jul 31, 2015 at 2:15 PM, Mike Hearn he...@vinumeris.com wrote:
 Hey Jorge,

 He is not saying that. Whatever the reasons for centralization are, it
 is obvious that increasing the size won't help.


 It's not obvious. Quite possibly bigger blocks == more users == more nodes
 and more miners.

How can more users or more nodes bring more miners or, more
importantly, improve mining decentralization?

 To repeat: it's not obvious to me at all that everything wrong with Bitcoin
 can be solved by shrinking blocks. I don't think that's going to suddenly
 make everything magically more decentralised.

I don't think anybody is defending that position so you can stop refuting it.

 The 8mb cap isn't quite arbitrary. It was picked through negotiation with
 different stakeholders, in particular, Chinese miners. But it should be high
 enough to ensure organic growth is not constrained, which is good enough.

Well, negotiations don't make the number less arbitrary. As far as I
know, the sequence of events was this:

1) Gavin proposes 20MB to 20GB
2) Some Chinese miners say they can securely handle at most 8 MB.
3) Gavin proposes 8 MB to 8 GB

In any case, history is irrelevant for this point: if party 1 proposes
arbitrary value A and party 2 proposes arbitrary value B, any
compromise value between those 2 is still an arbitrary value. For
example, A + ((B-A)/2) is still an arbitrary value.

I'm sorry, but until there's a simulation that I can run with
different sizes' testchains (for example using #6382) to somehow
compare them, I will consider any value arbitrary. An "I ran this with
blocksize=X and nothing seems to have broken" doesn't help here.
We need to compare, and a criterion to compare with.

 I think it would be nice to have some sort of simulation to calculate
 a centralization heuristic for different possible blocksize values
 so we can compare these arbitrary numbers somehow.


 Centralization is not a single floating point value that is controlled by
 block size. It's a multi-faceted and complex problem. You cannot destroy
 Bitcoin through centralization by adjusting a single constant in the source
 code.

Agreed on the first sentence; I'm just saying that the influence of
the blocksize in that function is monotonic: with bigger sizes, equal
or worse mining centralization.
About the second sentence: yes, I could destroy Bitcoin by changing
one single constant if I could somehow force users to adopt my version
of the software. I'm sure I can actually find several examples if
necessary. Through centralization it is harder, but say we chose
std::numeric_limits<int64_t>::max() as the maximum block size (in
practice, entirely removing the block size limit); then the consensus
rules don't have any rule to limit mining centralization.
Sacrificing that tool, and doing so this early on could certainly turn
Bitcoin into an effectively centralized system, destroying Bitcoin (or
at least the p2p currency part of it, which is the most interesting
one for many Bitcoin users including me).
So, once it's clear that nobody is saying that centralization depends
ONLY on the block size, can you tell us please if you think it's
useful for limiting mining centralization or not?
If you think the blocksize consensus limit does nothing to limit
centralization then there's no tradeoff to consider and you should be
consequently advocating for full removal of the limit rather than
changes towards bigger arbitrary values.
Otherwise you may want to explain why you think 8 GB is enough of a
limit to impede further centralization.
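
For reference, the rule in question is tiny. A simplified sketch of the
two limits as they stood in 2015 (the real checks live in Bitcoin Core's
CheckBlock, in C++; this is just the logic, not the actual code):

MAX_BLOCK_SIZE = 1000000                  # bytes: the 1 MB consensus limit
MAX_BLOCK_SIGOPS = MAX_BLOCK_SIZE // 50   # the sigops consensus limit

def check_block_limits(serialized_size, sigops):
    # A block violating either limit is invalid by consensus,
    # no matter how much hashpower mined it.
    return serialized_size <= MAX_BLOCK_SIZE and sigops <= MAX_BLOCK_SIGOPS

Setting MAX_BLOCK_SIZE to 2**63 - 1 makes the first comparison vacuous,
which is exactly the point: the rule would no longer constrain anything.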

 To say once more: block size won't make much difference to how many
 merchants rely on payment processors because they aren't using them due to
 block processing overheads anyway. So trying to calculate such a formula
 won't work. Ditto for end users on phones, ditto for developers who want
 JSON/REST access to an indexed block chain, or hosted wallet services, or
 miners who want to reduce variance.

Sorry, I don't know about Pieter, but I was mostly talking about
mining centralization, certainly not about payment services.

 What people like you and Pieter are doing is making a single number a kind
 of proxy for all fears and concerns about the trend towards outsourcing in
 the Bitcoin community. Everything gets compressed down to one number you
 feel you can control, whether it is relevant or not.

No, that's not what we are doing.
It's good that you talk about your fears but, please, let other people
talk about theirs on their own.

  So why should anyone go through the massive hassle of setting up
  exchanges,
  without the lure of large future profits?

 Are you suggesting that bitcoin consensus rules should be designed to
 maximize the profits of Bitcoin exchanges?


 That isn't what I said at all Jorge. Let me try again.

 Setting up an exchange is a lot of risky and expensive work. The motivation
 is profit, and profits are higher when there are more users to sell to. This
 is business 101.

I can imagine 

Re: [bitcoin-dev] Block size following technological growth

2015-07-31 Thread Bryan Bishop via bitcoin-dev
On Fri, Jul 31, 2015 at 7:15 AM, Mike Hearn via bitcoin-dev 
bitcoin-dev@lists.linuxfoundation.org wrote:

 He is not saying that. Whatever the reasons for centralization are, it
 is obvious that increasing the size won't help.


 It's not obvious. Quite possibly bigger blocks == more users == more nodes
 and more miners.


Well, even in a centralized scheme you can have more users, more nodes and
more miners. Just having more does not mean that the system isn't
centralized; for example we can point to centralized services such as
PayPal that have hundreds of millions of users.


 To repeat: it's not obvious to me at all that everything wrong with
 Bitcoin can be solved by shrinking blocks. I don't think that's going to
 suddenly make everything magically more decentralised.


Nobody claimed that moving to smaller blocks would solve everything wrong
with Bitcoin.

You cannot destroy Bitcoin through centralization by adjusting a single
 constant in the source code.


Why not? That's exactly the sort of change that would be useful to do so,
in tandem with some forms of campaigning.


 The motivation is profit, and profits are higher when there are more users
 to sell to. This is business 101.


I am confused here; is that idea operating under an assumption (or rule)
like we shouldn't count aggregate transactions as representing multiple
other transactions from other users or something? I have seen this idea in
a few places, and it would be useful to get a fix on where it's coming
from. Does this belief extend to P2SH and multisig...?

- Bryan
http://heybryan.org/
1 512 203 0507


Re: [bitcoin-dev] Block size following technological growth

2015-07-31 Thread Mike Hearn via bitcoin-dev
Hey Jorge,

He is not saying that. Whatever the reasons for centralization are, it
 is obvious that increasing the size won't help.


It's not obvious. Quite possibly bigger blocks == more users == more nodes
and more miners.

To repeat: it's not obvious to me at all that everything wrong with Bitcoin
can be solved by shrinking blocks. I don't think that's going to suddenly
make everything magically more decentralised.

The 8mb cap isn't quite arbitrary. It was picked through negotiation with
different stakeholders, in particular, Chinese miners. But it should be
high enough to ensure organic growth is not constrained, which is good
enough.

I think it would be nice to have some sort of simulation to calculate
 a centralization heuristic for different possible blocksize values
 so we can compare these arbitrary numbers somehow.


Centralization is not a single floating point value that is controlled by
block size. It's a multi-faceted and complex problem. You cannot destroy
Bitcoin through centralization by adjusting a single constant in the
source code.

To say once more: block size won't make much difference to how many
merchants rely on payment processors because they aren't using them due to
block processing overheads anyway. So trying to calculate such a formula
won't work. Ditto for end users on phones, ditto for developers who want
JSON/REST access to an indexed block chain, or hosted wallet services, or
miners who want to reduce variance.

None of these factors have anything to do with traffic levels.

What people like you and Pieter are doing is making a single number a kind
of proxy for all fears and concerns about the trend towards outsourcing in
the Bitcoin community. Everything gets compressed down to one number you
feel you can control, whether it is relevant or not.

 So why should anyone go through the massive hassle of setting up
 exchanges, without the lure of large future profits?

 Are you suggesting that bitcoin consensus rules should be designed
 to maximize the profits of Bitcoin exchanges?


That isn't what I said at all Jorge. Let me try again.

Setting up an exchange is a lot of risky and expensive work. The motivation
is profit, and profits are higher when there are more users to sell to.
This is business 101.

If you remove the potential for future profit, you remove the motivation to
create the services that we now enjoy and take for granted. Because if you
think Bitcoin can be useful without exchanges then let me tell you, I was
around when there were none. Bitcoin was useless.


Re: [bitcoin-dev] Block size following technological growth

2015-07-31 Thread Venzen Khaosan via bitcoin-dev
On 07/31/2015 09:58 PM, Mike Hearn via bitcoin-dev wrote:
 How can more users or more nodes bring more miners or, more
 importantly, improve mining decentralization?
 
 
 Because the bigger the ecosystem is the more interest there is in
 taking part?

The logic just flows from one to the other, does it? So because Islam
is the biggest religious ecosystem in the world now, you and I are
just burning to take part?

By your logic, most people in Asia would hoard (or want to pay using)
Chinese Yuan, only they don't. The Yuan and the Yen are the dominant
currencies of large transaction settlement in the region, but try to
use them on the street and you're met with puzzled bemusement, Mike Hearn.

Bitcoin is not suitable as a currency for pervasive dominance, and
ideals of pushing it into every heart and mind around the globe is no
different from religious zealotry.

Bitcoin has its place and we're only at the beginning of a gradual
evolution. How can I say that? Because I'm looking across the rice
paddy to where my neighbors have not adopted the innovation of the
lightbulb, and burn candles for light and cook with gas. And they're
not an anomaly around here or in Asia, Africa and South America.


Re: [bitcoin-dev] Block size following technological growth

2015-07-31 Thread Mike Hearn via bitcoin-dev

 How can more users or more nodes bring more miners or, more importantly,
 improve mining decentralization?


Because the bigger the ecosystem is the more interest there is in taking
part?

I mean, I guess I don't know how to answer your question. When Bitcoin was
new it had almost no users and almost no miners. Now there are millions of
users and factories producing ASICs just for Bitcoin. Surely the
correlation is obvious?

I'm sorry, but until there's a simulation that I can run with different
 sizes' testchains (for example using #6382) to somehow compare them, I will
 consider any value arbitrary.


Gavin did run simulations. 20mb isn't arbitrary, the process behind it was
well documented here:

http://gavinandresen.ninja/does-more-transactions-necessarily-mean-more-centralized

*I chose 20MB as a reasonable block size to target because 170 gigabytes
per month comfortably fits into the typical 250-300 gigabytes per month
data cap– so you can run a full node from home on a “pretty good” broadband
plan.*
Did you think 20mb was picked randomly?
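
The figure is easy to sanity-check. A rough reconstruction (not
necessarily Gavin's exact accounting; it assumes every block is a full
20 MB and that a node re-uploads each block roughly once):

BLOCK_MB = 20
BLOCKS_PER_MONTH = 6 * 24 * 30   # one block every ~10 minutes

download_gb = BLOCK_MB * BLOCKS_PER_MONTH / 1000.0   # ~86 GB/month down
total_gb = 2 * download_gb                           # plus ~1x upload
print("~%.0f GB down, ~%.0f GB total per month" % (download_gb, total_gb))

That comes to roughly 86 GB down and 173 GB in total, in line with the
quoted 170 gigabytes per month against a 250-300 GB data cap.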


 Agreed on the first sentence, I'm just saying that the influence of
 the blocksize in that function is monotonic: with bigger sizes, equal
 or worse mining centralization.


I have a hard time agreeing with this because I've seen Bitcoin go from
blocks that were often empty to blocks that are often full, and in this
time the number of miners and hash power on the network has gone up a huge
amount too.

You can argue that a miner doesn't count if they pool mine. But if a miner
mines on a pool that uses exactly the same software and settings as the
miner would have done anyway, then it makes no difference. Miners can
switch between pools to find one that works the way they like, so whilst
less pooling or more decentralised pools would be nice (e.g.
getblocktemplate), and I've written about how to push it forward before, I
still say there are many more miners than in the past.

If I had to pick between two changes to improve mining decentralisation:

1) Lower block size
2) Finishing, documenting, and making the UX really slick for a
getblocktemplate based decentralised mining pool

then I'd pick (2) in a heartbeat. I think it'd be a lot more effective.


 you should be consequently advocating for full removal of the limit rather
 than changes towards bigger arbitrary values.


I did toy with that idea a while ago. Of course there cannot really be no
limit at all, because the code assumes blocks fit into RAM/swap, and nodes
would just end up ignoring blocks they couldn't download in time anyway.
There is obviously a physical limit somewhere.

But it is easier to find common ground with others by compromising. Is 8mb
better than no limit? I don't know and I don't care much:  I think Bitcoin
adoption is a slow, hard process and we'll be lucky to increase average
usage 8x over the next couple of years. So if 8mb+ is better for others,
that's OK by me.



 Sorry, I don't know about Pieter, but I was mostly talking about
 mining centralization, certainly not about payment services.


OK. I write these emails for other readers too :) In the past, for instance,
the topic of developers who run services without running their own nodes has
come up.

Re: exchange profit. You can pick some other useful service provider if you
like. Payment processors or cold storage providers or the TREZOR
manufacturers or whoever.

My point is you can't have a tiny high-value-transactions only currency AND
all the useful infrastructure that the Bitcoin community is making. It's a
contradiction. And without the infrastructure bitcoin ceases to be
interesting even to people who are willing to pay huge sums to use it.


Re: [bitcoin-dev] Block size following technological growth

2015-07-31 Thread Venzen Khaosan via bitcoin-dev
Mike Hearn, I might be a nobody to you, but you know I talk with
skill, so let me tell you this on a Friday...


On 07/31/2015 05:16 PM, Mike Hearn via bitcoin-dev wrote:
 I agree with Gavin
You would, of course.

 Bitcoin can support a large scale and it must, for all sorts of 
 reasons. Amongst others:
 
 1. Currencies have network effects. A currency that has few users 
 is simply not competitive with currencies that have many. There's 
 no such thing as a settlement currency for high value transactions 
 only, as evidenced by the ever-dropping importance of gold.

References are imperative if you want to appeal to intelligence on
this list. Studies. Empirical evidence. The above sounds like a
mainstream precis of how money works - an over-simplistic precis. It's
more complex than that. Besides, all we know for a fact is that
currencies come and go. That's not me being down on Bitcoin - that is
the historical record.

 
 2. A decentralised currency that the vast majority can't use 
 doesn't change the amount of centralisation in the world. Most 
 people will still end up using banks, with all the normal
 problems. You cannot solve a problem by creating a theoretically
 pure solution that's out of reach of ordinary people: just ask
 academic cryptographers!

Conjecture. And assumption. Banks might not accept most people
forever. Or banks' credibility might not survive the next credit
contraction, for example.

 
 3. Growth is a part of the social contract. It always has been.
 
Half the story. Casual observation shows that growth and contraction
alternate at every level of existence. Just because the present
fiat-era credit expansion has lasted 40 years does not mean that the
economy will keep expanding forever.

 The best quote Gregory can find to suggest Satoshi wanted small 
 blocks is a one sentence hypothetical example about what /might/
[snip]

yes, anyway, Greg Maxwell was justified in bringing you down a few
notches from your "I am Satoshi's rep on earth" history revision; you
were spinning there.


 4. All the plans for some kind of ultra-throttled Bitcoin network 
 used

[snip]

 The network of exchanges, payment processors and startups that are 
 paying people to build infrastructure are all based on _the 
 assumption_ that the market will grow significantly.
The assumption (my emphasis). You've seen that movie Pulp Fiction:
"assume makes an ass of U and me". Business + assumption =
gambling and potential loss of funds.

The ass-U-me cannot be laid at the doorstep of those developers who
prioritize security, decentralization and conservative, tested
progress of scaling.

 So why should anyone go through the massive hassle of setting up 
 exchanges, without the lure of large future profits?
 
Their SWOT analysis includes risks from the banking sector, too. Plus
competition from other exchanges. A sapling 0.x Bitcoin community is
not responsible for nannying anyone's lucrative business model at the
expense of security. How would that make the developers look in
relation to their duty of custody of the protocol? To this and future
generations: Not Good.
 
 5. Bitcoin needs users, lots of them, for its political survival. 
 There are many people out there who would like to see digital cash 
 disappear, or be regulated out of existence.
Nonsense, and again that assumption about how things work that you
like to get on your high horse about. Bitcoin's political survival is
guaranteed by its censorship resistance and its array of innovative
and unique utility functions. What's more, the caliber of developer
working on Bitcoin is not just pulled out of a hat and harnessed for
an arbitrary altcoin. Sometimes the fire of moral incentive overrides
monetary reward.

The Fed, Nasdaq, IBM, and every other company whose trusted authority
is being threatened by this flagship are developing blockchains in a
hurry. How is that "many people out there who would like to see
digital cash disappear"? Why would you even type such a blatant falsehood?

 If Bitcoin is a tiny, obscure currency used by drug dealers and a 
 handful of crypto-at-any-cost geeks, the cost of simply banning it 
 outright will seem trivial and the hammer will drop. There won't be
 a large scale payment network OR a high-value settlement network.
 And then the world is really screwed, because nobody will get a
 second chance for a very long time.
 
That is a low estimation of Bitcoin. Your framing does not honor
Bitcoin or the hard work - your _own_ hard work - on this project.

If you noticed, there has been an increase in technical discussion in
this list in recent days - with the goal of comparing and testing the
various blocksize proposals.

Mike Hearn, I am sorry to say that your pronouncements sound like jazz
- but jazz without rhythm.


So what? - Miles Davis

Re: [bitcoin-dev] Block size following technological growth

2015-07-31 Thread Jorge Timón via bitcoin-dev
On Fri, Jul 31, 2015 at 12:16 PM, Mike Hearn via bitcoin-dev
bitcoin-dev@lists.linuxfoundation.org wrote:
 Well, centralization of mining is already terrible. I see no reason why we
 should encourage making it worse.

 I see constant assertions that node count, mining centralisation, developers
 not using Bitcoin Core in their own businesses etc is all to do with block
 sizes. But nobody has shown that. Nobody has even laid the groundwork for
 that. Verifying blocks takes milliseconds and downloading them takes seconds
 everywhere except, apparently, China: this resource usage is trivial.

He is not saying that. Whatever the reasons for centralization are, it
is obvious that increasing the size won't help.
In the best case, it will only make it slightly worse. How big of a
"slightly worse" we are willing to risk in order to increase the size is
the open question.

 So I see no reason why arbitrarily capping the block size will move the
 needle on these metrics. Trying to arrest the growth of Bitcoin for everyone
 won't suddenly make Bitcoin-Qt a competitive wallet, or make service devs
 migrate away from chain.com, or make merchants stop using BitPay.

As far as I know, people just want to change an arbitrary number for
another arbitrary number.
But this arbitrary cap is a cap on centralization, not a tool to make
Bitcoin-Qt more important or to attack concrete Bitcoin companies, as
you seem to think.
If you don't think the blocksize cap helps limit centralization and
you think it would be fine to completely remove it, then it would be
better for the conversation if you said that directly instead of
supporting other arbitrary caps like 8GB (bip101).

I think it would be nice to have some sort of simulation to calculate
a centralization heuristic for different possible blocksize values
so we can compare these arbitrary numbers somehow. Even if the first
definition of this centralization heuristic is stupid, it would be
better than to keep rolling dice and heatedly defending one result over
another.
To reiterate: if you don't think the blocksize cap helps limit
centralization, please say so.
If we can't agree on what the limit is for, we will never be able to
agree on whether 1MB (current situation) or 8GB (bip101) is the most
appropriate value to have at a given point in time.

 We need to accept that, and all previous proposals I've seen don't seem to
 do that.

 I think that's a bit unfair: BIP 101 keeps a cap. Even with 8mb+growth
 you're right, some use cases will be priced out. I initiated the
 micropayment channels project (along with Matt, tip of the hat) specifically
 to optimise a certain class of transactions. Even with 8mb+ blocks, there
 will still be a need for micropayment channels, centralised exchange
 platforms and other forms of off chain transaction.

Lightning is nothing more than a better design for trustless payment
channels, but it's really good that you agree that if we want to scale
not everything can be broadcast in-chain.

 If Bitcoin needs to support a large scale, it already failed.

 It hasn't even been tried.

What he means is that if Bitcoin needs to support a scale that is only
feasible with high degrees of centralization (say, supporting 1 M tx/s
right now), then it has already failed in its decentralization goals.
In fact, with only a few miners, I'm not sure regulators will still
agree Bitcoin transactions are irreversible...
But you are right, we haven't yet tried to destroy Bitcoin by removing
the only consensus tool available to limit centralization.
I don't want to try, do you?

 A decentralised currency that the vast majority can't use doesn't change the
 amount of centralisation in the world. Most people will still end up using
 banks, with all the normal problems. You cannot solve a problem by creating
 a theoretically pure solution that's out of reach of ordinary people: just
 ask academic cryptographers!

Let's get to "most people use Bitcoin" first and then think about "many
people ONLY use Bitcoin" later, please.
I believe everybody here thinks that the more people are able to use
Bitcoin, the better.
But that doesn't

 All the plans for some kind of ultra-throttled Bitcoin network used for
 infrequent transactions neglect to ask where the infrastructure for that
 will come from. The network of exchanges, payment processors and startups
 that are paying people to build infrastructure are all based on the
 assumption that the market will grow significantly. It's a gamble at best
 because Bitcoin's success is not guaranteed, but if the block chain cannot
 grow it's a gamble that is guaranteed to be lost.

Risking the destruction of Bitcoin through centralization in order to
keep free transactions for longer is a very risky gamble.
Doing so explicitly against the will of some of the users by promoting
a schism hardfork, and thus risking economically destroying both Bitcoin
and Bitcoin_new_size (different currencies) in the process, is also a
very risky gamble.
So may want to give 

Re: [bitcoin-dev] Block size following technological growth

2015-07-31 Thread Mike Hearn via bitcoin-dev
I agree with Gavin - whilst it's great that a Blockstream employee has
finally made a realistic proposal (i.e. not "let's all use Lightning"),
this BIP is virtually the same as keeping the 1MB cap.

 Well, centralization of mining is already terrible. I see no reason why we
 should encourage making it worse.

Centralization of mining has been a continual gripe since Slush first
invented pooled mining. There has never been a time after that when people
weren't talking about the centralisation of mining, and back then blocks
were ~10kb.

I see constant assertions that node count, mining centralisation,
developers not using Bitcoin Core in their own businesses etc is all to do
with block sizes. But nobody has shown that. Nobody has even laid the
groundwork for that. Verifying blocks takes milliseconds and downloading
them takes seconds everywhere except, apparently, China: this resource
usage is trivial.
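
As a rough sanity check of those orders of magnitude (the link speeds
below are illustrative assumptions, not measurements of any real node):

    # Naive transfer time: size_mb megabytes over an mbps-megabit link.
    for size_mb in (1, 8, 20):
        for mbps in (2, 10, 100):
            seconds = size_mb * 8.0 / mbps
            print("%2d MB at %3d Mbps: %6.1f s" % (size_mb, mbps, seconds))

Even a 20 MB block clears a 10 Mbps link in around 16 seconds, which is
the scale of number both sides of this thread are arguing over.
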

Yet developers, miners and users even outside of China routinely delegate
validation to others. Often for quite understandable technical reasons that
have nothing to do with block sizes.

So I see no reason why arbitrarily capping the block size will move the
needle on these metrics. Trying to arrest the growth of Bitcoin for
everyone won't suddenly make Bitcoin-Qt a competitive wallet, or make
service devs migrate away from chain.com, or make merchants stop using
BitPay.

 We need to accept that, and all previous proposals I've seen don't seem to
 do that.

I think that's a bit unfair: BIP 101 keeps a cap. Even with 8mb+growth
you're right, some use cases will be priced out. I initiated the
micropayment channels project (along with Matt, tip of the hat)
specifically to optimise a certain class of transactions. Even with 8mb+
blocks, there will still be a need for micropayment channels, centralised
exchange platforms and other forms of off chain transaction.

 If Bitcoin needs to support a large scale, it already failed.

It hasn't even been tried.

The desperately sad thing about all of this is that there's going to be a
fork, and yet I think most of us agree on most things.  But we don't agree
on this.

Bitcoin can support a large scale and it must, for all sorts of reasons.
Amongst others:

   1. Currencies have network effects. A currency that has few users is
   simply not competitive with currencies that have many. There's no such
   thing as a settlement currency for high value transactions only, as
   evidenced by the ever-dropping importance of gold.


   2. A decentralised currency that the vast majority can't use doesn't
   change the amount of centralisation in the world. Most people will still
   end up using banks, with all the normal problems. You cannot solve a
   problem by creating a theoretically pure solution that's out of reach of
   ordinary people: just ask academic cryptographers!


   3. Growth is a part of the social contract. It always has been.

   The best quote Gregory can find to suggest Satoshi wanted small blocks
   is a one-sentence hypothetical example about what *might* happen if
   Bitcoin users became tyrannical as a result of non-financial transactions
   being stuffed in the block chain. That position makes sense because his
   scaling arguments assumed payment-network-sized traffic; throwing DNS
   systems or whatever into the mix could invalidate those arguments, in
   the absence of merged mining. But Satoshi did invent merged mining, and
   so there's no need for Bitcoin users to get tyrannical: his original
   arguments still hold.


   4. All the plans for some kind of ultra-throttled Bitcoin network used
   for infrequent transactions neglect to ask where the infrastructure for
   that will come from. The network of exchanges, payment processors and
   startups that are paying people to build infrastructure are all based on
   the assumption that the market will grow significantly. It's a gamble at
   best because Bitcoin's success is not guaranteed, but if the block chain
   cannot grow it's a gamble that is guaranteed to be lost.

   So why should anyone go through the massive hassle of setting up
   exchanges, without the lure of large future profits?


   5. Bitcoin needs users, lots of them, for its political survival. There
   are many people out there who would like to see digital cash disappear, or
   be regulated out of existence. They will argue for that in front of
   governments and courts - some already are. And if they're going to lose
   those arguments, the political and economic damage of getting rid of
   Bitcoin must be large enough to make people think twice. That means it
   needs supporters, it needs innovative services, it needs companies, and it
   needs legal users making legal payments: as many of them as possible.

   If Bitcoin is a tiny, obscure currency used by drug dealers and a
   handful of crypto-at-any-cost geeks, the cost of simply banning it outright
   will seem trivial and the hammer will drop. There won't be a large 

[bitcoin-dev] Block size following technological growth

2015-07-30 Thread Pieter Wuille via bitcoin-dev
Hello all,

here is a proposal for long-term scalability I've been working on:
https://gist.github.com/sipa/c65665fc360ca7a176a6
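
For readers skimming the digest, the gist defines a time-based growth
schedule; the sketch below shows only the general shape, with placeholder
parameters that are assumptions of this illustration - consult the gist
for the actual numbers:

    # All constants here are placeholders, not the proposal's values.
    START_TIME = 1483228800          # hypothetical activation timestamp
    START_SIZE = 1000000             # hypothetical 1 MB starting point
    GROWTH_PER_YEAR = 1.177          # assumed ~17.7%/year growth rate
    SECONDS_PER_YEAR = 365.25 * 24 * 3600

    def max_block_size(median_time_past):
        # Flat until activation, then growing exponentially with
        # (median) time rather than with block height.
        if median_time_past < START_TIME:
            return START_SIZE
        years = (median_time_past - START_TIME) / SECONDS_PER_YEAR
        return int(START_SIZE * GROWTH_PER_YEAR ** years)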

Some things are not included yet, such as a testnet whose size runs ahead
of the main chain, and the inclusion of Gavin's more accurate sigop
checking after the hard fork.

Comments?

-- 
Pieter


Re: [bitcoin-dev] Block size following technological growth

2015-07-30 Thread Jorge Timón via bitcoin-dev
1) Unlike previous blocksize hardfork proposals, this uses median time
instead of block.nTime for activation. I like that more, but my
preference is still to use height for everything (the alternatives are
sketched below). That discussion is not specific to this proposal,
though, so it's better if we discuss it for all of them here:
http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-July/009731.html
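
A sketch of the three activation triggers being compared - every
constant here is hypothetical, not taken from any actual BIP:

    ACTIVATION_HEIGHT = 400000      # hypothetical block height
    ACTIVATION_TIME = 1452000000    # hypothetical unix timestamp

    def median_time_past(last_11_ntimes):
        # Median of the previous 11 block timestamps; unlike raw
        # nTime, it never decreases as the chain grows.
        return sorted(last_11_ntimes)[len(last_11_ntimes) // 2]

    def active_by_height(height):
        return height >= ACTIVATION_HEIGHT

    def active_by_ntime(block_ntime):
        # nTime is miner-chosen and may jump backwards between blocks.
        return block_ntime >= ACTIVATION_TIME

    def active_by_median_time(last_11_ntimes):
        return median_time_past(last_11_ntimes) >= ACTIVATION_TIME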

2) I think uncontroversial hardforks should also take miner
confirmation into account, just like uncontroversial softforks do. We
cannot make sure other users have upgraded before activating the
change, but we can know whether miners have upgraded or not. Having
that tool available, why not use it (see the sketch below)? Of course,
other hardforks may not care about miners' upgrade state, for example
anti-miner hardforks; see
https://github.com/jtimon/bips/blob/bip-forks/bip-forks.org#asic-reset-hardfork
But again, this is common to all uncontroversial hardforks, so it
would probably be better to discuss it in
http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-June/008936.html
(gmaxwell assigned bip99 to my bip draft).
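
A minimal sketch of such a miner-confirmation check, mirroring the
75%/95% rollout rules softforks have used; the version number and
thresholds below are assumptions for illustration:

    NEW_VERSION = 4    # hypothetical post-fork block version
    WINDOW = 1000      # look-back window, as in softfork deployments
    THRESHOLD = 950    # 95% of the window

    def miners_confirmed(last_versions):
        # Count recent blocks advertising the upgraded version.
        assert len(last_versions) == WINDOW
        upgraded = sum(1 for v in last_versions if v >= NEW_VERSION)
        return upgraded >= THRESHOLD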

3) As commented to you privately, I don't like to make any assumptions
about technological advancements (much less about economic growth). I
don't expect many people to agree with me here (I guess I've seen too
many "peak oil" [or, more generally, peak energy production]
predictions, plus I've read Nietzsche's "On the utility and liability
of history for life" [1]; so considering morals, technology or
economics as monotonic functions of history is simply a ridiculous
notion to me), but it's undeniable that internet connections have
improved overall around the world in the last 6 years. I think we
should wait for the technological improvements to happen and then adapt
the blocksize accordingly. I know that's not a definitive solution: we
will need to change it from time to time, and this is somewhat ugly.
But even if I'm the only one who considers technological de-growth
possible, I don't think it is wise to rely on pseudo-laws like
Moore's or Nielsen's so-called laws.
Stealing a quote from another thread:

Prediction is difficult, especially about the future. - Niels Bohr

So I would prefer a more limited solution like bip102, even though I
would prefer to have some simulations leading to a concrete value (even
if it's bigger) rather than using 2MB's arbitrary number.

Those are my 3 cents.

[1] https://philohist.files.wordpress.com/2008/01/nietzsche-uses-history.pdf

On Thu, Jul 30, 2015 at 4:25 PM, Pieter Wuille via bitcoin-dev
bitcoin-dev@lists.linuxfoundation.org wrote:
 Hello all,

 here is a proposal for long-term scalability I've been working on:
 https://gist.github.com/sipa/c65665fc360ca7a176a6

 Some things are not included yet, such as a testnet whose size runs ahead of
 the main chain, and the inclusion of Gavin's more accurate sigop checking
 after the hard fork.

 Comments?

 --
 Pieter




Re: [bitcoin-dev] Block size following technological growth

2015-07-30 Thread Jameson Lopp via bitcoin-dev
I fully expect that new layers will someday allow us to facilitate higher
transaction volumes, though I'm concerned about the current state of the
network and the fact that there are no concrete timelines for the rollout
of the aforementioned high-volume networks.

As for reasoning behind why users will still need to settle on-chain even
with the existence of high volume networks, see these posts:

http://sourceforge.net/p/bitcoin/mailman/message/34119233/

http://sourceforge.net/p/bitcoin/mailman/message/34113067/

Point being, the scalability proposals that are currently being developed
are not magic bullets and still require the occasional on-chain settlement.
Larger blocks will be necessary with or without the actual scalability
enhancements.

- Jameson

On Thu, Jul 30, 2015 at 12:36 PM, Bryan Bishop kanz...@gmail.com wrote:

 On Thu, Jul 30, 2015 at 11:23 AM, Jameson Lopp via bitcoin-dev 
 bitcoin-dev@lists.linuxfoundation.org wrote:

 Stated differently, if the cost or contention of using the network rises
 to the point of excluding the average user from making transactions, then
 they probably aren't going to care that they can run a node at trivial cost.


 That's an interesting claim; so suppose you're living in a future where
 transactions are summarizing millions or billions of other daily
 transactions, possibly with merkle hashes. You think that because a user
 can't individually broadcast his own personal transaction, that the user
 would not be interested in verifying the presence of a summarizing
 transaction in the blockchain? I'm just curious if you could elaborate on
 this effect. Why would I need to see my individual transactions on the
 network, but not see aggregate transactions that include my own?

 - Bryan
 http://heybryan.org/
 1 512 203 0507
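
A minimal sketch of the verification Bryan is describing: a user who
holds only their own transaction and a merkle branch can check that a
summarizing transaction commits to it. (Single SHA-256 here for
brevity; Bitcoin itself uses double SHA-256, and the pair encoding of
the proof is an assumption of this illustration.)

    import hashlib

    def sha(x):
        return hashlib.sha256(x).digest()

    def verify_inclusion(leaf, proof, root):
        # proof is a list of (sibling_hash, sibling_is_left) pairs
        # walking from the leaf up to the root.
        node = sha(leaf)
        for sibling, sibling_is_left in proof:
            node = sha(sibling + node) if sibling_is_left else sha(node + sibling)
        return node == root

The proof is logarithmic in the number of leaves, so verifying one
payment inside a transaction summarizing millions remains cheap.
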



Re: [bitcoin-dev] Block size following technological growth

2015-07-30 Thread Pieter Wuille via bitcoin-dev
On Jul 30, 2015 6:20 PM, Gavin Andresen gavinandre...@gmail.com wrote:

 On Thu, Jul 30, 2015 at 10:25 AM, Pieter Wuille via bitcoin-dev 
bitcoin-dev@lists.linuxfoundation.org wrote:

 Some things are not included yet, such as a testnet whose size runs
ahead of the main chain, and the inclusion of Gavin's more accurate sigop
checking after the hard fork.

 Comments?


 First, THANK YOU for making a concrete proposal!

You're welcome.

 Specific comments:

 So we'd get to 2MB blocks in the year 2021. I think that is much too
conservative, and the most likely effect of being that conservative is that
the main blockchain becomes a settlement network, affordable only for
large-value transactions.

If there is demand for high-value settlements in Bitcoin, and this results
in a functioning economy where fees support the security of a transparent
network, I think that would be a much better outcome than what we have now.

 I don't think your proposal strikes the right balance between
centralization of payments (a future where only people running payment
hubs, big merchants, exchanges, and wallet providers settle on the
blockchain) and centralization of mining.

Well, centralization of mining is already terrible. I see no reason why we
should encourage making it worse. On the other hand, sure, not every
transaction fits on the blockchain. That is already the case, and will
remain the case with 2 MB or 8 MB or 100 MB blocks. Some use cases fit, and
others won't. We need to accept that, and all previous proposals I've seen
don't seem to do that.

Maybe the only ones that will fit are large settlements between layer-2
services, and maybe it will be something entirely different. But at least
we don't need to compromise the security of the main layer, or promise the
ecosystem unrealistic growth of space for on-chain transactions.

If Bitcoin needs to support a large scale, it already failed.

 I'll comment on using median time generally in Jorge's thread, but why
does monotonically increasing matter for max block size? I can't think of a
reason why a max block size of X bytes in block N followed by a max size of
X-something bytes in block N+1 would cause any problems.

I don't think it matters much, but it offers a slightly easier transition
for the mempool (something Jorge convinced me of), and validation can
benefit from a single variable that can be set in a chain to indicate a
switchover happened, without needing to recalculate it every time.
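
A contrived illustration of the difference - the switch time, sizes and
timestamps below are all made up:

    SWITCH_TIME = 1500000000

    def size_from_ntime(t):
        return 1000000 if t < SWITCH_TIME else 2000000

    # Consecutive nTime values may go backwards (they only need to beat
    # the median of the past 11 blocks), so this sequence is valid:
    ntimes = [1499999990, 1500000100, 1499999999]
    print([size_from_ntime(t) for t in ntimes])
    # -> [1000000, 2000000, 1000000]: the limit flips back down.

The median of the past 11 timestamps can never decrease, so a schedule
keyed on it switches over exactly once.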

I assume you're asking this because using raw nTime means you can check
what limits a p2p message should obey by looking at just its first
bytes. I don't think this matters: your p2p protocol should deal with
whatever limits the system requires anyway. An attacker can take a valid
message from far back in the chain, change a few bytes, and spam you
with it at zero extra effort anyway.

-- 
Pieter