Only headers need to be downloaded sequentially, so downloading just the relevant blocks from one node, with gaps in between, is entirely possible.
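The point about gaps can be sketched: once the header chain is validated, any individual block can be checked in isolation against the merkle root committed in its header, so blocks need not arrive in order or without holes. A minimal Python sketch (the helper names are my own, not from any existing client):

```python
import hashlib

def dsha256(data: bytes) -> bytes:
    """Bitcoin's double-SHA256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(txids: list[bytes]) -> bytes:
    """Merkle root over txids (internal byte order); when a level has an
    odd number of entries the last one is duplicated, as Bitcoin does."""
    assert txids
    level = list(txids)
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [dsha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

def block_matches_header(txids: list[bytes], header_root: bytes) -> bool:
    """A block fetched out of order is verified in isolation: recompute
    the merkle root from its transactions and compare it to the root in
    the already-validated header. No neighbouring blocks are needed."""
    return merkle_root(txids) == header_root
```

Since the check only needs the one header, a client can fetch block 100, skip to block 5,000, and verify each independently.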
On 2/27/21 4:10 AM, Igor Cota via bitcoin-dev wrote:
> Hi Keagan,
>
> I had a very similar idea. The only difference being for the node to decide on
> a range of b
On 8/6/19 10:27 PM, Chris Belcher via bitcoin-dev wrote:
> I think this is absolutely wrong, because sybil attackers give up some
> fee income. Here is a worked example:
>
> Let's say the sybil attacker is operating the top 5 most valuable maker
> bots. If this attacker has X coins they would split
On 8/6/19 7:04 AM, Chris Belcher via bitcoin-dev wrote:
> However, there _is_ a cost to being a sybil attacker. If we define
> honest makers as entities who run just one maker bot, and dishonest
> makers as entities who run multiple maker bots, then we can say that
> running a dishonest maker opera
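The cost Belcher describes can be made concrete under the fidelity-bond valuation proposed in that thread; I am assuming here the quadratic rule (bond value grows with the square of the coins locked), which directly penalizes splitting a stake across sybil bots:

```python
def bond_value(locked: float) -> float:
    # Assumed quadratic valuation: doubling the locked coins
    # quadruples the bond value.
    return locked ** 2

def sybil_value(total: float, n_bots: int) -> float:
    """Total bond value when `total` coins are split evenly across
    `n_bots` sybil maker bots: n * (total/n)^2 = total^2 / n."""
    return n_bots * bond_value(total / n_bots)
```

Under that assumption, an attacker splitting X coins across the top 5 maker slots holds only X^2/5 in total bond value, a fifth of what a single honest maker with the same coins would have.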
On 01/04/2017 12:06 AM, adiabat via bitcoin-dev wrote:
> Also, if you're running a light client, and storing the filters the way you
> store block headers, there's really no reason to go all the way back to height
> 0. You can start grabbing headers at some point a while ago, before your set
> of
gmaxwell just made me aware of this mail thread [0]. A few days ago I had independently and naively started implementing "something similar" [1]. My version completely ignored the commitment and signing part, but I'm pretty sure that 12GB is overkill. My code is currently broken and I have no time to w
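A checkpointed start like the one adiabat describes, keeping headers (or filter headers) only from some recent height rather than from genesis, can be sketched as follows; the checkpoint values here are placeholders, not a real block:

```python
from dataclasses import dataclass

@dataclass
class Header:
    hash: bytes       # double-SHA256 of the 80-byte serialized header
    prev_hash: bytes  # field pointing at the previous block's header

# Hypothetical checkpoint the client ships with instead of syncing
# from height 0 (placeholder values).
CHECKPOINT_HEIGHT = 700_000
CHECKPOINT_HASH = b"\x11" * 32

def connect_from_checkpoint(headers: list[Header]) -> int:
    """Verify that `headers` forms an unbroken chain on top of the
    checkpoint and return the resulting tip height."""
    tip, height = CHECKPOINT_HASH, CHECKPOINT_HEIGHT
    for h in headers:
        if h.prev_hash != tip:
            raise ValueError(f"header at height {height + 1} does not connect")
        tip, height = h.hash, height + 1
    return height
```

The same linking check works for BIP 157-style filter headers, which chain each filter hash to the previous filter header in the same way.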
On 08/10/2015 05:39 AM, Thomas Zander via bitcoin-dev wrote:
> On Monday 10. August 2015 07.57.30 Rune K. Svendsen via bitcoin-dev wrote:
>> What Lightning does is raise the value of a transaction on the block chain.
>> Imagine you're a Lightning node, and in order to collect your fees, that
>> you
Thank you a lot for doing this test!
Two questions:
1) A node is typically connected to many nodes that would all download said block in parallel. In your test, did you measure how fast new blocks are uploaded while they are presumably also being uploaded in parallel to all those other nodes? Or did you d