Bitcoin block size debate exposes the problem of requiring too much consensus

The debate over whether or not to increase bitcoin’s block size exposes how difficult it is to make changes to p2p protocols like bitcoin. Personally, I think the block size needs to be increased as recommended and pushed by Gavin Andresen. So it boggles my mind how difficult it is to reach consensus on what seems like a no-brainer decision.

Regardless of my stance on the matter, it shows how reaching consensus becomes harder and harder as a p2p protocol like bitcoin grows bigger and more established. Safecoin will avoid this particular problem because it has no blockchain and uses a very different protocol/solution.

But as safecoin and the SAFE network get bigger and bigger, there will no doubt be changes that need to be made to the protocol as it matures. So how will future changes be implemented if consensus cannot be reached? Will it take accidents and bugs to force a change to the SAFE protocol? Will SAFE forks be required so that competition decides which version to use?


I don’t see how this could be done in practice, unless the free capacity of the network is well above twice its used capacity.

I think I made two pretty good posts in the past as to why increasing the block size is a problem:

The biggest problem with increasing the block size is that it further decreases the number of people who can actually run a full node, and if fewer and fewer people can run a full node, bitcoin is no longer decentralized, because only a few people with very large servers will be able to run it. Don’t believe me? Check out this chart showing the number of full bitcoin nodes over the past year. The number of full nodes has decreased from 7500 to only 6000. There is even an incentive program running to help get people running full bitcoin nodes! Sadly, even that hasn’t made the number of full nodes increase. I set up a bitcoin full node on my own computer, and it took a full 24 hours to download (and process)… last I checked the blockchain was 40 GiB, which is a lot to download.

tl;dr: The blockchain will grow to over 1 TB if they increase the block size limit; yes, decentralized applications are hard to update; and the number of full bitcoin nodes is decreasing.
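
Just to put some rough numbers on that 1 TB figure (my own back-of-envelope math, not from any official source): with a block roughly every ten minutes, annual chain growth scales linearly with the block size limit, so the 20 MB limit that was being floated at the time works out to about a terabyte of new chain data per year if blocks are full.

```rust
// Back-of-envelope estimate of annual blockchain growth for a given block
// size limit, assuming ~144 blocks per day (one every ~10 minutes) and that
// blocks are filled to the limit. Illustrative only.
fn annual_growth_gb(block_size_mb: f64) -> f64 {
    let blocks_per_year = 144.0 * 365.0; // ~52,560 blocks
    block_size_mb * blocks_per_year / 1024.0 // MB -> GB
}

fn main() {
    for &size in &[1.0, 8.0, 20.0] {
        println!(
            "{:>4} MB blocks -> ~{:.0} GB of new chain data per year",
            size,
            annual_growth_gb(size)
        );
    }
}
```

With 20 MB blocks that comes to roughly 1000 GB of new data per year on top of the existing chain, which is where the “over 1 TB” worry comes from.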


The OP only used that as an example prompt for the more interesting question of how SAFE evolves over time; I don’t know if it is obvious yet whether that will be via authority or consensus or [insert 3rd option here].


I’m not sure how useful this is here, but any SAFE pod or the Maidsafe foundation could push an update as long as it ranks better than previous versions. David Irvine has described an update scheme where a sacrificial vault is pushed to the network and, upon achieving better than current rank, is accepted and can then be adopted by all other nodes (based on users’ approval, I’m assuming), while still remaining part of the same network, I’d also assume. I’m not sure if anything about this has changed, but it seems intuitive to me, and I imagine it still stands given that the network still ranks behaviour.
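
Here’s a minimal sketch of how I picture that rank-gated adoption working (the names, rank values, and comparison rule are all my own invention, not MaidSafe code): the sacrificial version only replaces the running one once its measured rank beats it.

```rust
// Hypothetical sketch of rank-gated updates: a new vault version runs as a
// "sacrificial" candidate and is only adopted if it demonstrates a better
// rank than the version the node currently runs. Names and values invented.
#[derive(Debug, Clone)]
struct VaultVersion {
    version: String,
    rank: f64, // behaviour score as measured by the network
}

fn should_adopt(current: &VaultVersion, candidate: &VaultVersion) -> bool {
    // Only switch when the candidate has demonstrably better rank.
    candidate.rank > current.rank
}

fn main() {
    let current = VaultVersion { version: "0.4.1".into(), rank: 0.82 };
    let candidate = VaultVersion { version: "0.5.0".into(), rank: 0.91 };

    if should_adopt(&current, &candidate) {
        println!(
            "Adopting {} (rank {:.2} > {:.2})",
            candidate.version, candidate.rank, current.rank
        );
    } else {
        println!("Keeping {} for now", current.version);
    }
}
```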


Automated quality assurance of updates through vault ranking is very cool, but ranking doesn’t account for all features. For example, a change in the algorithm that determines upload costs has no influence on rank, but could be very contested.


That’s probably why they want to get it right the first time and run the test nets before the official safecoin launch :smile: Thinking about that more, would that mean such an algorithm could be updated at any time, or only packaged with an improvement that is affected by rank? And could these algorithms be updated maliciously, or simply not updated at all, hence the pressure to get the farming rate right the first time? I would assume the latter myself.

Problem is that test nets are never the same as reality, plus reality constantly evolves.

What I’m mostly worried about is an update that is adopted by half the nodes in the network, and that one of those halves starts to disconnect and de-rank the others for following different rules. Or just as annoying, close groups not achieving the consensus quorum and thus not taking action.
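
To make that second worry concrete, here’s a toy illustration (assuming the 28-of-32 group quorum figure that comes up later in this thread; the splits themselves are made up): once a close group is divided anywhere near evenly between two rule sets, neither side can ever muster quorum, so the group simply stops acting.

```rust
// Toy illustration of a close group failing to reach quorum when its members
// are split between two protocol versions. Group size and quorum follow the
// 28-of-32 figure mentioned in this thread; the splits are hypothetical.
const GROUP_SIZE: usize = 32;
const QUORUM: usize = 28;

fn quorum_reached(votes_for_a: usize) -> bool {
    let votes_for_b = GROUP_SIZE - votes_for_a;
    votes_for_a >= QUORUM || votes_for_b >= QUORUM
}

fn main() {
    for split in [32, 30, 28, 20, 16] {
        println!(
            "{} nodes on version A, {} on version B -> quorum reached: {}",
            split,
            GROUP_SIZE - split,
            quorum_reached(split)
        );
    }
}
```

Anything worse than a 28/4 split leaves the group deadlocked until enough nodes switch sides.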


Excellent thoughts to ponder @Seneca, and I see your point about reality always changing. It is a closed ecosystem though, and the number of nodes and the resources provided should be the determining factors in safecoin price, so I think there’s a good chance that, if set up correctly, a nice cycle should persist. But a split in versions, that’s interesting. I believe I remember reading that with all the new types, versions can be rolled back. Is that only applicable to data stored, or to actual network versions as well? Because if so, then I imagine in such a scenario a rollback in network version could be put in place, or perhaps a forced update. Perhaps even a democratic vote by the users could choose which version.

Just to note, this again is an argument for why the base needs to be as simple as possible. Everything else can be applications.


Just some thoughts…

Different core rules of the network could differ in their consensus mechanism. Some need to be very strict, definitely requiring the 28-of-32 consensus quorum; others could perhaps allow a compromise in case of disagreement. To continue with the example of upload costs, it could be the mean, the median, or the mean of the second and third quartiles to eliminate outliers. That way there is never a consensus crisis just because of a slight disagreement over upload costs.
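
A quick sketch of what that softer rule could look like (values invented for illustration; this isn’t how the network currently computes anything): rather than requiring 28 of 32 nodes to propose the identical cost, the group takes the mean of the middle two quartiles of everyone’s proposals, so outliers are ignored and a mild disagreement never blocks a decision.

```rust
// Sketch of a "compromise" rule for a contested parameter like upload cost:
// take the interquartile mean (mean of the middle 50%) of the group's
// proposals, so outliers are discarded and small disagreements never stall
// the group. Proposal values are invented for illustration.
fn interquartile_mean(proposals: &[u64]) -> f64 {
    let mut sorted = proposals.to_vec();
    sorted.sort_unstable();
    let n = sorted.len();
    // Keep only the second and third quartiles (the middle half).
    let lo = n / 4;
    let hi = n - n / 4;
    let middle = &sorted[lo..hi];
    middle.iter().sum::<u64>() as f64 / middle.len() as f64
}

fn main() {
    // 32 nodes propose slightly different upload costs; two are wild outliers.
    let mut proposals: Vec<u64> = (0..30).map(|i| 100 + i % 5).collect();
    proposals.push(1);      // outlier: one node wants nearly-free uploads
    proposals.push(10_000); // outlier: one node wants absurdly expensive uploads
    println!("agreed upload cost: {:.1}", interquartile_mean(&proposals));
}
```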

Another thought:

Maybe we can figure out a way to allow the network to agree to disagree on certain matters, not creating a full split of the network in case of disagreement, but only on parts of it. For an incomplete and possibly impossible example, if there’s disagreement in SafeCoin rewards, the two networks could split up in terms of consensus groups and SafeCoin tag_type, yet still share the same data managers and vaults. This way neither network suffers a capacity crisis and over time one network will likely obliterate the other, effectively invalidating that network’s SafeCoins. The important thing is that GET/PUT/POST commands are still routed and handled among the two networks.
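
Very rough sketch of what I mean, and it’s purely hypothetical (the request shapes and tag values are invented): coin operations get routed to whichever rule set owns their tag_type, while ordinary data requests keep flowing through the vaults both sides still share.

```rust
// Purely hypothetical sketch of the "agree to disagree" idea: SafeCoin
// operations are routed to one of two rival rule sets based on the coin's
// tag_type, while plain data requests keep using the vaults both sides share.
// All names and tag values are invented for illustration.
#[derive(Debug)]
enum Request {
    GetData { name: String },
    PutData { name: String },
    CoinTransfer { tag_type: u64 },
}

fn handle(req: Request) {
    match req {
        // Data operations are unaffected by the disagreement: shared vaults.
        Request::GetData { name } | Request::PutData { name } => {
            println!("routing data request for '{}' through the shared vaults", name)
        }
        // Coin operations go to whichever rule set claims that tag_type.
        Request::CoinTransfer { tag_type } => match tag_type {
            1_000 => println!("coin transfer handled by rule set A's consensus groups"),
            1_001 => println!("coin transfer handled by rule set B's consensus groups"),
            other => println!("unknown coin tag_type {}, rejecting", other),
        },
    }
}

fn main() {
    handle(Request::PutData { name: "holiday-photos".into() });
    handle(Request::CoinTransfer { tag_type: 1_000 });
    handle(Request::CoinTransfer { tag_type: 1_001 });
}
```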


I think it would be safe to say that as long as David Irvine is leading the SAFE team, changes to the protocol could be agreed to fairly quickly and decisively. But heaven forbid there is a situation where David is not around as founder and benevolent leader; we could be in the same situation as bitcoin. I’m sure if Satoshi popped up and gave his recommendation on how to resolve the block size debate, it would be followed with no questions asked.

In the alt-coin universe, there’s a big assumption for the majority of coins that the founder and original dev team will always be around to implement changes and updates. If we expect the SAFE network to exist for the next few decades, we can’t expect this assumption to hold; all I have to do is look at how the Bitcoin Foundation is functioning at the moment. So how will changes be made? In an ideal world, it would be nice if all future issues could be thought out and resolved before the initial rollout. As a software engineer, I know this is next to impossible for any software of major significance. Events and the environment change, as mentioned in the above posts.

Other bitcoin 2.0 initiatives like Bitshares, NXT, etc. have a voting mechanism to handle the above scenario. Some of the above suggestions are a good starting point. I think it’s vitally important that a change management process be established before large SAFE adoption.

On a side note, I’m curious how bitcoin sidechains will be received in the future, if the block size debate is any indication. It’s an interesting approach to future-proofing bitcoin without tainting the main protocol code, but it will require a fork. I hope Blockstream has thought through how to convince the mining pools to accept the change and how to ensure this will be the fork to end all forks.
