Many in the Bitcoin community want a distributed network, rather than a decentralized one. The scalability crisis, also known as “Bitcoin’s civil war,” stems from this philosophical difference.
Now that SegWit has locked in, Bitcoin’s three-year civil war appears to be nearing an end. While we aren’t completely out of the woods yet – questions remain about the pending blocksize increase this fall and about the long-term viability of Bitcoin Cash – the conflict does appear to be nearly resolved.
The next step is SegWit activation, which should come in about two weeks, when the Bitcoin mining difficulty next retargets. Once SegWit is activated, more transactions will fit into each 1 MB block, and congestion on the network will drop. Further enhancements such as the lightning network will then be able to move forward, perhaps revolutionizing the digital currency.
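For context, the “about two weeks” figure follows from the protocol’s retarget schedule: difficulty adjusts every 2,016 blocks, at a target rate of one block per ten minutes. A quick sketch of the arithmetic (actual timing varies with how quickly blocks are found):

```python
# Why "about two weeks": difficulty retargets every 2,016 blocks,
# and blocks are mined roughly every 10 minutes.
RETARGET_INTERVAL = 2016   # blocks between difficulty adjustments
TARGET_BLOCK_TIME = 10     # minutes per block (protocol target)

days_between_retargets = RETARGET_INTERVAL * TARGET_BLOCK_TIME / (60 * 24)
print(f"~{days_between_retargets:.0f} days between retargets")  # ~14 days
```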
Back to the beginning
From what is hopefully the end, let’s look back to the beginning. It’s easy to get caught up in the politics and espouse your side’s point of view, but it’s much harder to look back objectively and seek out the actual causes of the infamous “blocksize debate.”
By early 2014, it was obvious that Bitcoin’s code would have to change in order to allow the network to scale. The network could only process about four transactions per second, and interest in Bitcoin was growing. While there was not yet a transaction backlog, it was clear that unless action was taken, there eventually would be.
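As a rough sanity check on that “four transactions per second” figure, here is a back-of-the-envelope calculation. The 400-byte average transaction size is an assumption for illustration; real averages fluctuate, so published estimates vary.

```python
# Back-of-the-envelope pre-SegWit throughput estimate.
# Assumption: ~400 bytes per average transaction (illustrative only).
BLOCK_SIZE_LIMIT = 1_000_000  # bytes, the 1 MB blocksize limit
AVG_TX_SIZE = 400             # bytes, assumed average transaction size
BLOCK_INTERVAL = 600          # seconds, target time between blocks

txs_per_block = BLOCK_SIZE_LIMIT / AVG_TX_SIZE    # ~2,500 transactions
txs_per_second = txs_per_block / BLOCK_INTERVAL   # ~4.2 tx/s

print(f"~{txs_per_block:.0f} transactions per block, ~{txs_per_second:.1f} tx/s")
```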
Big blockers vs. Core
Roughly speaking, the community divided into two camps: those who supported larger blocks, which could in turn hold more transactions, and those who favored a more technically nuanced solution. In the end, these sides came to be known as the “big blockers” and the “SegWit” or “Core” supporters.
Satoshi’s vision
When Satoshi originally created Bitcoin, he foresaw a network in which everybody would run a node. Each node holds the entire Blockchain (and serves it to peers), ensuring that the Blockchain is immutable and cannot be controlled by a single entity. At the time, the Blockchain was very small, and running a node was nearly free.
Over time, as Bitcoin’s popularity grew, so too did the size of the Blockchain. As the Bitcoin Blockchain grew, running a full node required a computer with a large hard drive and plenty of internet bandwidth. As both storage space and bandwidth are expensive, a number of volunteer nodes began dropping off the network.
I like big blocks…
When scalability challenges arose in 2014, the big blockers’ solution was simple: increase the blocksize. If each block can hold more data, then the network can process more transactions per second. The downside is that the bigger the blocks, the bigger the Blockchain.
A bigger Blockchain leads to fewer nodes (since it’s more expensive to store and serve that amount of data), and fewer nodes lead to less decentralization. The big blockers insisted that this was fine, because enough volunteers would still run nodes for Bitcoin to remain decentralized.
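To make that trade-off concrete, the earlier arithmetic can be extended to larger blocks. The block sizes and the 400-byte average transaction size below are illustrative assumptions, and the growth figures assume every block is full:

```python
# Illustrative trade-off: bigger blocks raise throughput but also speed up
# blockchain growth. All figures are rough assumptions, not measurements.
AVG_TX_SIZE = 400               # bytes, assumed average transaction size
BLOCK_INTERVAL = 600            # seconds between blocks (protocol target)
BLOCKS_PER_YEAR = 6 * 24 * 365  # ~52,560 blocks at one per 10 minutes

for block_mb in (1, 2, 8):
    block_bytes = block_mb * 1_000_000
    tps = block_bytes / AVG_TX_SIZE / BLOCK_INTERVAL
    growth_gb = block_bytes * BLOCKS_PER_YEAR / 1e9  # if every block is full
    print(f"{block_mb} MB blocks: ~{tps:.1f} tx/s, "
          f"~{growth_gb:.0f} GB of chain growth per year")
```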
Developers’ opposition
Most of Bitcoin’s Core developers lined up behind a different solution: Segregated Witness. This solution splits the signature (“witness”) data from the rest of the transaction data. Because the segregated signatures are discounted when a block’s size is measured, more transactions can fit into normal-sized (1 MB) blocks. In effect, Core was advocating a sophisticated accounting change that behaves much like data compression.
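The capacity gain can be quantified with SegWit’s weight accounting (BIP 141): a block’s weight is three times its non-witness (“base”) size plus its total size, capped at 4,000,000 weight units. The sketch below assumes signatures make up about half of a typical transaction’s bytes, which is an illustrative assumption rather than a measured figure:

```python
# Sketch of SegWit capacity using BIP 141 weight accounting:
#   weight = 3 * base_size + total_size, capped at 4,000,000 weight units.
# Assumption: witness (signature) data is ~50% of each transaction's bytes.
MAX_BLOCK_WEIGHT = 4_000_000
TX_TOTAL_SIZE = 400        # bytes, assumed average transaction size
WITNESS_FRACTION = 0.5     # assumed share of bytes taken up by signatures

base_size = TX_TOTAL_SIZE * (1 - WITNESS_FRACTION)   # 200 bytes
tx_weight = 3 * base_size + TX_TOTAL_SIZE            # 1,000 weight units
txs_per_block = MAX_BLOCK_WEIGHT / tx_weight         # ~4,000 transactions

print(f"~{txs_per_block:.0f} SegWit transactions per block "
      f"(vs ~2,500 in a full pre-SegWit 1 MB block)")
```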
Such an approach would keep the Blockchain smaller, allowing more nodes to host it. There were some problems with this, however. For one thing, SegWit is quite complicated, and some implementation challenges were expected. Nobody was quite sure at first, for instance, whether it would require a soft fork or a hard fork.
For another, most of the Core developers supporting SegWit were themselves employed by a private company called Blockstream. Many in the community saw this as a power grab by a private company trying to exercise undue influence over a decentralized project.
Decentralized or distributed?
Despite the numerous side issues, the scalability debate ultimately boiled down to one question:
“How much decentralization is enough?”
The problem is that nobody knows the answer to that question. Decentralization has become such a powerful mantra in the cryptocurrency community that the concept itself has gotten a bit muddled. A decentralized network is one that is not controlled by any single entity or group of entities and is resistant to attack by the same.
Philosophically speaking, tens or even hundreds of thousands of nodes would be ideal, but that isn’t really practical. Such a network would be distributed, not decentralized. A distributed network is one that is massively decentralized, in which virtually everybody runs a node. Given the costs of running a node, it’s unlikely that Bitcoin will ever become truly distributed.
At the same time, a project with a few dozen nodes can’t rightfully be called decentralized or attack-resistant, either. Right now there are just over 9,200 nodes serving the Bitcoin network. Is that enough to keep the project truly decentralized? Would 4,000 nodes be good enough? Would 1,000?
Before moving forward with discussions about other scalability enhancements such as the lightning network, the community needs to reach consensus on a fundamental question:
Just how much decentralization is enough?