Why blocksize doesn’t solve scaling (and never will)

Dani J.
4 min read · Oct 30, 2017

Recently I read a lowbrow tabloid article about Bitcoin Unlimited (BTU) researchers creating a Bitcoin testnet with a 1 GB (gigabyte!) blocksize limit, inanely claiming that this “will eventually solve transaction scaling, and provide more capacity than VISA or PayPal”.

This is just plain stupid. It won’t. Ever. There’s a reason VISA throughput is capped where it is. And blockchain technology is by its very nature a lot less efficient than VISA. (N.B. that doesn’t mean worse. The human brain is less efficient than computers at many tasks, but we still consider it superior…)
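Just to put rough numbers on that gap (ballpark figures, not precise measurements):

```python
# Back-of-envelope throughput comparison. All figures are commonly
# cited ballpark numbers, not measurements.
BLOCK_SIZE_BYTES = 1_000_000    # Bitcoin's ~1 MB blocksize limit (2017)
AVG_TX_SIZE_BYTES = 250         # rough size of a typical transaction
BLOCK_INTERVAL_SECONDS = 600    # 10-minute target block time

btc_tps = BLOCK_SIZE_BYTES / AVG_TX_SIZE_BYTES / BLOCK_INTERVAL_SECONDS
print(f"Bitcoin: ~{btc_tps:.1f} tx/s")     # ~6.7 tx/s

VISA_CITED_PEAK_TPS = 24_000    # peak capacity figure Visa has cited
print(f"VISA:    ~{VISA_CITED_PEAK_TPS:,} tx/s at peak")
```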

What is scaling

So about scaling… There are two kinds of scaling: “scaling up” (vertical) and “scaling out” (horizontal). When we talk about “scaling”, we pretty much always mean scaling out. Scaling out doesn’t simply mean “it can take a high load”; translated roughly into practical terms, it means “I can increase the load it can take by stupidly throwing money at it”.

The point here is “stupidly”, which means I don’t need to think, just tell my IT department or cloud service provider to “shut up and take my money”, and magically the service throughput goes up two-, five-, ten-, or even a hundredfold.

For this to happen, the service needs to be parallelized, so that each new, small server can take its share of the load and thus multiply total throughput.
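A toy sketch of what that looks like (the per-server number is made up; the point is the shape of the curve):

```python
# Toy model of scaling out: a load balancer splits requests across
# identical cheap servers, so capacity grows roughly linearly.
PER_SERVER_CAPACITY = 500  # requests/s per server (made-up figure)

def scaled_out_capacity(num_servers: int) -> int:
    # Each server handles its own slice of the load.
    return num_servers * PER_SERVER_CAPACITY

for n in (1, 2, 5, 10, 100):
    print(f"{n:>3} servers -> ~{scaled_out_capacity(n):,} requests/s")
```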

Why blockchain doesn’t scale (out)

Blockchain implementations, at the moment, are not parallelized. In fact, the very design of a blockchain makes it strictly serial in nature: every node performs the exact same workload.

It’s a lot like the legend of the Septuagint (literally meaning “seventy”), the ancient Greek translation of the Hebrew Bible. As the story goes, seventy-odd translators each translated the *whole text independently*, and passages that weren’t exactly the same in all versions were thrown out and re-translated.

This way, seventy translators didn’t translate the text seventy times as fast as a single translator would have. It actually took *longer*, because every mismatch had to be resolved! Of course, there is a lot of value in having seventy independent translators, or thousands of nodes running the Bitcoin network: it gives us reliability and security!

Even so, blockchain technology doesn’t scale out. More nodes don’t mean more throughput. There is research into how a blockchain could be split into multiple smaller, sidechained blockchains (“sharding”) while maintaining a comparable level of security. Ethereum is putting a lot of effort into this research, but I doubt Bitcoin will ever take this path; it’s just too foreign to its core ideals…
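In the toy terms of the earlier sketch, full replication versus idealized sharding looks like this (same made-up per-node figure; the hard part, keeping each shard as secure as the whole network, is exactly what the research is about):

```python
# Toy contrast (same made-up 500/s per-node figure as above).
PER_NODE_CAPACITY = 500  # tx/s a single node can validate (made up)

def replicated_capacity(num_nodes: int) -> int:
    # Full replication: every node validates everything, so throughput
    # stays pinned at what ONE node can do, no matter the node count.
    return PER_NODE_CAPACITY

def sharded_capacity(num_nodes: int, num_shards: int) -> int:
    # Idealized sharding: each shard processes a disjoint slice of the
    # transactions; the open question is per-shard security.
    return num_shards * PER_NODE_CAPACITY

print(replicated_capacity(10_000))     # 500, regardless of node count
print(sharded_capacity(10_000, 100))   # 50,000 in the idealized case
```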

Scaling the blockchain up

So the big blocker crowd comes in, claiming that by increasing block size, they can “scale” the blockchain. Of course, they mean “scale up”, which is a bit dishonest. Scaling up means “I need to use larger, more powerful, more complex hardware” (as opposed to “add more of the same, low-spec hardware” in the case of scaling out).

In the case of a blockchain, every single node has to be upgraded in order for the chain to scale up. Not nearly as easy as adding more nodes.

However, there are also resources outside the blockchain that need to be taken into account: the Internet itself, for example. I have serious doubts that the Internet could actually carry a Bitcoin network with a multi-gigabyte blocksize, especially through the providers in areas that need cryptocurrency the most (e.g. war zones, third-world countries). Also, abusing semi-free public services tends to attract unwanted attention and public outrage. As if cryptocurrencies weren’t being demonized enough as it is.
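Some quick back-of-envelope arithmetic on what 1 GB blocks every ten minutes would demand of every single full node (download side only; uploading to multiple peers multiplies the bandwidth figure):

```python
# Rough resource math for 1 GB blocks every 10 minutes, per full node.
BLOCK_SIZE_GB = 1
BLOCKS_PER_DAY = 144          # 24 h / 10 min
SECONDS_PER_BLOCK = 600

# Sustained bandwidth just to receive blocks (relay overhead ignored):
mbit_per_s = BLOCK_SIZE_GB * 8 * 1000 / SECONDS_PER_BLOCK
print(f"~{mbit_per_s:.1f} Mbit/s sustained download")  # ~13.3 Mbit/s

# Storage growth:
gb_per_day = BLOCK_SIZE_GB * BLOCKS_PER_DAY
tb_per_year = gb_per_day * 365 / 1000
print(f"~{gb_per_day} GB/day, ~{tb_per_year:.0f} TB/year")
# -> 144 GB/day, ~53 TB/year, forever, on EVERY node
```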

BTU claiming that “gigabyte blocks will solve transaction throughput” is, well, something like the following:

Elon Dusk, well-known mad scientist and kajillionaire, has recently stunned the public by unveiling the ultimate solution to intercontinental travel. His company has built a prototype intercity train measuring thirty-five hundred miles in length, making movement-less travel between London and New York a reality. The train can be parked in both cities at the same time, requiring no movement, and thus no fuel consumption whatsoever. Travellers can just board the train in New York, and get off in London…

Yeah, right? When someone mentions one- (or ten-, or hundred-) gigabyte blocksizes solving anything, just remember this. Sure, I can build a chain with a multi-gigabyte blocksize. But what will it run on? What servers, what network?

The actual solution

It’s off-chain microtransactions. The blockchain is like a notary service: you don’t notarize every scrap of paper you jot down, only the important things. Similarly, it’s entirely possible to build mechanisms that provide safe microtransactions off-chain and enter them into the blockchain in batches later (as a settlement), for the on-chain cost of a single transaction.

This is what the Lightning Network does on Bitcoin, or the Raiden Network on Ethereum. There are other similar approaches being developed and tested as well.
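Here’s a minimal sketch of the payment-channel idea underlying these systems. To be clear, it’s a toy model, not the actual Lightning or Raiden protocol (those add funding transactions, mutual signatures, routing, and dispute timeouts); it just shows the shape: many free off-chain updates, one on-chain settlement.

```python
# Toy payment channel: many off-chain transfers, one on-chain
# settlement. NOT the real Lightning/Raiden protocol.

class PaymentChannel:
    def __init__(self, alice_deposit: int, bob_deposit: int):
        # Opening the channel would be one on-chain transaction.
        self.balances = {"alice": alice_deposit, "bob": bob_deposit}

    def pay(self, sender: str, receiver: str, amount: int) -> None:
        # Off-chain balance update: instant and free, just a new
        # signed state both parties keep. Nothing hits the chain.
        if self.balances[sender] < amount:
            raise ValueError("insufficient channel balance")
        self.balances[sender] -= amount
        self.balances[receiver] += amount

    def close(self) -> dict:
        # Closing broadcasts ONE on-chain transaction paying out the
        # final balances, settling all the micro-payments at once.
        return dict(self.balances)

channel = PaymentChannel(alice_deposit=100, bob_deposit=100)
for _ in range(50):                 # fifty micro-payments...
    channel.pay("alice", "bob", 1)
print(channel.close())              # ...settled by a single transaction
# -> {'alice': 50, 'bob': 150}
```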

Note that these do not make such transactions centralized. They are just as decentralized as regular on-chain transactions, merely slightly less reliable. Even that trade-off is fine, though, since we’re talking about relatively low-value transactions anyway.

The only goal megablocks can actually serve is to centralize blockchains in the hands of malicious corporate and state entities. Please don’t swallow the hype. ;)
