Why Crypto is Stuck in 2014
Every innovation in computing infrastructure arose to add a capability that its predecessors lacked.
The sequence of invention is not arbitrary; it is dimensional. Each breakthrough adds exactly one dimension that the dimensions below could not provide. A horizontal axis cannot measure vertically. The vertical axis is therefore not an improvement of the horizontal; it lets every coordinate on the X axis reach places it never knew existed.
Architects define dimensional frameworks, engineers make them matter.
Dimensional architecture theory provides an intuitive scaffolding for identifying first-principles problems in any system. Here, we will trace a brief history and hierarchy of information systems architecture and identify the error that has constrained the blockchain industry for over a decade.
Dimension 1: Computation
Computation is the bedrock. Every subsequent dimension depends on it, and none can replace it. By the time any later dimension emerged, computation was already decades old, already scaling on its own trajectory, already abundant. This fact will become important.
Dimension 2: Networking
Networking added a dimension that computation alone could not provide: the ability to transmit data between physically separated machines. It did not attempt to redefine what computation was. It sat on top of it, thin and specific.
Dimension 3: Cryptography
Cryptography added identity, authentication, integrity, and provable commitments to the stack. A message could now be proven to have come from a specific sender and to have arrived unaltered. Data could be proven to exist within a committed structure. But cryptography alone could not solve the problem of agreement. Two parties who did not trust each other could verify a message, but they could not agree on the order of events without a trusted intermediary.
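The claim that "data could be proven to exist within a committed structure" is exactly what a Merkle tree provides. The sketch below, in Python using only the standard library, shows the idea: a set of leaves is folded into a single root hash, and any one leaf can later be proven to belong to that root using a logarithmic-size branch of sibling hashes. The function names (`merkle_root`, `merkle_branch`, `verify`) are illustrative, not from any particular library; the double-SHA-256 and odd-level duplication follow Bitcoin's convention, though real implementations carry more detail.

```python
import hashlib

def h(data: bytes) -> bytes:
    """Double SHA-256, the hash Bitcoin uses for its Merkle trees."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold hashed leaves pairwise up to a single committed root."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                      # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_branch(leaves: list[bytes], index: int) -> list[bytes]:
    """Collect the sibling hashes needed to re-derive the root from one leaf."""
    level = [h(leaf) for leaf in leaves]
    branch = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        branch.append(level[index ^ 1])         # sibling at this level
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return branch

def verify(leaf: bytes, index: int, branch: list[bytes], root: bytes) -> bool:
    """Recompute the root from a leaf and its branch; compare to the commitment."""
    node = h(leaf)
    for sibling in branch:
        node = h(node + sibling) if index % 2 == 0 else h(sibling + node)
        index //= 2
    return node == root
```

Note what this does and does not give us: a verifier holding only the 32-byte root can check membership of any leaf, but nothing here tells two mutually distrusting parties *which* root, or which sequence of roots, is canonical. That gap is the subject of Dimension 4.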
Dimension 4: Canonical Order
On January 3, 2009, the genesis block was mined. Bitcoin solved the problem of ordering events among mutually distrusting participants without any central authority.
Computation existed. Networking existed. Cryptographic identity and integrity existed. What did not exist, in any practical form, was trustless order.
Nakamoto fused proof-of-work, hash chains, economic incentives, and peer-to-peer networking into a single mechanism that produced something none of its components could produce alone: a global, tamper-evident, append-only sequence of events that anyone could verify and no one could unilaterally alter.
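The "tamper-evident, append-only sequence" can be sketched in a few lines. The toy below (illustrative Python, with invented names like `mine` and `valid_chain`; real Bitcoin blocks, difficulty retargeting, and serialization are far more involved) shows the two mechanisms Nakamoto fused: each block commits to the hash of its predecessor, and each block must carry proof-of-work, so altering any past block invalidates everything after it.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's canonical JSON encoding."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def mine(prev_hash: str, data: str, difficulty: int = 2) -> dict:
    """Search for a nonce whose block hash starts with `difficulty` zero hex digits."""
    nonce = 0
    while True:
        block = {"prev": prev_hash, "data": data, "nonce": nonce}
        if block_hash(block).startswith("0" * difficulty):
            return block
        nonce += 1

def valid_chain(chain: list[dict], difficulty: int = 2) -> bool:
    """Tamper-evidence: every block must meet the work target and commit to its predecessor."""
    for i, block in enumerate(chain):
        if not block_hash(block).startswith("0" * difficulty):
            return False
        if i > 0 and block["prev"] != block_hash(chain[i - 1]):
            return False
    return True
```

Changing one byte of any historical block changes its hash, breaks the `prev` link of its successor, and destroys its proof-of-work, which is why the sequence is tamper-evident rather than merely ordered.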
The breakthrough was narrow and specific. Bitcoin added one new dimension to the stack. Satoshi did not declare victory; they defined a dimension. They also pointed the way forward, but hardly anyone noticed.
"It is possible to verify payments without running a full network node. A user only needs to keep a copy of the block headers of the longest proof-of-work chain, which he can get by querying network nodes until he's convinced he has the longest chain, and obtain the Merkle branch linking the transaction to the block it's timestamped in. He can't check the transaction for himself, but by linking it to a place in the chain, he can see that a network node has accepted it, and blocks added after it further confirm the network has accepted it." [1]
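The passage above describes simplified payment verification (SPV): a light client keeps only block headers, then checks two things, that the headers form a hash chain, and that a Merkle branch links the transaction to one of those headers. A minimal sketch of that check follows, under heavy simplifying assumptions: headers here are dicts holding only `prev` and `merkle_root` bytes, and the header hash covers just those two fields, whereas real Bitcoin headers also include version, timestamp, difficulty bits, and nonce. The name `spv_verify` is invented for illustration.

```python
import hashlib

def sha256d(b: bytes) -> bytes:
    """Double SHA-256, as used throughout Bitcoin."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def spv_verify(tx: bytes, index: int, branch: list[bytes],
               headers: list[dict]) -> bool:
    """Check a transaction against a header chain, without any full blocks."""
    # 1. Each header must commit to the (simplified) hash of its predecessor.
    for prev, cur in zip(headers, headers[1:]):
        if cur["prev"] != sha256d(prev["prev"] + prev["merkle_root"]):
            return False
    # 2. Recompute the Merkle root from the transaction and its branch,
    #    then look for it in one of the headers we hold.
    node = sha256d(tx)
    for sibling in branch:
        node = sha256d(node + sibling) if index % 2 == 0 else sha256d(sibling + node)
        index //= 2
    return any(node == hdr["merkle_root"] for hdr in headers)
```

The point of the quote survives the simplification: the client never executes or even sees other transactions; it verifies placement in the canonical order, which is precisely the one dimension Bitcoin added.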
The Detour
In 2013, Vitalik Buterin proposed Ethereum as a “next-generation smart contract and decentralized application platform,” describing what would come to be known as a “world computer” [2]. The Ethereum Virtual Machine placed arbitrary computation at the consensus layer. Every node in the network would execute every program. Every state transition, no matter how trivial, would pass through the global ordering bottleneck.
This was not the addition of a new dimension. It was the collapse of an existing one. Ethereum took computation, Dimension 1, which had been solved in 1936, which had been scaling independently for seventy years, and which was the most abundant resource in the entire stack, and engineered it into the scarce space of canonical ordering.
The decision took the ordering dimension, the genuinely novel and scarce dimension, and demanded that it also serve as a world computer.
The result is a structure where every component is constrained by the load-bearing limitations of the narrowest element: ordering.
Computation, which was abundant, became artificially scarce. Order, which was precious, became burdened with work it was never designed to carry.
The Dimensional Framework
The hierarchy that history reveals is a dependency graph defined by the order of invention:
Dimension 1 — Computation (1936): State transitions. The ability to take an input and produce a deterministic output.
Dimension 2 — Networking (1969–1974): Communication. The ability to transmit data between machines.
Dimension 3 — Cryptography (1976–1979): Identity, integrity, authentication, and proof. The ability to prove who sent what, and that data belongs to a committed structure, without a trusted intermediary.
Dimension 4 — Canonical Order (2008–2009): Canonical sequencing. The ability to establish the order of events among adversarial participants without central authority.
Each dimension is a real thing that was actually built. Each emerged because the dimensions below it existed but could not provide the new one.
The principle this history enforces is architectural: each dimension should be orthogonal to those that precede it, and should not duplicate what they already provide.
The history of computing built a design space. The task of architecture is not to fill these dimensions with complexity, but to define the coordinate system and leave the volume unfilled.
This is the architect’s gift to every builder who follows.
Continue to Part 2.
References
[1] S. Nakamoto, "Bitcoin: A Peer-to-Peer Electronic Cash System," 2008. Available at https://bitcoin.org/bitcoin.pdf.
[2] V. Buterin, "Ethereum: A Next-Generation Smart Contract and Decentralized Application Platform," 2013. Available at https://ethereum.org/en/whitepaper/.