The Synchronization Tax
[Updated 6 February 2026. -MFM]
A physicist, a computer scientist, and a banker walk into a bar. They cannot agree on who got there first.
This is not a joke. This is a central problem of physics, computer science, and economics.
The physicist points out that simultaneity is relative — who was there "first" depends on the observer's velocity. The computer scientist notes that their phones' clocks differ by milliseconds, enough to corrupt a database, and neither of them knows which clock is accurate. The banker shrugs and says it doesn't matter who got there first; what matters is who pays first.
They are all concerned with the same problem: what we call "first" is not the result of a river of time that carries us forward. Time is a cost — specifically, the relative entropy two systems must produce when they force their descriptions of reality into agreement.
Whether you are collapsing a quantum wavefunction, synchronizing a distributed database, or clearing a wire transfer, the mechanism is similar. To understand how, we have to let go of our intuitions about time and look under the hood.
The Illusion of a Universal Clock
Our intuitions about time go back to Isaac Newton. He imagined time as something that "flows equably without relation to anything external," at the same rate for everyone, everywhere. On this view, the universe is a synchronous computer, every atom updating its state on the exact same clock cycle.
Einstein broke this clock. He showed us that time is relative to velocity, with light setting the absolute speed limit. Then quantum mechanics revealed that at a small enough scale, even properties like position and momentum cannot be known until measured. This left us with a startling question: if there is no universal clock, how does the universe keep its story straight? How does a photon know to hit the detector after the laser fired?
The Physicist's View: Time is Relative Entropy
In 1993, Carlo Rovelli proposed a radical idea: time is not a fundamental variable of the universe. At the microscopic level — individual atoms interacting — there is no "past" and no "future." There is only a network of correlations. Rovelli showed that what we perceive as "time" is actually a measure of our ignorance.
Before the interaction, the photon's description of itself and the detector's description of the photon are decoupled — no relative entropy exists between them because no comparison has occurred. The click forces a comparison. The detector's state must now encode something about the photon's state, and that encoding produces a divergence between the detector's prior description (ignorance) and its posterior description (a definite outcome). That divergence — a relative entropy — is the irreducible cost of establishing a shared fact. It is also what we experience as a moment of time.[1]
How? Because the detector clicked. To click, the detector had to change its state irreversibly. It generated heat. It increased the entropy of the room. This dissipation left its mark on the universe — a thermodynamic record.
This leads us to a profound truth: we pay for reality to unfold with entropy. The reason we can say "Event A happened before Event B" is that we turned energy into entropy in recording it. The flow of time is the thermodynamic cost of verifying the universe's status.
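The divergence described above is just the Kullback–Leibler relative entropy. A minimal numerical sketch (the uniform prior and definite posterior are my illustrative choices, not Rovelli's formalism): before the click the detector assigns equal odds to both outcomes; after the click one outcome is certain, and the divergence of the posterior from the prior measures the information gained.

```python
import math

def relative_entropy(p, q):
    """D(p || q) in bits: divergence of distribution p from distribution q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Before the click: the detector assigns equal odds to "photon" / "no photon".
prior = [0.5, 0.5]
# After the click: a definite outcome.
posterior = [1.0, 0.0]

print(relative_entropy(posterior, prior))  # 1.0 — one bit, the cost of one shared fact
```

One bit per binary measurement is the floor; real detectors dissipate vastly more, which is what makes the record irreversible.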
The Computer Scientist's View: Time as Messaging
Over a decade before Rovelli began dismantling time in physics, Leslie Lamport was doing the same thing in computer science.
In 1978, Lamport published one of the most important papers in computer science history. He was solving a practical problem: how do you get a cluster of computers to agree on which transaction happened first? If Computer A is in New York and Computer B is in Tokyo, the speed of light guarantees a delay between them. If both update a database at roughly the same moment, who "wins"?
Lamport realized that in a distributed system, physical timestamps are useless. You cannot trust the clock on the wall because synchronization is never perfect.
Instead, he defined time as a partial ordering of events. We can only say "Event A happened before Event B" if there is a causal link — a message — sent from A to B. Each message carries a cost: the sender and receiver must update their models of each other, reducing the divergence between their descriptions of the shared state. Lamport's logical clock counts these updates. Time, in his framework, advances by units of relative entropy reduced.
If I write a letter and you receive it, my writing happened before your reading. If I eat breakfast and you eat breakfast and we never communicate, our breakfasts are concurrent. There is no fact about which happened first.
This is the same answer quantum mechanics gives us. In the absence of an interaction (a message), systems are concurrent — superposed. They share no timeline.
Time, for Lamport, is not a background variable. It is a directed graph of messages. Time is what happens when we communicate.
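Lamport's rules fit in a few lines. Here is a minimal sketch (the class and the breakfast-and-letter scenario are illustrative, not Lamport's own notation): each node keeps a counter, increments it on every local event, stamps outgoing messages, and on receipt jumps past the sender's stamp.

```python
class LamportClock:
    """Logical clock: time advances only through events and messages."""
    def __init__(self):
        self.time = 0

    def tick(self):
        # A local event: advance the clock by one.
        self.time += 1
        return self.time

    def send(self):
        # Sending is an event; the timestamp travels with the message.
        return self.tick()

    def receive(self, msg_time):
        # Receiving forces agreement: jump past the sender's clock, then tick.
        self.time = max(self.time, msg_time) + 1
        return self.time

# Two nodes with no shared wall clock.
a, b = LamportClock(), LamportClock()
a.tick()               # A eats breakfast: a.time == 1
b.tick()               # B eats breakfast: b.time == 1 (concurrent with A's)
t = a.send()           # A writes a letter: t == 2
b.receive(t)           # B reads it: b.time == max(1, 2) + 1 == 3
print(a.time, b.time)  # 2 3
```

Note that the two breakfasts carry the same timestamp: the algorithm itself refuses to order events that never exchanged a message.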
The Banker's View: Money is Memory
This brings us to the banker's view of markets and money.
If the universe is a distributed system without a global clock, then the economy should be a complete disaster. Billions of human "nodes" — buyers and sellers — operate concurrently, each with private desires, private assets, private histories. How do we synchronize our activity? How do we prevent the double-spend problem, where I trade my only chicken for your shoes, then trade the same chicken for a neighbor's bread?
In a primitive barter economy, synchronization required a handshake — a direct interaction between two people. This is a private reality. If I promise to pay you later, that debt exists only in our shared memory, a "relative fact" in Rovelli's language. But as the economy scales, we need a way to make private facts public. We need a global clock.
We call that clock money.
In 1998, the economist Narayana Kocherlakota showed that if we had a perfect, magical distributed database everyone could access instantly — a universal ledger recording every favor and debt in history — we would not need money. If I wanted your shoes, I wouldn't hand you a green piece of paper. I would take the shoes, and the Great Database would update my status to "Owes Society Value" and yours to "Owed Value."
We use money because we lack that magic database. To put it in the physicist's terms: the relative entropy between my model of you and your actual history is enormous.
When I meet you on the street, my model of your creditworthiness is crude — I cannot access your past transactions, your reputation, your debts. You are, to me, a superposition of trustworthy and untrustworthy until some interaction brings our descriptions into alignment.
Money mediates that collapse. Instead of verifying your entire past (computationally expensive), I verify your token. If you have the dollar, I assume you earned it. The dollar is a physical proof-of-work, a compressed artifact of your history that substitutes for the fine-grained knowledge I cannot access.
The Synchronization Tax
When you swipe your credit card, you witness the collision of these three worlds.
First, concurrency: you and the merchant are two systems with undefined relative states. You want the coffee; he wants the value.
Second, interaction: you tap the terminal. A message is sent. You are attempting to establish a "happened before" relationship — payment, then coffee.
Third, collapse: the banking system acts as observer. It checks the ledger. It dissipates energy (servers running, fees charged). It forces the universe to decide: did this transaction happen?
Once the bank approves, uncertainty collapses. The money moves. The coffee is yours. Private desire has become public fact.
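The three steps can be sketched as a toy ledger (all names, balances, and the fee are invented for illustration; real settlement involves authorization networks, holds, and clearing that this ignores). The ledger is the observer: until `settle` runs, the transaction is neither approved nor declined.

```python
# A toy ledger acting as the "observer" that turns private intent into
# public fact. Balances are in cents; every value here is illustrative.
ledger = {"you": 500, "merchant": 0}

def settle(payer, payee, amount_cents):
    # Collapse: the ledger decides whether the transaction happened.
    if ledger[payer] < amount_cents:
        return "declined"      # the superposition resolves to "no"
    fee = 30                   # the synchronization tax, skimmed by the system
    ledger[payer] -= amount_cents
    ledger[payee] += amount_cents - fee
    return "approved"

print(settle("you", "merchant", 400))  # approved
print(ledger)                          # {'you': 100, 'merchant': 370}
```

The fee is not incidental to the model; it is the dissipation that makes the record stick, the economic analogue of the detector's heat.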
There is a non-trivial sense in which banks, lawyers, and stock exchanges all exist for this reason. They are the decoherence machinery of society, the "classical limit" that prevents the economy from remaining in an unresolved superposition of fraud and promises.[2]
But this coherence costs us something. In physics, we pay in relative entropy — the divergence between two systems' descriptions that must be reconciled for them to share a fact. In finance, we pay with fees. Every transaction fee, every spread, every billable hour is a synchronization tax — the cost of bringing two asynchronous descriptions of reality into local agreement.
The Luxury of the Present
We resent these costs. We hate the fees. We hate waiting for wires to clear. We hate the feeling that time is running out. But perhaps we should understand all of these experiences in a different way. That we share a "now" at all is a miracle of engineering.
In the lossless limit — in the deep silence of an isolated quantum state — there is no time, no reality, only a solitary possibility of what might be. To have a reality — a coffee, a conversation, a career — we must interact. We must produce the relative entropy that comes from forcing two independent descriptions of the world into agreement. That divergence is not merely the price of coordination. Recent work suggests it may be what generates the structures we coordinate through — including gravity itself.
Time is not a river that carries us. Time is a bridge we build, transaction by transaction, over the void.
Things get more complicated when the photon is entangled with another system. Remarkably, it is possible to detect the presence of an object — even a bomb — without the probe photon ever being absorbed by it. See Kwiat, P. G., White, A. G., Mitchell, J. R., Nairz, O., Weihs, G., Weinfurter, H., & Zeilinger, A. (1999). High-efficiency quantum interrogation measurements via the quantum Zeno effect. Physical Review Letters, 83(23), 4725–4728. Available at https://arxiv.org/abs/quant-ph/9909083 ↩︎
The Coase Theorem describes the lossless limit in which every transaction is reversible. Oliver Williamson's asset specificity describes entanglements, in which the separation of firm and asset generates entropy. Firms, on this view, represent a coherent domain in which people share a common clock: the rules, norms, and hierarchy that manifest as culture. The synchronization tax is reduced through shared context. Ginestra Bianconi's "Gravity from Entropy" theory formalizes an analogous structure in physics: gravity emerges from the quantum relative entropy between the true spacetime metric and the metric matter fields would induce, with Einstein's equations appearing in the low-energy limit where the divergence is small. See also this clarification on terminology for economists. The "synchronization tax" is not equivalent to "transactions costs" or "coordination costs" as those terms are used in the economics literature. ↩︎