Last night I started watching the Hulu show Devs, starring Nick Offerman.
In the first episode, we learn that Offerman’s character, Forest, is building a quantum-computing-powered machine to model the universe with a great degree of fidelity.

In the scene pictured above, Forest launches into a monologue about how the universe is deterministic. If he has indeed succeeded in building a working model of the universe with predictive powers, then his conclusion would be accurate. I haven’t watched the whole series, but I suppose we will learn more.
In the second episode, we hear that they have a problem with the “fidelity” part. Their modeling is crude, and one proposed explanation is:
modeling the universe may need an entire universe – one qubit per particle.
Why I don’t think this is possible
What struck me as wrong in this statement is that even if we somehow got a universe-sized quantum computer with each particle tracked, the universe is made of more than particles.
In the model proposed in the series, we would capture all of classical physics – but what about all the quantum soup?
What about the modeling machine itself and its interference? How do we model all those qubits?
The universe is a second-order chaotic system, which means that it reacts to predictions. This aspect is beaten to death in every pop-culture product featuring time travel, and its implications are well explored in the grandfather paradox.
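To make “reacts to predictions” concrete, here is a toy sketch – my own illustration, not anything from the show – of a forecast that invalidates itself the moment it is published. All the numbers are made up:

```python
# A toy second-order chaotic system: the state depends on the published
# prediction, so announcing a forecast can defeat it.

def traffic(share_expecting_jam: float) -> float:
    """Fraction of drivers on the road, given how many expect a jam.
    Drivers who expect a jam stay home, so expectation lowers traffic."""
    baseline = 0.9  # 90% would drive if no forecast were published
    return baseline * (1.0 - share_expecting_jam)

# A naive, first-order forecast ignores the feedback loop entirely.
forecast = traffic(share_expecting_jam=0.0)   # 0.90 -> "expect a jam"

# Once published, the forecast changes behavior – and falsifies itself.
actual = traffic(share_expecting_jam=forecast)

print(f"published forecast: {forecast:.2f}")  # 0.90
print(f"actual traffic:     {actual:.2f}")    # 0.09
```

A weather forecast (first-order chaos) stays true whether or not anyone reads it; a traffic forecast does not.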
Another science-fiction cliché, the Schrödinger’s cat experiment, presents problems with the Copenhagen interpretation of quantum physics:
The scenario presents a hypothetical cat that may be simultaneously both alive and dead, a state known as a quantum superposition, as a result of being linked to a random subatomic event that may or may not occur.
The Copenhagen interpretation states that the “randomness” of the uncertainty principle collapses to a definite state on observation. A deterministic universe (and the ability to run a simulation of it) would require us to predict this collapse.
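For a sense of what such a simulator is up against, here is a minimal sketch of a superposition and its collapse under the Born rule – standard textbook quantum mechanics, not anything specific to the show:

```python
import random

# A qubit in superposition: |psi> = a|alive> + b|dead>. Measurement
# "collapses" it to one outcome, with probabilities given by the
# Born rule. The statistics are predictable; a single outcome is not.

alpha, beta = 3 / 5, 4 / 5                       # example amplitudes
assert abs(alpha**2 + beta**2 - 1.0) < 1e-9      # state is normalized

def measure() -> str:
    """Collapse the superposition: P(alive) = |alpha|^2 = 0.36."""
    return "alive" if random.random() < alpha**2 else "dead"

# Many runs reproduce the 36% / 64% statistics, but no amount of
# modeling the amplitudes tells you what the *next* single run gives –
# and that single outcome is exactly what a deterministic simulation
# of this universe would have to predict.
outcomes = [measure() for _ in range(10_000)]
print(outcomes.count("alive") / len(outcomes))   # ~0.36
```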
If we model only particles, how can we predict that? And since we cannot leave the non-particles out of the simulation, the question becomes:
Can the Universe be modeled using the tools we have in this universe?
Bootstrapping compilers
Bootstrapping is the process of getting a new programming language to compile itself. Compilers are created like any other program – people write them – and the idea is to write the compiler for the new language in the new language itself. A small bootstrapping compiler then compiles the kernel of the “real” compiler, which takes it from there.
The bootstrapping compiler is written in a different language – one that already has a compiler (or in assembly language, which does not need one).
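For concreteness, a classic multi-stage bootstrap looks roughly like this. It is only a sketch with invented names – `compile_with` stands in for invoking an actual toolchain, and “NewLang” for the language being born:

```python
# Sketch of a classic compiler bootstrap. `compile_with` is a stand-in
# for running a compiler binary on source code; here it just records
# the lineage of each stage so the process is visible when run.

def compile_with(compiler: str, source: str) -> str:
    return f"<{source} compiled by {compiler}>"

# Stage 0: a small bootstrap compiler for NewLang, written in C
# (or assembly) and built with a compiler the host already has.
stage0 = compile_with("host-cc", "newlang-bootstrap.c")

# Stage 1: the "real" compiler, written in NewLang itself,
# compiled by the bootstrap compiler.
stage1 = compile_with(stage0, "newlang-compiler.newlang")

# Stage 2: the real compiler, recompiled by its own previous output.
# From here on, NewLang needs no other language – it takes it from here.
stage2 = compile_with(stage1, "newlang-compiler.newlang")
print(stage2)

# Real bootstraps often add a stage 3 and check it is bit-identical
# to stage 2: a self-compiled compiler should reproduce itself.
```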
So my question can be rephrased: is the universe a bootstrapping compiler, or does it need one?
Living in a deterministic universe means no free will.
Humans are meaning-making machines. When Elon Musk proposed that we may live in a simulation, people felt a little taken aback. However, it does not really matter if we do – and SingularityHub has a good explanation of why.
We will never be able to access “the layer running the simulation”, so for all intents and purposes it does not matter – it’s purely theoretical.
That is also why we won’t be able to model THIS universe. We may be able to model A universe – in that sense, we are a bootstrapping compiler. And we may ourselves be a model running in some other universe – one that acts as a bootstrapping compiler for us.
The term “bootstrapping” comes from the expression “pulling oneself up by one’s bootstraps”:
Widely attributed to The Surprising Adventures of Baron Munchausen (1781), where the eponymous Baron pulls himself out of a swamp by his hair (specifically, his pigtail), though not by his bootstraps; misattribution dates to US, 1901
Wiktionary
Bootstrapping itself is used as an example of a tall tale – something impossible and self-contradictory.
Even if it’s impossible, it will prompt us to learn more about reality
My coworkers and I had a chance to listen to a private keynote by Stephen Wolfram. While explaining the story behind Mathematica, he shared a quote that stuck with me:
If we are building models, one of the things we could make the model of is the universe
Stephen Wolfram
Artur, let me address your doubts. Unfortunately, my answer won’t be well researched – just off the top of my head – but I hope it suffices.
0) First, a purely linguistic note: I assume, when you say “simulation”, you mean “emulation”, i.e. a 100% accurate modeling, bit by bit. A flight _simulator_ does not run a perfect reproduction of an airplane including every one of its atoms, but a simplified, abstracted model that, for the purposes of training, gets you 80% of the result for 20% of the cost. An approximation.
1) The characters in the series don’t claim to have constructed a machine to emulate (accurately model) the whole universe. When they say “modeling the universe may need an entire universe – one qubit per particle”, they mean: “It’s basically impossible. We can’t construct a computer the size of the universe.” The statement you quoted is used as a counterargument to the idea of making a perfect universe emulator.
In fact, Devs have built a high-fidelity simulator. If I recall the dialogue correctly, Forest demands the impossible and receives pushback.
2) It is tempting to draw time-travel parallels to determinism, but they don’t really apply here. In time travel, you assume one flow of history that is then edited by someone who goes back in time and pushes events in another direction.
On the other hand, when someone uses a high-fidelity machine to predict a deterministic future and acts on that prediction, the course of history is not changed. In fact, it was always going to happen this way: they were always going to see that prediction and act on it, and the result they saw already included their acting on it. It was all predetermined. 🙂 There is no paradox, just a perfectly linear chain of events.
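To put the same idea in code: a usable prediction in a deterministic world is a fixed point – an outcome that, once announced and acted upon, produces exactly itself. A toy sketch of my own, with made-up numbers:

```python
# In a deterministic world, a prediction people will act on must be a
# fixed point: announcing it brings about exactly the announced outcome.

def outcome(announced: float) -> float:
    """World state that results when `announced` is the public prediction.
    People react to the announcement, which feeds back into the result."""
    return 0.5 * announced + 0.2   # any convergent feedback rule will do

# Iterate the feedback loop until the prediction reproduces itself.
prediction = 0.0
for _ in range(100):
    prediction = outcome(prediction)

print(round(prediction, 6))           # 0.4 – the self-consistent prediction
print(round(outcome(prediction), 6))  # 0.4 – acting on it changes nothing
```

They were always going to see 0.4, act on it, and get 0.4 – a linear chain, no paradox.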
I don’t think I need to address quantum collapse or bootstrapping. They have not built a universe emulator.