For decades, philosophers, technologists, and even billionaires like Elon Musk have toyed with the idea that our universe might be a highly sophisticated computer simulation. The idea represents one of modern science’s most haunting and enduring hypotheticals, a kind of digital cosmology.

However, according to a study by an international team of theoretical physicists, the mathematics itself may finally settle the debate. Their conclusion? It’s impossible that we’re living in a simulation.

In a paper published in the Journal of Holography Applications in Physics, researchers argue that the same mathematical principles that limit what computers can calculate also limit what any simulation of the universe could ever reproduce. In short, the universe contains truths that no algorithm—no matter how advanced—can ever compute.

“It has been suggested that the universe could be simulated. If such a simulation were possible, the simulated universe could itself give rise to life, which in turn might create its own simulation,” said lead author Dr. Mir Faizal, a theoretical physicist at the University of British Columbia, in a statement. “This recursive possibility makes it seem highly unlikely that our universe is the original one, rather than a simulation nested within another simulation.”

“This idea was once thought to lie beyond the reach of scientific inquiry. However, our recent research has demonstrated that it can, in fact, be scientifically addressed.”

The idea that reality could be computed—sometimes called “It from Bit,” after physicist Dr. John Archibald Wheeler’s phrase—has long fascinated scientists. In this view, the cosmos itself could be described as a vast informational process: every atom, photon, and galaxy, a pixel in a cosmic program.

Yet Dr. Faizal and his co-authors—renowned cosmologist Dr. Lawrence M. Krauss, Dr. Francesco Marino of Italy’s National Institute of Optics, and researcher Arshid Shabir—argue that this idea collapses under the laws of logic themselves.

Using the mathematical frameworks of Kurt Gödel, Alfred Tarski, and Gregory Chaitin, the researchers demonstrated that any “Theory of Everything” built entirely on computation must, by definition, be incomplete.

Gödel’s incompleteness theorems, for example, proved that in any sufficiently complex mathematical system, there are true statements that can never be proven using the rules of that system.

Tarski’s theorem showed that truth itself cannot be defined entirely within a formal language. And Chaitin’s work revealed that some mathematical truths are fundamentally uncomputable—they contain more information than any algorithm can encode.

The researchers argue that these theorems also apply to the foundations of physics. Any theory of quantum gravity—the long-sought framework that unifies general relativity and quantum mechanics—would be a kind of algorithmic system. And just like mathematics, it would inevitably face its own unprovable truths.

“Together, the Gödel–Tarski–Chaitin triad delineates an insurmountable frontier for any strictly computable framework,” the researchers write.

In other words, if the laws of physics can’t be reduced to pure computation, then neither can the universe.

Rather than seeing this limitation as a failure, the researchers frame it as an opportunity to redefine what science itself can explain, proposing what they call a “Meta-Theory of Everything” (MToE)—a framework that transcends computation entirely by including non-algorithmic understanding.

This meta-theory introduces what the authors describe as an “external truth predicate”—a type of logical mechanism that recognizes truths that no calculation can capture. In this view, the universe operates partly on principles that lie beyond any algorithmic description, suggesting that physical reality is richer than information alone can describe.

While this may sound more mystical than scientific, the researchers argue that undecidable or uncomputable processes are already evident in known physics.

Determining whether a quantum system will reach thermal equilibrium—known as the “quantum thermalization problem”—has been proven undecidable. Even deciding whether a complex quantum material is “gapped” or “gapless” has been shown to encode the famous “halting problem,” a cornerstone of computational limits.
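The halting problem mentioned above can be sketched in a few lines of Python. This is the standard diagonalization argument from Turing’s classic proof (an illustration, not code from the paper): suppose a perfect oracle `halts(f)` existed, then build a program that defeats it. The names `paradox`, `trouble`, and `naive_halts` are illustrative choices:

```python
# Classic diagonalization sketch: assume a perfect oracle `halts(f)`
# that decides whether calling f() eventually terminates.
def paradox(halts):
    def trouble():
        # Do the opposite of whatever the oracle predicts about us.
        if halts(trouble):       # oracle says "trouble halts"...
            while True:          # ...so loop forever instead.
                pass
        # oracle says "trouble loops forever" -> halt immediately.

    # Either answer the oracle gives about `trouble` is wrong.
    return halts(trouble)

# Any concrete `halts` we supply must misjudge `trouble`.
# Example: a (wrong) oracle that claims every program halts.
def naive_halts(f):
    return True

print(paradox(naive_halts))  # True -- yet running trouble() would never terminate
```

Whatever concrete `halts` we plug in, `trouble` does the opposite of its own prediction, so no single algorithm can correctly decide halting for all programs—the same kind of self-referential barrier the paper invokes against a fully computable physics.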

“Whenever an experiment or exact model realizes a property whose truth value provably eludes every recursive procedure, that property functions as a concrete witness to the truth predicate operating within the fabric of the universe itself,” the researchers write. “Far from being a purely philosophical embellishment, MToE [Meta-Theory of Everything] thus emerges as a structural necessity forced upon us by the physics of undecidable observables.”

Put simply, reality itself seems to “know” things no computer could.

Philosophers like Nick Bostrom have argued that if civilizations survive long enough, they’ll develop computers powerful enough to simulate entire universes—and that it’s statistically likely we already live in one. The new paper, however, dismantles that logic at its root.

Simulations, by definition, rely on computation. They execute algorithms. Yet if the fundamental structure of the universe includes non-algorithmic truths—facts that no finite calculation can capture—then no simulation could ever reproduce them.

“Because [our] Meta-Theory of Everything contains an external truth predicate that by construction escapes formal verification, any finite algorithm can at best emulate FQG [the computable fragment of physics] while systematically omitting the meta-theoretic truths enforced by it,” the researchers explain. “Consequently, no simulation could in principle reproduce what would otherwise be the full underlying structure of the physics of our universe.”


That means any simulated universe would be missing something essential that real physics includes but no computation can reproduce. The team’s conclusion is as philosophically profound as it is mathematically grounded. It suggests that the deepest layer of reality is composed not of matter (“its”) or information (“bits”), but of something even more fundamental: understanding.

“Neither ‘its’ nor ‘bits’ may be sufficient to describe reality,” the researchers write. “Rather, a deeper description, expressed not in terms of information but in terms of non-algorithmic understanding, is required for a complete and consistent theory of everything.”

Importantly, this approach doesn’t discard science. By acknowledging that not all truths are computable, Dr. Faizal and his co-authors say they’re preserving science’s ultimate goal: to explain everything that can, in principle, be known—even if it can’t all be calculated.

The argument also underscores a growing recognition that the unknown still vastly outweighs what we can explain. In this sense, their work joins a body of recent studies that challenge the assumption that reality can be neatly reduced to equations or simulations.

As The Debrief recently reported, a study suggesting that life’s emergence was “cosmologically implausible” raised similar questions, arguing that the existence of biology itself appears to defy straightforward probabilistic reasoning.

Like that work, Dr. Faizal and his co-authors’ paper hints that the universe may harbor principles or processes that lie fundamentally beyond computation or statistical prediction.

So while simulation theory will likely remain a cultural touchstone—from The Matrix to online debates and pop-science podcasts—this new study provides a rigorous mathematical argument against it.

Ultimately, the researchers argue that logic doesn’t merely cast doubt on the simulation hypothesis; it renders it logically impossible. The universe, they contend, is not a simulation running on a cosmic computer. It’s something that transcends computation altogether.

“The fundamental laws of physics cannot be contained within space and time, because they generate them,” co-author Dr. Lawrence M. Krauss explained. “It has long been hoped, however, that a truly fundamental theory of everything could eventually describe all physical phenomena through computations grounded in these laws.”

“Yet we have demonstrated that this is not possible. A complete and consistent description of reality requires something deeper—a form of understanding known as non-algorithmic understanding.”

Tim McMillan is a retired law enforcement executive, investigative reporter and co-founder of The Debrief. His writing typically focuses on defense, national security, the Intelligence Community and topics related to psychology. You can follow Tim on Twitter: @LtTimMcMillan.  Tim can be reached by email: tim@thedebrief.org or through encrypted email: LtTimMcMillan@protonmail.com 
