New Proof Dramatically Compresses Space Needed for Computation

Surprising new work bucks 50 years of assumptions about the trade-offs between computation space and time

[Illustration of a running computer chip, by Thomas Fuchs]


Once upon a time, computers filled entire rooms, reading numbers from spinning tapes and churning them through wires to do chains of basic arithmetic. Today they slip into our pockets, performing in a tiny fraction of a second what used to take hours. But even as chips shrink and gain speed, theorists are flipping the question from how much computation space we can pack into a machine to how little is enough to get the job done.

This inquiry lies at the heart of computational complexity, a measure of the limits of what problems can be solved and at what cost in time and space. For nearly 50 years theorists could only prove that if solving a problem takes t steps, it should be possible to solve it using roughly t bits of memory—the 0’s and 1’s that a machine uses to record information. [Technically, that equation also incorporates log(t), but for the numbers involved, this has little effect.] If a task requires 100 times the steps of another one, say, you’d expect to need about 100 times the bits, enough to diligently note each step. Using fewer bits was thought to require more steps—like alphabetizing books by swapping them one by one on the shelf instead of pulling them all out and re-shelving them. But in a finding described at the ACM Symposium on Theory of Computing in Prague, Massachusetts Institute of Technology computer scientist Ryan Williams found that any problem solvable in time t needs only about sqrt(t) bits of memory: a computation requiring 100 times the steps could be compressed and solved with something on the order of 10 times more bits. “This result shows the prior intuition is completely false,” Williams says. “I thought something must be wrong [with the proof] because this is extremely unexpected.”*
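The square-root scaling can be checked with a few lines of arithmetic. This is a toy illustration of the numbers in the paragraph above, not Williams’s actual construction:

```python
import math

def space_bound(t):
    """Toy model: memory needed grows as the square root of the time t."""
    return math.sqrt(t)

t = 1_000_000        # a task taking t steps
bigger = 100 * t     # a task taking 100 times as many steps

# 100 times the steps, but only sqrt(100) = 10 times the bits
ratio = space_bound(bigger) / space_bound(t)
print(ratio)  # -> 10.0
```

Under the old, roughly linear relationship, the same ratio would have come out to 100.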

The breakthrough relies on a “reduction,” a means of transforming one problem into another that may seem unrelated but is mathematically equivalent. With reductions, packing a suitcase maps onto determining a monthly budget: the size of your suitcase represents your total budget, pieces of clothing correspond to potential expenses, and carefully deciding which clothes can fit is like allocating your budget. Solving one problem would then directly solve the other. This idea is at the core of Williams’s result: any problem can be transformed into one you can solve by cleverly reusing space, deftly cramming the necessary information into roughly a square-root number of bits. Thus, the original problem must be solvable with this compact container.
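The suitcase-and-budget analogy can be made concrete in code. The sketch below is a hypothetical toy reduction (it is not the reduction used in the proof): the budgeting question translates term for term into the packing question, so a single packing solver answers both.

```python
def best_packing(capacity, sizes):
    """Packing problem: the largest total size of items that fits
    within capacity (classic subset-sum dynamic programming)."""
    reachable = {0}  # totals achievable so far
    for s in sizes:
        reachable |= {r + s for r in reachable if r + s <= capacity}
    return max(reachable)

def best_budget(cap, expenses):
    """Reduce budgeting to packing: the monthly cap becomes the
    suitcase's capacity, and each expense becomes an item's size."""
    return best_packing(cap, expenses)

# How much of a $100 budget can these expenses use without going over?
print(best_budget(100, [60, 45, 30, 25]))  # -> 100 (45 + 30 + 25)
```

The point of the reduction is the second function: it does no real work of its own, yet any improvement to the packing solver immediately improves the budgeting solver too.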


“This progress is unbelievable,” says Mahdi Cheraghchi, a computer scientist at the University of Michigan. “Before this result, there were problems you could solve in a certain amount of time, but many thought you couldn’t do so with such little space.” Williams’s finding, he adds, is “a step in the right direction that we didn’t know how to take.”

While computers have continued to shrink, our theoretical understanding of their efficiency has exploded, suggesting that the real constraint is not how much memory we have but how wisely we use it.

*Editor’s Note (7/23/25): This paragraph was corrected after posting to offer a better example of how the relationship between steps and bits would work. It was previously amended on July 16 to clarify that log(t) would have little effect on the term.