In case you had not noticed, computers are hot—literally. A laptop can pump out thigh-baking heat, while data centers consume an estimated 200 terawatt-hours each year—comparable to the energy consumption of some medium-sized countries. The carbon footprint of information and communication technologies as a whole is close to that of fuel use in the aviation industry. And as computer circuitry gets ever smaller and more densely packed, it becomes more prone to melting from the energy it dissipates as heat.
Now physicist James Crutchfield of the University of California, Davis, and his graduate student Kyle Ray have proposed a new way to carry out computation that would dissipate only a small fraction of the heat produced by conventional circuits. In fact, their approach, described in a recent preprint paper, could bring heat dissipation below even the theoretical minimum that the laws of physics impose on today’s computers. That could greatly reduce the energy needed to both perform computations and keep circuitry cool. And it could all be done, the researchers say, using microelectronic devices that already exist.
In 1961 physicist Rolf Landauer of IBM’s Thomas J. Watson Research Center in Yorktown Heights, N.Y., showed that conventional computing incurs an unavoidable cost in energy dissipation—basically, in the generation of heat and entropy. That is because a conventional computer has to sometimes erase bits of information in its memory circuits in order to make space for more. Each time a single bit (with the value 1 or 0) is reset, a certain minimum amount of energy is dissipated—which Ray and Crutchfield have christened “the Landauer.” Its value depends on ambient temperature: in your living room, one Landauer would be around 10⁻²¹ joule. (For comparison, a lit candle emits on the order of 10 joules of energy per second.)
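The Landauer bound has a simple closed form: erasing one bit at temperature T dissipates at least k·T·ln(2), where k is the Boltzmann constant. A minimal sketch of that calculation, assuming a room temperature of 300 kelvins (an illustrative value, not one specified in the article):

```python
import math

# Boltzmann constant in joules per kelvin (exact SI value).
K_B = 1.380649e-23

def landauer_energy(temperature_kelvin: float) -> float:
    """Minimum energy, in joules, dissipated by erasing one bit: k_B * T * ln(2)."""
    return K_B * temperature_kelvin * math.log(2)

# At an assumed room temperature of 300 K, one "Landauer" is roughly 3e-21 J,
# i.e. on the order of 10^-21 joule, as the article states.
room = landauer_energy(300.0)
print(f"One Landauer at 300 K: {room:.2e} J")
```

Because the bound scales linearly with temperature, cooling the circuitry (as in the liquid-helium scenario discussed later) lowers the theoretical floor as well.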
Computer scientists have long recognized that Landauer’s limit on how little heat a computation produces can be undercut by not erasing any information. A computation done that way is fully reversible because throwing no information away means that each step can be retraced. It might sound as though this process would quickly fill up a computer’s memory. But in the 1970s Charles Bennett, also at T. J. Watson, showed that instead of discarding information at the end of the computation, one could set it up to “decompute” intermediate results that are no longer needed by reversing their logical steps and returning the computer to its original state.
The catch is that, to avoid transferring any heat—that is, to be what physicists call an adiabatic process—the series of logical operations in the computation must usually be carried out infinitely slowly. In a sense, this approach avoids any “frictional heating” in the process but at the cost of taking infinitely long to complete the calculation.
It hardly seems a practical solution, then. “The conventional wisdom for a long time has been that the energy dissipation in reversible computing is proportional to speed,” says computer scientist Michael Frank of Sandia National Laboratories in Albuquerque, N.M.
To the Limit—And Beyond
Silicon-based computing does not get near the Landauer limit anyway: currently such computing produces around a few thousand Landauers in heat per logical operation, and it is hard to see how even some superefficient silicon chip of the future could get below 100 or so. But Ray and Crutchfield say that it is possible to do better by encoding information in electric currents in a new way: not as pulses of charge but in the momentum of the moving particles. They say that this would enable computing to be done reversibly without having to sacrifice speed.
The two researchers and their co-workers introduced the basic idea of momentum computing last year. The key concept is that a bit-encoding particle’s momentum can provide a kind of memory “for free” because it carries information about the particle’s past and future motion, not just its instantaneous state. “Previously, information was stored positionally: ‘Where is the particle?’” says Crutchfield. For example, is a given electron in this channel or that one? “Momentum computing uses information in position and in velocity,” he says.
This extra information can then be leveraged for reversible computing. For the idea to work, the logical operations must happen much faster than the time taken for the bit to come into thermal equilibrium with its surroundings, which will randomize the bit’s motion and scramble the information. In other words, “momentum computing requires that the device runs at high speed,” Crutchfield says. For it to work, “you must compute fast”—that is, nonadiabatically.
The researchers considered how to use the idea to implement a logical operation called a bit swap, in which two bits simultaneously flip their value: 1 becomes 0, and vice versa. Here no information is discarded; it is just reconfigured, meaning that, in theory, it carries no erasure cost.
Yet if the information is encoded just in a particle’s position, a bit swap—say, switching particles between a left-hand channel and a right-hand one—means that the particles’ identities get scrambled, so their “before” and “after” states cannot be distinguished. But if the particles have opposite momenta, they stay distinct, so the operation creates a genuine and reversible change.
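The reason a bit swap carries no erasure cost is that it is a bijection: every output corresponds to exactly one input, so the operation can always be undone. A toy sketch of that logic, taking the article’s description of the swap (both bits flip at once) at face value:

```python
from itertools import product

def bit_swap(a: int, b: int) -> tuple:
    """The bit swap as described in the article: both bits flip simultaneously."""
    return (1 - a, 1 - b)

inputs = list(product((0, 1), repeat=2))

# Reversible: the map is its own inverse, so every input is fully recoverable.
for a, b in inputs:
    assert bit_swap(*bit_swap(a, b)) == (a, b)

# Bijective: no two inputs collide, so no information is discarded and,
# by Landauer's argument, no minimum heat must be paid.
swap_outputs = {bit_swap(a, b) for a, b in inputs}

# Contrast: an AND gate maps four inputs onto two outputs; the lost
# distinction is exactly what incurs the erasure cost.
and_outputs = {(a & b,) for a, b in inputs}

print(len(swap_outputs), len(and_outputs))  # 4 2
```

The AND-gate contrast is an added illustration, not part of Ray and Crutchfield’s proposal; it simply shows what a many-to-one (irreversible) operation looks like by comparison.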
A Practical Device
Ray and Crutchfield have described how this idea might be implemented in a practical device—specifically, in superconducting flux quantum bits, or qubits, which are the standard bits used for most of today’s quantum computers. “We’re being parasites on the quantum computing community!” Crutchfield merrily admits. These devices consist of loops of superconducting material interrupted by structures called Josephson junctions (JJs), where a thin layer of a nonsuperconducting material is interposed between two superconductors.
The information in JJ circuits is usually encoded in the direction of their so-called supercurrent’s circulation, which can be switched using microwave radiation. But because supercurrents carry momentum, they can be used for momentum computing, too. Ray and Crutchfield performed simulations that suggest that, under certain conditions, JJ circuits should be able to support their momentum computing approach. If cooled to liquid-helium temperatures, the circuitry could carry out a single bit-swap operation in less than 15 nanoseconds.
“While our proposal is grounded in a specific substrate to be as concrete as possible and to accurately estimate the required energies,” Crutchfield says, “the proposal is much more general than that.” It should work, in principle, with normal (albeit cryogenically cooled) electronic circuits or even with tiny, carefully insulated mechanical devices that can carry momentum (and thus perform computation) in their moving parts. An approach with superconducting bits might be particularly well suited, though, Crutchfield says, because “it’s familiar microtechnology that is known to scale up very well.”
Crutchfield should know: Working with Michael Roukes and his collaborators at the California Institute of Technology, Crutchfield has previously measured the cost of erasing one bit in a JJ device and has shown that it is close to the Landauer limit. In the 1980s Crutchfield and Roukes even served as consultants for IBM’s attempt at building a reversible JJ computer, which was eventually abandoned because of what were, at the time, overly demanding fabrication requirements.
Follow the Bouncing Ball
Harnessing a particle’s velocity for computing is not an entirely new idea. Momentum computing is closely analogous to a reversible-computing concept called ballistic computing that was proposed in the 1980s: in it, information is encoded in objects or particles that move freely through the circuits under their own inertia, carrying with them some signal that is used repeatedly to enact many logical operations. If the particle interacts elastically with others, it will not lose any energy in the process. In such a device, once the ballistic bits have been “launched,” they alone power the computation without any other energy input. The computation is reversible as long as the bits continue bouncing along their trajectories. Information is only erased, and energy is only dissipated, when their states are read out.
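Why elastic interactions lose no energy can be seen in the textbook one-dimensional case: for equal masses, an elastic collision simply exchanges the particles’ velocities, conserving both momentum and kinetic energy. A minimal sketch of that physics (a generic illustration, not the specific device model in the article):

```python
def elastic_collision_1d(m1: float, v1: float, m2: float, v2: float):
    """Post-collision velocities for a head-on 1-D elastic collision,
    derived from conservation of momentum and kinetic energy."""
    u1 = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    u2 = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return u1, u2

# Equal masses exchange velocities: the "signals" pass through each other
# intact, which is what lets ballistic bits keep powering the computation
# after they are launched.
u1, u2 = elastic_collision_1d(1.0, +1.0, 1.0, -1.0)
print(u1, u2)  # -1.0 1.0

# Energy check: total kinetic energy is unchanged by the collision.
before = 0.5 * 1.0 ** 2 + 0.5 * (-1.0) ** 2
after = 0.5 * u1 ** 2 + 0.5 * u2 ** 2
assert abs(before - after) < 1e-12
```

In a real device the hard part, as the next paragraphs note, is keeping such trajectories stable against chaotic sensitivity to small fluctuations.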
Whereas, in ballistic computing, a particle’s velocity simply transports it through the device, allowing the particle to ferry information from input to output, Crutchfield says, in momentum computing, a particle’s velocity and position collectively allow it to embody a unique and unambiguous sequence of states during a computation. This latter circumstance is the key to reversibility and thus low dissipation, he adds, because it can reveal exactly where each particle has been.
Researchers, including Frank, have worked on ballistic reversible computing for decades. One challenge is that, in its initial proposal, ballistic computing is dynamically unstable because, for example, particle collisions may be chaotic and therefore highly sensitive to the tiniest random fluctuations: they cannot then be reversed. But researchers have made progress in cracking these problems. In a recent preprint paper, Kevin Osborn and Waltraut Wustmann, both at the University of Maryland, proposed that JJ circuits might be used to make a reversible ballistic logical circuit called a shift register, in which the output of one logic gate becomes the input of the next in a series of “flip-flop” operations.
“Superconducting circuits are a good platform for testing reversible circuits,” Osborn says. His JJ circuits, he adds, seem to be very close to those stipulated by Ray and Crutchfield and might therefore be the best candidate for testing their idea.
“I would say that all of our groups have been working from an intuition that these methods can achieve a better trade-off between efficiency and speed than traditional approaches to reversible computing,” Frank says. Ray and Crutchfield “have probably done the most thorough job so far of demonstrating this at the level of the theory and simulation of individual devices.” Even so, Frank warns that all the various approaches for ballistic and momentum computing “are still a long way from becoming a practical technology.”
Crutchfield is more optimistic. “It really depends on getting folks to support ramping up,” he says. He thinks small, low-dissipation momentum-computing JJ circuits could be feasible in a couple of years, with full microprocessors debuting within this decade. Ultimately, he anticipates consumer-grade momentum computing could realize energy-efficiency gains of 1,000-fold or more over current approaches. “Imagine [if] your Google server farm housed in a giant warehouse and using 1,000 kilowatts for computing and cooling [was instead] reduced to only one kilowatt—equivalent to several incandescent light bulbs,” Crutchfield says.
But the benefits of the new approach, Crutchfield says, could be broader than a practical reduction in energy costs. “Momentum computing will lead to a conceptual shift in how we see information processing in the world,” he says—including how information is processed in biological systems.