Discover.com
What’s the News: Anyone who has had their thighs baked by a laptop knows that computing releases heat. And it’s more than a common-sense maxim: physicists have shown that the heat released by information processing is bounded by a physical law, which says that every bit of information erased must release a corresponding minimum amount of heat. But could quantum mechanics allow computations that actually cool computers down? In a recent Nature paper, researchers describe how this seeming paradox is possible.
How the Heck:
- In this paper, the team describes how, using the quantum mechanical property of entanglement, an observer can actually drain heat from a system while deleting information.
- How, you say? It all comes down to a question of entropy. The second law of thermodynamics states that the entropy of a system always increases or stays the same, but never decreases. And in everyday experience, that rising entropy shows up as waste heat. Landauer’s Principle, which follows from the second law, links heat and information processing: any irreversible computation, the principle says, adds entropy to the universe, so erasing even a single bit must release at least a minimum amount of heat (the bound is written out just after this list).
- But if a computation is reversible, meaning a 1 or 0 is deleted while its state is recorded somewhere so that it can be recreated, then no entropy need be released at all. Physicists have confirmed this prediction in the past.
- Now researchers have shown that in quantum mechanics, entropy can be seen as a measure of the observer’s (that is, the experimenter’s) lack of knowledge about the state of the 1s and 0s. In a quantum quirk, when an observer is entangled with bits of information, he has such detailed knowledge of those bits that the conditional entropy of the system actually drops below zero. Thus, when the observer makes a reversible deletion, he is actually siphoning heat off from the system; the numerical sketch below works through this sign flip. Voilà: a computation that cools.
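For readers who want the numbers, the classical bound and its quantum generalization can each be written in one line. The notation below is our paraphrase of the result as it is usually stated, not a quotation from the Nature paper:

```latex
% Landauer's bound: erasing one bit at temperature T releases at least
% k_B T ln 2 of heat -- roughly 2.9e-21 joules per bit at room temperature.
Q \;\ge\; k_B T \ln 2

% Quantum generalization: with a quantum memory O at the observer's
% disposal, the heat cost of erasing a system S scales with the
% conditional entropy H(S|O), which entanglement can push below zero;
% a negative Q means heat is drawn out of the surroundings.
Q \;\ge\; H(S \mid O)\, k_B T \ln 2
```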
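And the entanglement trick itself is easy to check numerically. Below is a minimal sketch in plain NumPy (our illustration, not code from the paper), using the textbook definition H(S|O) = H(SO) − H(O): an uncorrelated observer gives H(S|O) = +1, while an observer whose memory is entangled with the bit via a Bell state gives H(S|O) = −1.

```python
import numpy as np

def von_neumann_entropy(rho):
    """Entropy in bits: -sum(p * log2 p) over the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                        # drop (numerically) zero eigenvalues
    return float(-np.sum(p * np.log2(p)))

def conditional_entropy(rho_so):
    """H(S|O) = H(SO) - H(O) for a two-qubit state, S first, O second."""
    rho = rho_so.reshape(2, 2, 2, 2)        # indices: S, O, S', O'
    rho_o = np.einsum('ijik->jk', rho)      # partial trace over S
    return von_neumann_entropy(rho_so) - von_neumann_entropy(rho_o)

# Observer knows nothing: S is a random bit, O is uncorrelated -> H(S|O) = +1
uncorrelated = np.kron(np.eye(2) / 2, np.eye(2) / 2)
print(conditional_entropy(uncorrelated))    # 1.0

# Observer's memory is entangled with S in a Bell state -> H(S|O) = -1
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)          # (|00> + |11>) / sqrt(2)
print(conditional_entropy(np.outer(bell, bell)))   # -1.0
```

A classical record can at best drive H(S|O) to zero (the observer knows the bit perfectly); only entanglement pushes it negative, which is why the cooling effect has no classical counterpart.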