Monday 17 December 2012

And the lowest allowed power consumption of PCs

In our discussions about information and heat, James Gallagher said some of the usual wrong things about irreversibility – for example, he believes that the proof of the H-theorem is invalid because of the molecular chaos assumption (this assumption is a pure technicality that allows explicit calculations, but the overall conclusion, the increasing entropy, is independent of any such Ansatz!).

However, he has also made a statement about an algorithm to reduce the entropy with the help of his PC:
I mean I can simulate deterministic dynamical systems on my computer and reverse all the dynamics at any time - which MUST then result in a decreasing entropy if the previous system had increasing entropy.
I assure you, James, that your method doesn't work. What you suggested is known as Maxwell's daemon, and the 20th-century analyses have made it clear that no such proposed device can actually reduce the total entropy.

This blog has discussed Maxwell's daemon many times. See, for example:
Feynman on the arrow of time
Arrow of time understood for 100 years
Maxwell's daemon cannot do useful work
Recall that in his 1964 Messenger Lectures at Cornell, Feynman showed that Maxwell's daemon couldn't work because the "wheels with teeth" that were meant to undo the balance of a physical system ultimately ended up transmitting energy/heat from the warmer body to the cooler one, as expected, so the entropy goes up whether you like it or not.

Because James' comment shows that people, including frequent visitors of physics blogs, still haven't noticed that such devices cannot work, I decided to write one more blog entry of this sort and add some interesting related ideas that haven't appeared on TRF yet.



Maxwell's daemon is a hypothetical agent or device that operates inside a thermodynamic system – in this case a vessel divided into two parts – and it does something intelligent in order to encourage processes that "naturally" occur in the opposite direction only.

For example, the daemon may open the door whenever a faster, hotter molecule is coming from the left, so that it gets to the right, and it may similarly encourage slower, cooler molecules to be concentrated in the left part. Or it may just try to concentrate all the gas molecules in one part. Or sort them in another way.

Whatever details we choose, the point is that the daemon is reducing the entropy of the gas (or another object). It is effectively able to increase temperature differences between the two parts of the vessel even though in Nature, temperatures tend to get more uniform as time goes by. Or the daemon may do something else that doesn't seem to happen naturally.

If such a daemon were possible, the advantage would be clear. We could construct the perpetual motion machine of the second kind because we could increase the temperature differences between the two parts of the vessel and use a part of the difference to do mechanical work. Well, in the case of the daemon that just spontaneously concentrates all the molecules in one part, we would seemingly construct the perpetual motion machine of the first kind because the pressure difference could be used to do mechanical work "immediately". However, if you look carefully, it would still be the perpetual motion machine of the second kind because the temperature of the gas would go down as you would be extracting work out of it.

Now, can the daemon exist? You may say that Maxwell's daemon is a metaphor for the government. So the people who are deluded and believe that the government may "social-engineer" things that work more effectively than the free market may also believe that a Maxwell's daemon that violates the second law may be produced and launched. Well, it cannot, as those of us who understand the basic laws of thermodynamics and economics know.

But for a while, Maxwell's daemon may have been viewed as another "giant" who may perhaps "beat" the laws of thermodynamics. It wasn't quite clear who would win: the laws of thermodynamics, or Maxwell's proposed cleverness? Note that Maxwell designed the thought experiment as an example of a new effect that becomes possible when we replace the approximate laws of thermodynamics by their precise microscopic realization in terms of statistical physics. Such improvements of the foundations of physics often lead to new possibilities, so Maxwell's daemon could have been possible, couldn't it?

Leo Szilárd published the first article saying "No, the daemon won't work" in 1929. The paper said many nice things that were partly right. Equally importantly, the connection between the entropy and information appeared very clearly in that paper. He was able to say that doing something with the information of 1 bit was changing some entropy (or some part of it) by
\[
S = k\ln 2.
\]
The entropy equals Boltzmann's constant multiplied by the natural logarithm of two – one bit is \(\ln 2\approx 0.693\) nats. He also figured out that some operations doing something with one bit created or moved energy or heat \(\Delta E = \pm kT\ln 2\) somewhere. I am being deliberately vague here because at this vague level, Szilárd was right.
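Numerically, with \(k\approx 1.38\times 10^{-23}\,{\rm J/K}\) and taking \(T\approx 300\,{\rm K}\) as an illustrative room temperature, these per-bit quantities are tiny:
\[
k\ln 2 \approx 9.6\times 10^{-24}\,{\rm J/K},\qquad kT\ln 2 \approx 2.9\times 10^{-21}\,{\rm J}\approx 0.018\,{\rm eV},
\]
which is why nobody notices them in everyday thermodynamics.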

(Did you know that Szilárd mastered the method of getting grants by threatening that he would otherwise publish a paper on how to build a nuclear bomb? He was a proponent of bribing politicians to improve the world, too.)

The exact details about the moment when the entropy increases – the increase that guarantees that the total entropy can't go down – were slightly wrong in his paper. He essentially believed that the entropy associated with the daemon (= the expenses of the government) had to increase when the information about the molecules was being accumulated.

These days, a different accounting of the entropy increase seems much more convincing. The more precise explanation of what's going on emerged as Landauer's principle in the 1961 paper by Rolf Landauer of IBM. See also a 1981 paper by Charles Bennett or newer lectures by John Preskill for some modern comments.

Landauer realized that the entropy measures the information about the microscopic arrangement that has been lost, that has become inaccessible. And this part of the information is actually not increasing when a computer accumulates the information about the molecules in the vessel. Instead, it is lost when the computer erases its memory, which it has to do at some point before it can accumulate new data – at least assuming that the computer's memory is finite.
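Schematically, for a daemon that records one bit per molecule and later erases it, the modern bookkeeping – this is just a simplified sketch of the Landauer-Bennett accounting, with \(N\) the number of sorted molecules – reads
\[
\Delta S_{\rm total} = \Delta S_{\rm gas} + \Delta S_{\rm erasure} \geq -N\,k\ln 2 + N\,k\ln 2 = 0,
\]
because the gas entropy may drop by at most \(k\ln 2\) per recorded bit while the eventual erasure of each bit dumps at least \(k\ln 2\) of entropy into the environment. The total never goes down.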

(If the computer has an insanely high or "infinite" memory, one may assign its state an entropy in such a way that the second law will continue to hold even though the memory never has to be erased. At that moment, the validity of the second law may look vacuous or convention-dependent, but that only occurs if we make some unphysical assumptions.)

It's funny to look at the computers' minimal power consumption dictated by Landauer's principle. If your fast computer (well, one you may have in a few years) needs to erase one trillion bits per second, \(kT\ln 2\) per bit tells you that the minimal consumption of such a computer at room temperature is about 2.85 nanowatts. Clearly, the actual microprocessors and memory chips still consume many orders of magnitude more energy than that – which is why it makes no practical sense to try to construct "reversible computers" that don't erase things and that could circumvent the Landauer bound. The experts say that such "reversible computation" is possible in principle. I have some modest doubts about it but I won't clarify them.
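If you want to check the number, here is a minimal back-of-the-envelope script – Python, with \(T=298\,{\rm K}\) assumed as the room temperature:

import math

k = 1.380649e-23                      # Boltzmann constant [J/K]
T = 298.0                             # assumed room temperature [K]
rate = 1.0e12                         # erased bits per second (one trillion)

energy_per_bit = k * T * math.log(2)  # Landauer cost of erasing one bit [J]
power = rate * energy_per_bit         # minimal dissipated power [W]

print(energy_per_bit)                 # ~2.85e-21 J per erased bit
print(power * 1e9)                    # ~2.85 nW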

While the Landauer limit is a much smaller power consumption than what we seem to see in the real world, I feel that the power consumption may have to be much higher than the Landauer limit if the computers are supposed to work flawlessly. For the bits of information inside the computer to behave as classical bits, I think it is necessary to copy them many times – in the sense of decoherence. Only when the information is copied many times does the classical-bit approximation become OK. For this reason, I would guess that the minimal consumption of reliable enough classical computers will always be greater than the Landauer minimum by at least an order of magnitude.
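If each logical bit has to be redundantly copied roughly \(r\) times for decoherence to make it behave classically – and \(r\sim 10\) is just my guess here – the estimate above gets multiplied accordingly,
\[
P_{\rm min}^{\rm reliable} \gtrsim r\, R\, kT\ln 2 \approx 30\,{\rm nW},
\]
where \(R\) is the number of erased bits per second, evaluated for the trillion-bits-per-second example above.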

Nevertheless, we still have lots of room to reduce the power consumption. And one additional simple way to reduce \(kT\ln 2\) is to reduce the temperature \(T\). Highly cooled computers could consume less energy if the Landauer bound ever became relevant. Of course, this discussion isn't useful for your PC at home because the cooling systems needed to get the temperature this low would probably consume much more energy than your PC, so you wouldn't gain much.
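Because the bound is simply proportional to \(T\), the scaling is trivial – relative to \(T\approx 300\,{\rm K}\), liquid-nitrogen and liquid-helium temperatures would give
\[
\frac{P_{\rm min}(77\,{\rm K})}{P_{\rm min}(300\,{\rm K})} = \frac{77}{300}\approx 0.26,\qquad
\frac{P_{\rm min}(4\,{\rm K})}{P_{\rm min}(300\,{\rm K})} = \frac{4}{300}\approx 0.013,
\]
but, as mentioned, the refrigerator needed to maintain those temperatures would eat the savings many times over.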

In his belief that it's easy to "reverse any evolution" with the help of a computer and therefore to reduce entropy, James Gallagher makes a mistake isomorphic to the one made by the proponents of Keynesianism, socialism, communism, and related crackpot theories in economics. They overlook the expenses of the government and the inefficiency that the government itself brings to the system. Well, that's a pretty serious mistake.

Analogously, James thinks that the computer operates "for free" and doesn't create any entropy. But the point here is that the computer is a physical object, much like the government bureaucrats are people who still behave so as to maximize their own utility function. When one does a proper analysis that includes the computer or the government offices, it becomes completely clear that Maxwell's daemon and the government simply cannot work to transform their utopia into reality.

James was thinking about a device that probably measures the positions and velocities – or the quantum state? – of all the molecules in the vessel and does a "global calculation" before it adjusts the motion of all the molecules in the vessel. This may look more ambitious than or different from Maxwell's daemon, but if one looks at it rationally, it's easy to figure out that it's just another realization of Maxwell's daemon, and a highly inefficient one. To reconstruct the state of a kilogram of a classical gas, one would need something like \(10^{26}\) bits of information – well, a big multiple of that because the velocities would have to be known exponentially accurately. But to reserve the room in your PC memory for this huge inflow of data, you would have to erase a huge number of bits and the entropy would go up dramatically as a result.
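Just to get a feeling for the numbers, here is a crude estimate – Python again, and both the choice of helium and the 32 bits per coordinate are purely illustrative assumptions:

import math

k = 1.380649e-23                    # Boltzmann constant [J/K]
T = 300.0                           # temperature [K]
N_A = 6.022e23                      # Avogadro's number

atoms = 1.0 / 0.004 * N_A           # 1 kg of helium: about 1.5e26 atoms (illustrative choice of gas)
bits_per_coordinate = 32            # illustrative precision; "exponential" accuracy would need far more
bits_total = atoms * 6 * bits_per_coordinate     # 3 positions + 3 velocities per atom

erasure_heat = bits_total * k * T * math.log(2)  # minimal heat released when that memory is cleared

print(bits_total)                   # ~2.9e28 bits
print(erasure_heat)                 # ~8e7 joules of guaranteed waste heat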

Some people still keep on suggesting that Maxwell's daemon could work. A notable example is a 1998 paper by John Earman and John Norton in which they claim that the debunking of Maxwell's daemon is based on circular reasoning. I find such texts deeply confusing. If you read the paper, they seem to deny the tight relationship between the entropy and information in general. This seems utterly indefensible to me.

In their "circularity" argumentation, they apparently criticize the "Maxwell's daemon won't work" arguments for their making assumptions about the behavior of the entropy of the daemon. Well, because Maxwell – and others – failed to present a precise model of the daemon's inner workings, the people who analyze this gadget must make some assumptions. In the particular models one may construct, it may be seen that the erasure of information does increase the entropy.

But even though it's impossible to "describe all the details" about the processes in a gadget – a general Maxwell's daemon – that no one has clearly defined (it should be a task for the proponents of the daemon, shouldn't it?), it's still true that we have proofs that the entropy is increasing which don't depend on any assumptions about computers, memory chips, and other devices at all. This simply means that these proofs have to apply to computers, too. A microprocessor is just another physical object. A memory chip is a physical object, too. And the fact that the entropy can't decrease in the thermodynamic limit may be proved quite generally, so it must apply to computers, too.

At least, without some convincing example showing a loophole, an example that hasn't been debunked yet, it seems silly to me not to consider the second law to be a general fact. Earman and Norton may try to criticize people who share my opinion as people defending dogmas, but it's a dogma analogous to \(2+2=4\) – it's demonstrably true although one may still be forced to work hard to debunk increasingly sophisticated attempts to prove that \(2+2\neq 4\).

To imagine that some "clever engines" may stand "above" the laws of physics is just a ludicrous religious belief, an elementary fallacy. It's a similar fallacy as the fallacy that the planned economies and their goverments may deny the general laws of economics. They cannot. People and even politicians are physical objects, too. Of course that if one envisions a divine daemon that doesn't have to be subject to the laws of physics, many things become possible. But in my opinion, the dreaming about such divinities doesn't belong to natural science. And because Earman's and Norton's argumentation is so analogous to the claims "science will never be able to prove the non-existence of supernatural beings and scientists are just bigots who have to assume the non-existence of supernatural beings and adjust their rules to defend their belief", I think that this argumentation isn't scientific, either.

Note that I have used this thermodynamics-economics analogy many times in this text. I actually believe that thermodynamics is the discipline of physics that is most analogous to economics. Some left-wing types who hate the free markets love to imagine that economics isn't a science in any sense. Well, it is a science and many of its statistical assertions, especially various inequalities, are closely analogous to various inequalities one may establish in thermodynamics.
