DRAFT

Revision: April 22, April 21, 2004; March 29, 22, 2004



Eaters of the Lotus: Landauer’s Principle

and the Return of Maxwell’s Demon


John D. Norton1

Department of History and Philosophy of Science

University of Pittsburgh

Pittsburgh PA 15260



jdnorton@pitt.edu
Landauer’s principle is the loosely formulated notion that the erasure of n bits of information must always incur a cost of nk ln 2 in thermodynamic entropy. It can be formulated as a precise result in statistical mechanics, but only for erasure processes that use a thermodynamically irreversible phase space expansion, which is the real origin of the law’s entropy cost. General arguments that purport to establish the unconditional validity of the law (erasure maps many physical states to one; erasure compresses the phase space) fail. They turn out to depend on the illicit formation of a canonical ensemble from memory devices holding random data. To exorcise Maxwell’s demon, one must show that all candidate devices, the ordinary and the extraordinary, must fail to reverse the second law of thermodynamics. The theorizing surrounding Landauer’s principle is too fragile and too tied to a few specific examples to support so general an exorcism. Charles Bennett has recently extended Landauer’s principle in order to exorcise a no-erasure demon proposed by John Earman and me. The extension fails for the same reasons that trouble the original principle.

1. Introduction


A sizeable literature is based on the claim that Maxwell’s demon must fail to produce violations of the second law of thermodynamics because of an inevitable entropy cost associated with certain types of information processing. In the second edition of their standard compilation of work on Maxwell’s demon, Leff and Rex (2003, p. xii) note that more references have been generated in the 13 years since the volume’s first edition than in all years prior to it, extending back over the demon’s 120 years of life. A casual review of the literature gives the impression that the demonstrations of the failure of Maxwell’s demon depend on the discovery of independent principles concerning the entropy cost of information processing. It looks like a nice example of new discoveries explaining old anomalies. Yet closer inspection suggests that something is seriously amiss. There seems to be no independent basis for the new principles. In typical analyses, it is assumed at the outset that the total system has canonical thermal properties so that the second law will be preserved; and the analysis then infers back from that assumption to the entropy costs that it assumes must arise in information processing. In our Earman and Norton (1998/99), my colleague John Earman and I encapsulated this concern in a dilemma posed for all proponents of information theoretic exorcisms of Maxwell’s demon. Either the combined object system and demon are assumed to form a canonical thermal system or they are not. If not (“profound” horn), then we ask proponents of information theoretic exorcisms to supply the new physical principle needed to assure failure of the demon and give independent grounds for it. Otherwise (“sound” horn), it is clear that the demon will fail; but it will fail only because its failure has been assumed at the outset. Then the exorcism merely argues to a foregone conclusion.

Charles Bennett has been one of the most influential proponents of information theoretic exorcisms of Maxwell’s demon. The version he supports now seems to be standard. It urges that a Maxwell demon must at some point in its operation erase information. It then invokes Landauer’s principle, which attributes an entropy cost of at least nk ln 2 to the erasure of n bits of information in a memory device, to supply the missing entropy needed to save the second law. (k is Boltzmann’s constant.) We are grateful for Bennett’s (2003, pp. 501, 508-10) candor in responding directly to our dilemma and accepting its sound horn.2 He acknowledges that his use of Landauer’s principle is “in a sense…indeed a straightforward consequence or restatement of the Second Law, [but] it still has considerable pedagogic and explanatory power…” While some hidden entropy cost can be inferred from the presumed correctness of the second law, its location remains open. The power of Landauer’s principle, Bennett asserts, resides in locating this cost properly in information erasure, and so in correcting an earlier literature that mislocated it in information acquisition.
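
For reference, the bookkeeping connecting the per-bit and n-bit statements of the principle is simple arithmetic: if each erased bit incurs at least k ln 2 in thermodynamic entropy, then erasing n bits incurs at least

\[
\Delta S_{\mathrm{erasure}} \;\geq\; n\,k \ln 2 \;=\; k \ln 2^n ,
\]

where 2^n is the number of distinct data states an n-bit memory can hold. The cost grows linearly in the number of bits n, not logarithmically, which is why the figure k ln n sometimes seen in loose statements of the principle cannot be right for n bits.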

My concern in this paper is to look more closely at Landauer’s principle and how it is used to exorcise Maxwell’s demon. My conclusion will be that this literature overreaches. Its basic principles are vaguely formulated; and its justifications are rudimentary and sometimes dependent on misapplications of statistical mechanics. It is a foundation too weak and fragile to support a result as general as the impossibility of Maxwell’s demon. That is, I will seek to establish the following:

• The loose notion that erasing a bit of information increases the thermodynamic entropy of the environment by at least k ln 2 can be made precise as a definite result in statistical mechanics. The result depends essentially, however, on the use of a particular erasure procedure, in which there is a thermodynamically irreversible expansion of the memory device’s phase space. The real origin of the erasure’s entropy cost lies in the thermodynamic entropy created by this irreversible step. (A worked one-bit illustration follows this list.)

• The literature on Landauer’s principle contains an enduring misapplication of statistical mechanics. A collection of memory devices recording different data is illicitly assembled and treated in various ways as if it were a canonical ensemble. The outcome is that a collection of memory devices holding random data is mistakenly said to have greater entropy and to occupy more phase space than the same memory devices all recording the same default data.

• The argument given in favor of the unconditional applicability of Landauer’s principle is that erasure maps many physical states onto one and that this mapping is a compression of the memory device phase space. The argument fails. It depends on the incorrect assumption that memory devices holding random data occupy a greater volume in phase space and have greater entropy than when the devices have been reset to default data. This incorrect assumption in turn depends upon the illicit formation of canonical ensembles just mentioned.

• A compression of the phase space may arise in an erasure process, but only if the compression is preceded by a corresponding expansion. In practical erasure processes, this expansion is thermodynamically irreversible and the real origin of the erasure’s entropy cost. The literature on Landauer’s principle has yet to demonstrate that this expansion must be thermodynamically irreversible in all admissible erasure processes.

• The challenge of exorcising Maxwell’s demon is to show that no device, no matter how extraordinary or how ingeniously or intricately contrived, can find a way of accumulating fluctuation phenomena into a macroscopic violation of the second law of thermodynamics. The existing analyses of Landauer’s principle are too weak to support such a strong result. The claims to the contrary depend on displaying a few suggestive examples in which the demon fails and expecting that every other possible attempt at a Maxwell demon, no matter how extraordinary, must fare likewise. I argue that there is no foundation for this expectation by looking at many ways in which extraordinary Maxwell’s demons might differ from the ordinary examples.

• John Earman and I (1998/99, II pp. 16-17) have described how a Maxwell’s demon may be programmed to operate without erasure. In response, Charles Bennett (2003) has devised an extended version of Landauer’s principle that also attributes a thermodynamic entropy cost to the merging of computational paths, in an effort to block this no-erasure demon. The extended version fails, again because it depends upon the illicit formation of canonical ensembles.
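
Here is the one-bit illustration promised in the first bullet point above. It is a minimal sketch, assuming the one-molecule gas model of a memory device that is standard in this literature: a single molecule confined by a partition to the left (“L”) or right (“R”) half of a chamber, kept at temperature T, encodes the datum. The erasure procedure at issue begins by removing the partition, so that the molecule ranges over the whole chamber. That removal is the thermodynamically irreversible expansion: whichever half the molecule started in, its accessible volume doubles, creating thermodynamic entropy

\[
\Delta S \;=\; k \ln \frac{V_{\mathrm{final}}}{V_{\mathrm{initial}}} \;=\; k \ln 2 .
\]

The subsequent isothermal compression of the molecule into the default half can be conducted reversibly; it merely passes the entropy k ln 2 already created to the surroundings as heat kT ln 2. On this accounting, the entropy cost of the erasure is incurred at the expansion step, not the compression.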

In the sections to follow, the precise but restricted version of Landauer’s principle is developed and stated in Section 2, along with some thermodynamic and statistical mechanical preliminaries, introduced for later reference. Section 3 identifies how canonical ensembles are illicitly assembled in the Landauer’s principle literature and shows how this illicit assembly leads to the failure of the many-to-one mapping argument. Section 4 reviews the challenge presented by Maxwell’s demon and argues that the present literature on Landauer’s principle is too fragile to support its exorcism. Section 5 reviews Bennett’s extension of Landauer’s principle and argues that it fails to exorcise the no-erasure demon that Earman and I proposed.




