Maxwell’s Demon 2013

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (30 May 2013) | Viewed by 60533

Special Issue Editor


Dr. Owen Maroney
Guest Editor
Faculty of Philosophy, University of Oxford, UK
Interests: quantum nanoscience; quantum theory; statistical mechanics

Special Issue Information

Dear Colleagues,

Since the earliest days of statistical mechanics, the existence of thermal fluctuations has posed a threat to our understanding of thermodynamics. This threat was vividly captured by Maxwell, who envisaged a nimble and light-fingered being, able to systematically exploit and accumulate these fluctuations. With the latest developments in quantum nanotechnology, the manipulation of individual systems has become a realistic possibility, while a modern consensus seems to have emerged that the being must still fail due to the properties of information processing. Assessing the strength of these claims requires addressing many of the key open questions in the foundations of statistical mechanics.

Dr. Owen Maroney
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Maxwell's Demon
  • Landauer's Principle
  • thermal fluctuations
  • physics of information
  • quantum information
  • thermodynamic irreversibility

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (8 papers)

Research

Article
Going Round in Circles: Landauer vs. Norton on the Thermodynamics of Computation
by James Ladyman and Katie Robertson
Entropy 2014, 16(4), 2278-2290; https://doi.org/10.3390/e16042278 - 22 Apr 2014
Cited by 11 | Viewed by 5934
Abstract
There seems to be a consensus among physicists that there is a connection between information processing and thermodynamics. In particular, Landauer’s Principle (LP) is widely assumed as part of the foundation of information theoretic/computational reasoning in diverse areas of physics including cosmology. It is also often appealed to in discussions about Maxwell’s demon and the status of the Second Law of Thermodynamics. However, LP has been challenged. In his 2005 paper, Norton argued that LP has not been proved. LPSG offered a new proof of LP. Norton argued that the LPSG proof is unsound and Ladyman and Robertson defended it. However, Norton’s latest work also generalizes his critique to argue for a no-go result that he purports to be the end of the thermodynamics of computation. Here we review the dialectic as it currently stands and consider Norton’s no-go result.
Article
Thermodynamics as Control Theory
by David Wallace
Entropy 2014, 16(2), 699-725; https://doi.org/10.3390/e16020699 - 24 Jan 2014
Cited by 21 | Viewed by 7364
Abstract
I explore the reduction of thermodynamics to statistical mechanics by treating the former as a control theory: a theory of which transitions between states can be induced on a system (assumed to obey some known underlying dynamics) by means of operations from a fixed list. I recover the results of standard thermodynamics in this framework on the assumption that the available operations do not include measurements which affect subsequent choices of operations. I then relax this assumption and use the framework to consider the vexed questions of Maxwell’s demon and Landauer’s principle. Throughout, I assume rather than prove the basic irreversibility features of statistical mechanics, taking care to distinguish them from the conceptually distinct assumptions of thermodynamics proper.
Communication
Non-Equilibrium Statistical Mechanics Inspired by Modern Information Theory
by Oscar C. O. Dahlsten
Entropy 2013, 15(12), 5346-5361; https://doi.org/10.3390/e15125346 - 3 Dec 2013
Cited by 15 | Viewed by 6920
Abstract
A collection of recent papers revisits how to quantify the relationship between information and work in the light of modern information theory, so-called single-shot information theory. This is an introduction to those papers, from the perspective of the author. Many of the results may be viewed as a quantification of how much work a generalized Maxwell’s daemon can extract as a function of its extra information. These expressions do not in general involve the Shannon/von Neumann entropy but rather quantities from single-shot information theory. In a limit of large systems composed of many identical and independent parts the Shannon/von Neumann entropy is recovered.
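
For readers unfamiliar with the single-shot quantities mentioned in this abstract, the short Python sketch below is an illustration of my own (not code from the paper): it compares the Shannon entropy with the min- and max-entropies of a simple classical distribution. The three coincide for a uniform distribution but differ in general, which is why single-shot work expressions need not reduce to the Shannon/von Neumann entropy. The example distributions are arbitrary.

    # Illustrative only: compare Shannon, min- and max-entropy (in bits) of a
    # classical distribution. These are the kinds of single-shot quantities
    # referred to above; all three coincide for uniform distributions.
    import math

    def shannon_entropy(p):
        return -sum(x * math.log2(x) for x in p if x > 0)

    def min_entropy(p):
        # determined by the single most likely outcome
        return -math.log2(max(p))

    def max_entropy(p):
        # Renyi-0 entropy: log of the number of outcomes with nonzero probability
        return math.log2(sum(1 for x in p if x > 0))

    for name, p in [("biased", [0.9, 0.05, 0.05]), ("uniform", [1/3, 1/3, 1/3])]:
        print(name, shannon_entropy(p), min_entropy(p), max_entropy(p))
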
Article
Beyond Landauer Erasure
by Stephen M. Barnett and Joan A. Vaccaro
Entropy 2013, 15(11), 4956-4968; https://doi.org/10.3390/e15114956 - 13 Nov 2013
Cited by 28 | Viewed by 9413
Abstract
In thermodynamics, one considers thermal systems and the maximization of entropy subject to the conservation of energy. A consequence is Landauer’s erasure principle, which states that the erasure of one bit of information requires a minimum energy cost equal to kT ln(2), where T is the temperature of a thermal reservoir used in the process and k is Boltzmann’s constant. Jaynes, however, argued that the maximum entropy principle could be applied to any number of conserved quantities, which would suggest that information erasure may have alternative costs. Indeed, we showed recently that by using a reservoir comprising energy degenerate spins and subject to conservation of angular momentum, the cost of information erasure is in terms of angular momentum rather than energy. Here, we extend this analysis and derive the minimum cost of information erasure for systems where different conservation laws operate. We find that, for each conserved quantity, the minimum resource needed to erase one bit of memory is λ⁻¹ ln(2), where λ is related to the average value of the conserved quantity. The costs of erasure depend, fundamentally, on both the nature of the physical memory element and the reservoir with which it is coupled.
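
As a point of reference for the conventional case discussed above, the Landauer bound kT ln(2) is easy to evaluate numerically. The Python snippet below is an illustration of my own, not material from the paper; the room-temperature value of T is an assumption.

    # Illustrative only: evaluate the conventional Landauer bound k*T*ln(2)
    # for erasing one bit, assuming a thermal reservoir at T = 300 K.
    import math

    k_B = 1.380649e-23   # Boltzmann's constant in J/K (exact SI value)
    T = 300.0            # assumed reservoir temperature in kelvin

    E_min = k_B * T * math.log(2)
    print(f"Minimum erasure cost at {T:.0f} K: {E_min:.3e} J per bit")
    # roughly 2.9e-21 J, or about 0.018 eV, per erased bit

The same one-line calculation applies to the generalized cost λ⁻¹ ln(2) quoted in the abstract, with kT replaced by the appropriate λ⁻¹ for the conserved quantity in question.
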
Article
All Shook Up: Fluctuations, Maxwell’s Demon and the Thermodynamics of Computation
by John D. Norton
Entropy 2013, 15(10), 4432-4483; https://doi.org/10.3390/e15104432 - 17 Oct 2013
Cited by 36 | Viewed by 11689
Abstract
The most successful exorcism of Maxwell’s demon is Smoluchowski’s 1912 observation that thermal fluctuations would likely disrupt the operation of any molecular-scale demonic machine. A later tradition sought to exorcise Maxwell’s demon by assessing the entropic cost of the demon’s processing of information. This later tradition fails since these same thermal fluctuations invalidate the molecular-scale manipulations upon which the thermodynamics of computation is based. A new argument concerning conservation of phase space volume shows that all Maxwell’s demons must fail.
Article
Conditioning, Correlation and Entropy Generation in Maxwell’s Demon
by Neal G. Anderson
Entropy 2013, 15(10), 4243-4265; https://doi.org/10.3390/e15104243 - 9 Oct 2013
Cited by 2 | Viewed by 5810
Abstract
Maxwell’s Demon conspires to use information about the state of a confined molecule in a Szilard engine (randomly frozen into a state subspace by his own actions) to derive work from a single-temperature heat bath. It is widely accepted that, if the Demon can achieve this at all, he can do so without violating the Second Law only because of a counterbalancing price that must be paid to erase information when the Demon’s memory is reset at the end of his operating cycle. In this paper, Maxwell’s Demon is analyzed within a “referential” approach to physical information that defines and quantifies the Demon’s information via correlations between the joint physical state of the confined molecule and that of the Demon’s memory. On this view, which received early emphasis in Fahn’s 1996 classical analysis of Maxwell’s Demon, information is erased not during the memory reset step of the Demon’s cycle, but rather during the expansion step, when these correlations are destroyed. Dissipation and work extraction are analyzed here for a Demon that operates a generalized quantum mechanical Szilard engine embedded in a globally closed composite, which also includes a work reservoir, a heat bath and the remainder of the Demon’s environment. Memory-engine correlations lost during the expansion step, which enable extraction of work from the Demon via operations conditioned on the memory contents, are shown to be dissipative when this decorrelation is achieved unconditionally, so that no work can be extracted. Fahn’s essential conclusions are upheld in generalized form, and his quantitative results supported via appropriate specialization to the Demon of his classical analysis, all without external appeal to classical thermodynamics, the Second Law, phase space conservation arguments or Landauer’s Principle.
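
To give a concrete feel for the memory–engine correlations the abstract refers to, the toy Python calculation below is my own illustration (not drawn from the paper): it computes the classical mutual information between a one-bit memory and the side of the partition occupied by the molecule, which is one bit when the two are perfectly correlated and zero when they are uncorrelated.

    # Toy illustration: classical mutual information I(M;S) between a one-bit
    # memory M and the side S of the partition occupied by the molecule.
    # Perfect correlation gives 1 bit; an uncorrelated joint state gives 0 bits.
    import math

    def mutual_information(joint):
        """joint[m][s] = P(M=m, S=s); returns I(M;S) in bits."""
        p_m = [sum(row) for row in joint]
        p_s = [sum(col) for col in zip(*joint)]
        total = 0.0
        for m, row in enumerate(joint):
            for s, p in enumerate(row):
                if p > 0:
                    total += p * math.log2(p / (p_m[m] * p_s[s]))
        return total

    print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # perfectly correlated -> 1.0
    print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # uncorrelated -> 0.0
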
Article
Entropy and Computation: The Landauer-Bennett Thesis Reexamined
by Meir Hemmo and Orly Shenker
Entropy 2013, 15(8), 3297-3311; https://doi.org/10.3390/e15083297 - 21 Aug 2013
Cited by 13 | Viewed by 6541
Abstract
The so-called Landauer-Bennett thesis says that logically irreversible operations (physically implemented) such as erasure necessarily involve dissipation by at least k ln 2 per bit of lost information. We identify the physical conditions that are necessary and sufficient for erasure and show that the thesis does not follow from the principles of classical mechanics. In particular, we show that even if one assumes that information processing is constrained by the laws of classical mechanics, it need not be constrained by the Second Law of thermodynamics.
Article
The Demon in a Vacuum Tube
by Germano D'Abramo
Entropy 2013, 15(5), 1916-1928; https://doi.org/10.3390/e15051916 - 21 May 2013
Cited by 2 | Viewed by 5958
Abstract
In the present paper, several issues concerning the second law of thermodynamics, Maxwell’s demon and Landauer’s principle are dealt with. I argue that if the demon and the system on which it operates without dissipation of external energy are made of atoms and molecules (gas, liquid or solid) in thermal equilibrium (whose behaviour is described by a canonical distribution), then the unavoidable reason why the demon cannot successfully operate resides in the ubiquity of thermal fluctuations and friction. Landauer’s principle appears to be unnecessary. I also suggest that if the behaviour of the demon and the system on which it acts is not always describable by a canonical distribution, as would happen for instance with the ballistic motion of electrons at early stages of thermionic emission, then a successful working demon cannot be ruled out a priori. A critical review of two recent experiments on thermionic emission Maxwell’s demons is also given.