Table of Contents

Entropy, Volume 11, Issue 4 (December 2009), Pages 529-1147

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
Displaying articles 1-38

Research

Open Access Article The Quantum Noise of Ferromagnetic π-Bloch Domain Walls
Entropy 2009, 11(4), 548-559; doi:10.3390/e11040548
Received: 27 August 2009 / Accepted: 23 September 2009 / Published: 28 September 2009
Cited by 1 | PDF Full-text (219 KB) | HTML Full-text | XML Full-text
Abstract
We quantify the probability per unit Euclidean time of reversing the magnetization of a π-Bloch vector, which describes the ferromagnetic domain walls of a ferromagnetic nanowire at finite temperatures. Our approach, based on Langer’s theory, treats the double sine-Gordon model that defines the π-Bloch vectors via a procedure of nonperturbative renormalization, and uses importance-sampling methods to minimise the free energy of the system and identify the saddle-point solution corresponding to the reversal probability. Whilst the general solution for the free-energy minima cannot be expressed in closed form, we obtain a closed expression for the saddle point by maximizing the entanglement entropy of the system as a polynomial ring. We use this approach to quantify the geometric and non-geometric contributions to the entanglement entropy of the ferromagnetic nanowire, defined between entangled ferromagnetic domain walls, and evaluate the Euclidean-time dependence of the domain-wall width and of the angular momentum transfer at the domain walls, which has recently been proposed as a mechanism for quantum memory storage. Full article
Open Access Article An Entropy-Like Estimator for Robust Parameter Identification
Entropy 2009, 11(4), 560-585; doi:10.3390/e11040560
Received: 7 September 2009 / Accepted: 23 September 2009 / Published: 12 October 2009
Cited by 12 | PDF Full-text (1393 KB) | HTML Full-text | XML Full-text
Abstract
This paper describes the basic ideas behind a novel prediction error parameter identification algorithm exhibiting high robustness with respect to outlying data. Given the low sensitivity to outliers, these can be more easily identified by analysing the residuals of the fit. The devised cost function is inspired by the definition of entropy, although the method in itself does not exploit the stochastic meaning of entropy in its usual sense. After describing the most common alternative approaches for robust identification, the novel method is presented together with numerical examples for validation. Full article
Open Access Article Landauer’s Principle and Divergenceless Dynamical Systems
Entropy 2009, 11(4), 586-597; doi:10.3390/e11040586
Received: 17 August 2009 / Accepted: 15 September 2009 / Published: 13 October 2009
Cited by 4 | PDF Full-text (175 KB) | HTML Full-text | XML Full-text
Abstract
Landauer’s principle is one of the pillars of the physics of information. It constitutes one of the foundations behind the idea that “information is physical”. Landauer’s principle establishes the smallest amount of energy that has to be dissipated when one bit of information is erased from a computing device. Here we explore an extended Landauer-like principle valid for general dynamical systems (not necessarily Hamiltonian) governed by divergenceless phase-space flows. Full article
(This article belongs to the Special Issue Information and Entropy)
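As a concrete illustration of the bound this abstract refers to, the minimum dissipation per erased bit follows from the standard formula E = k_B·T·ln 2 (a minimal sketch using the SI value of Boltzmann’s constant; the numbers are textbook values, not material from the paper):

```python
import math

def landauer_bound(temperature_kelvin: float) -> float:
    """Minimum energy in joules dissipated when one bit is erased,
    per Landauer's principle: E = k_B * T * ln 2."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
    return k_B * temperature_kelvin * math.log(2)

# At room temperature (300 K) the bound is about 2.87e-21 J per bit.
print(landauer_bound(300.0))
```

Any erasure mechanism must dissipate at least this much per bit; the paper's contribution is extending the argument beyond Hamiltonian systems to general divergenceless phase-space flows.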
Open Access Article An Application of Entropy in Survey Scale
Entropy 2009, 11(4), 598-605; doi:10.3390/e11040598
Received: 7 July 2009 / Accepted: 11 October 2009 / Published: 14 October 2009
Cited by 1 | PDF Full-text (165 KB) | HTML Full-text | XML Full-text
Abstract
This study demonstrates an application of information-theoretic entropy to survey scales. Using a computer anxiety scale, we show that the desired information can be obtained with fewer questions. In particular, one question is insufficient and two questions are necessary for a survey subscale. Full article
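The underlying idea, measuring how much information each survey question contributes, can be sketched with Shannon entropy over response frequencies (an illustrative example, not the authors' exact procedure; the response data are invented):

```python
import math
from collections import Counter

def shannon_entropy(responses):
    """Shannon entropy in bits of a list of categorical survey responses."""
    n = len(responses)
    counts = Counter(responses)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A question whose answers spread evenly over a 5-point scale carries
# log2(5) ≈ 2.32 bits; one answered almost identically carries far less.
print(shannon_entropy([1, 2, 3, 4, 5] * 4))
print(shannon_entropy([5] * 19 + [4]))
```

Questions with near-zero entropy contribute little information, which is the sense in which a subscale can make do with fewer questions.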
Open Access Article Economies Evolve by Energy Dispersal
Entropy 2009, 11(4), 606-633; doi:10.3390/e11040606
Received: 17 September 2009 / Accepted: 14 October 2009 / Published: 21 October 2009
Cited by 35 | PDF Full-text (626 KB) | HTML Full-text | XML Full-text | Correction | Supplementary Files
Abstract
Economic activity can be regarded as an evolutionary process governed by the 2nd law of thermodynamics. The universal law, when formulated locally as an equation of motion, reveals that a growing economy develops functional machinery and organizes hierarchically in such a way as to tend to equalize energy-density differences within the economy and with respect to the surroundings it is open to. Diverse economic activities result in flows of energy that preferentially channel along the most steeply descending paths, leveling a non-Euclidean free energy landscape. This principle of ‘maximal energy dispersal’, equivalent to the maximal rate of entropy production, gives rise to economic laws and regularities. The law of diminishing returns follows from the diminishing free energy, while the relation between supply and demand displays a quest for a balance among interdependent energy densities. Economic evolution is dissipative motion where the driving forces and energy flows are inseparable from each other. When there are multiple degrees of freedom, economic growth and decline are inherently impossible to forecast in detail. Namely, trajectories of an evolving economy are non-integrable, i.e., unpredictable in detail, because a decision by one player will also affect the future decisions of other players. We propose that decision making is ultimately about choosing, from various actions, those that would most effectively reduce subjectively perceived energy gradients. Full article
Open Access Article A Lower-Bound for the Maximin Redundancy in Pattern Coding
Entropy 2009, 11(4), 634-642; doi:10.3390/e11040634
Received: 1 September 2009 / Accepted: 20 October 2009 / Published: 22 October 2009
Cited by 5 | PDF Full-text (143 KB) | HTML Full-text | XML Full-text
Abstract
We show that the maximin average redundancy in pattern coding is eventually larger than 1.84 (n/log n)^(1/3) for messages of length n. This improves recent results on pattern redundancy, although it does not fill the gap between known lower and upper bounds. The pattern of a string is obtained by replacing each symbol by the index of its first occurrence. The problem of pattern coding is of interest because strongly universal codes have been proved to exist for patterns, while universal message coding is impossible for memoryless sources on an infinite alphabet. The proof uses fine combinatorial results on partitions with small summands. Full article
(This article belongs to the Special Issue Information and Entropy)
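The pattern transform described in the abstract is concrete enough to state in a few lines (a sketch of the standard definition, with 1-based indices):

```python
def pattern(s):
    """Replace each symbol of s by the rank (1-based) of its first
    occurrence, i.e., the order in which distinct symbols first appear."""
    first_seen = {}
    ranks = []
    for ch in s:
        if ch not in first_seen:
            first_seen[ch] = len(first_seen) + 1
        ranks.append(first_seen[ch])
    return ranks

print(pattern("abracadabra"))  # [1, 2, 3, 1, 4, 1, 5, 1, 2, 3, 1]
```

Many strings over different alphabets share one pattern, which is why patterns remain codable even when the source alphabet is infinite.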
Open Access Article The Influence of Shape on Parallel Self-Assembly
Entropy 2009, 11(4), 643-666; doi:10.3390/e11040643
Received: 16 September 2009 / Accepted: 20 October 2009 / Published: 23 October 2009
Cited by 8 | PDF Full-text (2634 KB)
Abstract
Self-assembly is a key phenomenon whereby vast numbers of individual components passively interact and form organized structures, as can be seen, for example, in the morphogenesis of a virus. Generally speaking, the process can be viewed as a spatial placement of attractive and repulsive components. In this paper, we report on an investigation of how morphology, i.e., the shape of components, affects a self-assembly process. The experiments were conducted with three differently shaped floating tiles equipped with magnets in an agitated water tank. We propose a novel measure involving clustering coefficients, which quantifies the degree of parallelism of the assembly process. The results showed that the assembly processes were affected by the aggregation sequence in their early stages, where shape induces different behaviors and thus results in variations in aggregation speeds. Full article
Open Access Article Configurational Entropy in Chiral Solutions—Negative Entropy of Solvent Envelopes
Entropy 2009, 11(4), 667-674; doi:10.3390/e11040667
Received: 2 September 2009 / Accepted: 26 October 2009 / Published: 29 October 2009
Cited by 3 | PDF Full-text (423 KB) | HTML Full-text | XML Full-text
Abstract
A homogeneous solution of a chiral substance possesses an overall asymmetry, expressed as a specific rotation of linearly polarized light. Such a solution, despite being at complete equilibrium, stores configurational entropy in the form of negative entropy, which can be nullified by mixing with a solution of the opposite enantiomer. This abundant yet quite specific case of inherent negative entropy resides predominantly in the chiral configuration of the solvent envelopes surrounding the chiral centers. Heat release, amounting to several cal/mol, associated with the annulment of negative entropy in aqueous solutions of D- and L-amino acids, was recently documented by Shinitzky et al. [1]. This heat corresponds almost exclusively to the TΔS stored in the solvent envelope upon adoption of a chiral configuration. Simple fundamental expressions combining configurational entropy and information capacity in chiral solutions have been developed and were found to comply well with the observed heat release upon intermolecular racemization. Full article
(This article belongs to the Special Issue Configurational Entropy)
Open Access Article The Maximum Entropy Rate Description of a Thermodynamic System in a Stationary Non-Equilibrium State
Entropy 2009, 11(4), 675-687; doi:10.3390/e11040675
Received: 14 September 2009 / Accepted: 27 October 2009 / Published: 29 October 2009
Cited by 3 | PDF Full-text (164 KB) | HTML Full-text | XML Full-text
Abstract
In this paper we present a simple model to describe a rather general system in a stationary non-equilibrium state, which is an open system traversed by a stationary flux. The probabilistic description is provided by a non-homogeneous Markov chain, which is not assumed on the basis of a model of the microscopic interactions but rather derived from the knowledge of the macroscopic fluxes traversing the system through a maximum entropy rate principle. Full article
(This article belongs to the Special Issue Maximum Entropy)
Open Access Article A Law of Word Meaning in Dolphin Whistle Types
Entropy 2009, 11(4), 688-701; doi:10.3390/e11040688
Received: 30 June 2009 / Accepted: 26 October 2009 / Published: 30 October 2009
Cited by 11 | PDF Full-text (233 KB) | HTML Full-text | XML Full-text
Abstract
We show that dolphin whistle types tend to be used in specific behavioral contexts, which is consistent with the hypothesis that dolphin whistles have some sort of “meaning”. Moreover, in some cases, it can be shown that the behavioral context in which a whistle tends to occur or not occur is shared by different individuals, which is consistent with the hypothesis that dolphins communicate through whistles. Furthermore, we show that the number of behavioral contexts significantly associated with a whistle type tends to grow with the frequency of that whistle type, a pattern reminiscent of the law of word meanings, which states, as a tendency, that the higher the frequency of a word, the higher its number of meanings. Our findings indicate that the presence of Zipf's law in dolphin whistle types cannot be explained in sufficient detail by a simplistic die-rolling experiment. Full article
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
Open Access Article Determination of the Real Loss of Power for a Condensing and a Backpressure Turbine by Means of Second Law Analysis
Entropy 2009, 11(4), 702-712; doi:10.3390/e11040702
Received: 19 August 2009 / Accepted: 27 October 2009 / Published: 30 October 2009
Cited by 11 | PDF Full-text (216 KB) | HTML Full-text | XML Full-text
Abstract
All real processes generate entropy and the power/exergy loss is usually determined by means of the Gouy-Stodola law. If the system only exchanges heat at the environmental temperature, the Gouy-Stodola law gives the correct loss of power. However, most industrial processes exchange heat at higher or lower temperatures than the actual environmental temperature. When calculating the real loss of power in these cases, the Gouy-Stodola law does not give the correct loss if the actual environmental temperature is used. The first aim of this paper is to show through simple steam turbine examples that the previous statement is true. The second aim of the paper is to define the effective temperature to calculate the real power loss of the system with the Gouy-Stodola law, and to apply it to turbine examples. Example calculations also show that the correct power loss can be defined if the effective temperature is used instead of the real environmental temperature. Full article
(This article belongs to the Special Issue Exergy: Analysis and Applications)
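The Gouy-Stodola law discussed above is the simple product of a reference temperature and the entropy generation rate; the paper's point is that the reference must be an effective temperature, not the ambient one, when heat is exchanged away from ambient conditions. A minimal sketch with illustrative numbers (not taken from the paper):

```python
def lost_power(reference_temp_K: float, entropy_gen_rate_W_per_K: float) -> float:
    """Gouy-Stodola law: lost power = T_ref * S_gen_rate."""
    return reference_temp_K * entropy_gen_rate_W_per_K

# A process generating entropy at 0.5 W/K, evaluated at a 293 K ambient,
# loses 146.5 W; evaluated at a 400 K effective temperature, 200 W.
print(lost_power(293.0, 0.5))
print(lost_power(400.0, 0.5))
```

The gap between the two figures is exactly the error the paper attributes to using the actual environmental temperature where an effective temperature is required.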
Open Access Article Transport of Heat and Charge in Electromagnetic Metrology Based on Nonequilibrium Statistical Mechanics
Entropy 2009, 11(4), 748-765; doi:10.3390/e11040748
Received: 11 September 2009 / Accepted: 26 October 2009 / Published: 3 November 2009
PDF Full-text (194 KB) | HTML Full-text | XML Full-text
Abstract
Current research is probing transport on ever smaller scales. Modeling of the electromagnetic interaction with nanoparticles or small collections of dipoles, and its associated energy transport and nonequilibrium characteristics, requires a detailed understanding of transport properties. The goal of this paper is to use a nonequilibrium statistical-mechanical method to obtain exact time-correlation functions, fluctuation-dissipation (FD) theorems, heat and charge transport, and associated transport expressions under electromagnetic driving. We extend the time-symmetric Robertson statistical-mechanical theory to study the exact time evolution of relevant variables and the entropy rate in the electromagnetic interaction with materials. In this exact statistical-mechanical theory, a generalized canonical density is used to define an entropy in terms of a set of relevant variables and associated Lagrange multipliers. The entropy production rate is then defined through the relevant variables. The influence of the nonrelevant variables enters the equations through the projection-like operator and thereby influences the entropy. We present applications to the response functions for the electrical and thermal conductivity, specific heat, generalized temperature, Boltzmann’s constant, and noise. The analysis can be performed either classically or quantum-mechanically, and only a few modifications are needed to transfer between the approaches. As an application we study the energy, generalized-temperature, and charge transport equations that are valid in nonequilibrium, and relate them to heat-flow and temperature relations in equilibrium states. Full article
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
Open Access Article What is Fair Pay for Executives? An Information Theoretic Analysis of Wage Distributions
Entropy 2009, 11(4), 766-781; doi:10.3390/e11040766
Received: 31 August 2009 / Accepted: 26 October 2009 / Published: 3 November 2009
Cited by 5 | PDF Full-text (222 KB) | HTML Full-text | XML Full-text
Abstract
The high pay packages of U.S. CEOs have raised serious concerns about what would constitute fair pay. Since present economic models do not adequately address this fundamental question, we propose a new theory based on statistical mechanics and information theory. We use the principle of maximum entropy to show that the maximally fair pay distribution is lognormal under ideal conditions. This prediction is in agreement with observed data for the bottom 90%–95% of the working population. The theory estimates that the top 35 U.S. CEOs were overpaid by about 129 times their ideal salaries in 2008. We also provide an insight into entropy as a measure of fairness in an economic system, a measure that is maximized at equilibrium. Full article
(This article belongs to the Special Issue Maximum Entropy)
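The maximum-entropy argument in the abstract implies that when the constrained quantities are the mean and variance of log-salary, the fairest (maximum-entropy) salary distribution is lognormal. A small self-check of that correspondence on synthetic data (the parameter values 11.0 and 0.5 are arbitrary illustrations, not the paper's estimates):

```python
import math
import random

random.seed(0)

# Draw salaries whose logarithms are normal with mean 11.0 and s.d. 0.5;
# the resulting salary distribution is lognormal by construction.
salaries = [math.exp(random.gauss(11.0, 0.5)) for _ in range(100_000)]

# Recovering the log-space mean and variance confirms the correspondence.
logs = [math.log(s) for s in salaries]
mean = sum(logs) / len(logs)
var = sum((x - mean) ** 2 for x in logs) / len(logs)
print(round(mean, 2), round(var, 2))  # close to 11.0 and 0.25
```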
Open Access Article The Languages of Neurons: An Analysis of Coding Mechanisms by Which Neurons Communicate, Learn and Store Information
Entropy 2009, 11(4), 782-797; doi:10.3390/e11040782
Received: 17 August 2009 / Accepted: 30 October 2009 / Published: 4 November 2009
Cited by 6 | PDF Full-text (241 KB) | HTML Full-text | XML Full-text
Abstract
In this paper evidence is provided that individual neurons possess language, and that the basic unit for communication consists of two neurons and their entire field of interacting dendritic and synaptic connections. While information processing in the brain is highly complex, each neuron uses a simple mechanism for transmitting information. This takes the form of temporal electrophysiological action potentials, or spikes (S), operating on a millisecond timescale; these, along with the pauses (P) between spikes, constitute a two-letter “alphabet” that generates meaningful frequency-encoded signals, or neuronal S/P “words”, in a primary language. However, when a word from an afferent neuron enters the dendritic-synaptic-dendritic field between two neurons, it is translated into a new frequency-encoded word with the same meaning, but in a different spike-pause language, that is delivered to and understood by the efferent neuron. It is suggested that this unidirectional inter-neuronal language-based word translation step is of utmost importance to brain function in that it allows for variations in meaning to occur. Thus, structural or biochemical changes in dendrites or synapses can produce novel words in the second language that have changed meanings, allowing a specific signaling experience, either external or internal, to modify the meaning of an original word (learning) and store the learned information of that experience (memory) in the form of an altered dendritic-synaptic-dendritic field. Full article
Open Access Article Exergy as a Useful Variable for Quickly Assessing the Theoretical Maximum Power of Salinity Gradient Energy Systems
Entropy 2009, 11(4), 798-806; doi:10.3390/e11040798
Received: 14 September 2009 / Accepted: 3 November 2009 / Published: 5 November 2009
Cited by 3 | PDF Full-text (219 KB) | HTML Full-text | XML Full-text
Abstract
It is known that mechanical work, and in turn electricity, can be produced from a difference in chemical potential that may result from a salinity gradient. Such a gradient may be found, for instance, in an estuary where a stream of soft water floods into a sink of salty water, such as an ocean, gulf or salt lake. Various technological approaches have been proposed for the production of energy from a salinity gradient between a stream of soft water and a source of salty water. Before considering the implementation of any one technology, it is of utmost importance to be able to compare the various approaches on the same basis, using the appropriate variables and mathematical formulations. In this context, an exergy balance can become a very useful tool for an easy and quick evaluation of the maximum thermodynamic work that can be produced from energy systems. In this short paper, we briefly introduce the use of exergy to enable an easy and quick assessment of the theoretical maximum power, or ideal reversible work, that we may expect from typical salinity gradient energy systems. Full article
(This article belongs to the Special Issue Exergy: Analysis and Applications)
Open Access Article Statistical Ensemble Theory of Gompertz Growth Model
Entropy 2009, 11(4), 807-819; doi:10.3390/e11040807
Received: 1 August 2009 / Accepted: 2 November 2009 / Published: 5 November 2009
Cited by 3 | PDF Full-text (1282 KB) | HTML Full-text | XML Full-text
Abstract
An ensemble formulation for the Gompertz growth function within the framework of statistical mechanics is presented, where the two growth parameters are assumed to be statistically distributed. The growth can be viewed as a self-referential process, which enables us to use the Bose-Einstein statistics picture. The analytical entropy expression pertaining to the law can be obtained in terms of the growth-velocity distribution, as well as the Gompertz function itself, for the whole process. Full article
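For reference, the Gompertz growth function underlying this ensemble treatment takes the standard double-exponential form (a sketch of the textbook definition; the parameter values are illustrative, not the paper's):

```python
import math

def gompertz(t: float, a: float, b: float, c: float) -> float:
    """Gompertz growth curve N(t) = a * exp(-b * exp(-c * t)),
    where a is the asymptote, b the displacement and c the growth rate."""
    return a * math.exp(-b * math.exp(-c * t))

# The curve starts near zero and saturates at the asymptote a.
print(round(gompertz(0.0, 100.0, 5.0, 1.0), 3))   # 0.674
print(round(gompertz(10.0, 100.0, 5.0, 1.0), 3))  # 99.977
```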
Open Access Article Using Exergy to Understand and Improve the Efficiency of Electrical Power Technologies
Entropy 2009, 11(4), 820-835; doi:10.3390/e11040820
Received: 21 September 2009 / Accepted: 2 November 2009 / Published: 6 November 2009
Cited by 27 | PDF Full-text (294 KB) | HTML Full-text | XML Full-text
Abstract
The benefits of using exergy to understand the efficiencies of electrical power technologies and to assist improvements are demonstrated. Although exergy applications in power systems and electrical technology are uncommon, exergy nevertheless clearly identifies potential reductions in thermodynamic losses and efficiency improvements. Various devices are considered, ranging from simple electrical devices to generation systems for electrical power and for multiple products including electricity, and on to electrically driven technologies. The insights provided by exergy are shown to be more useful than those provided by energy, which are sometimes misleading. Exergy is concluded to have a significant role in assessing and improving the efficiencies of electrical power technologies and systems, and provides a useful tool for engineers and scientists as well as decision and policy makers. Full article
(This article belongs to the Special Issue Exergy: Analysis and Applications)
Open Access Article Maximum Entropy Estimation of Transition Probabilities of Reversible Markov Chains
Entropy 2009, 11(4), 867-887; doi:10.3390/e11040867
Received: 21 September 2009 / Accepted: 10 November 2009 / Published: 17 November 2009
Cited by 5 | PDF Full-text (210 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we develop a general theory for the estimation of the transition probabilities of reversible Markov chains using the maximum entropy principle. A broad range of physical models can be studied within this approach. We use one-dimensional classical spin systems to illustrate the theoretical ideas. The examples studied in this paper are: the Ising model, the Potts model and the Blume-Emery-Griffiths model. Full article
(This article belongs to the Special Issue Maximum Entropy)
Open Access Article On the Structural Non-identifiability of Flexible Branched Polymers
Entropy 2009, 11(4), 907-916; doi:10.3390/e11040907
Received: 21 October 2009 / Accepted: 18 November 2009 / Published: 20 November 2009
Cited by 7 | PDF Full-text (235 KB) | HTML Full-text | XML Full-text
Abstract
The dynamics and statics of flexible polymer chains are based on their conformational entropy, so the properties of isolated polymer chains with any branching can potentially be characterized by Gaussian chain models. According to the graph-theoretical approach, the dynamics and statics of Gaussian chains can be expressed as a set of eigenvalues of their Laplacian matrix. As such, the existence of Laplacian-cospectral trees implies the structural non-identifiability of branched flexible polymers. Full article
(This article belongs to the Special Issue Entropies of Polymers)
Open Access Article A Weighted Generalized Maximum Entropy Estimator with a Data-driven Weight
Entropy 2009, 11(4), 917-930; doi:10.3390/e11040917
Received: 24 September 2009 / Accepted: 16 November 2009 / Published: 26 November 2009
Cited by 4 | PDF Full-text (189 KB) | HTML Full-text | XML Full-text
Abstract
The method of Generalized Maximum Entropy (GME), proposed in Golan, Judge and Miller (1996), is an information-theoretic approach that is robust to the multicollinearity problem. It uses an objective function that is the sum of the entropies of the coefficient distributions and the disturbance distributions. This method can be generalized to the weighted GME (W-GME), where different weights are assigned to the two entropies in the objective function. We propose a data-driven method to select the weights in the entropy objective function, using least-squares cross-validation to derive the optimal weights. Monte Carlo simulations demonstrate that the proposed W-GME estimator is comparable to, and often outperforms, the conventional GME estimator, which places equal weights on the entropies of the coefficient and disturbance distributions. Full article
(This article belongs to the Special Issue Maximum Entropy)
Open Access Article Maximum Entropy Production as an Inference Algorithm that Translates Physical Assumptions into Macroscopic Predictions: Don’t Shoot the Messenger
Entropy 2009, 11(4), 931-944; doi:10.3390/e11040931
Received: 23 October 2009 / Accepted: 23 November 2009 / Published: 27 November 2009
Cited by 42 | PDF Full-text (268 KB) | HTML Full-text | XML Full-text
Abstract
Is Maximum Entropy Production (MEP) a physical principle? In this paper I tentatively suggest it is not, on the basis that MEP is equivalent to Jaynes’ Maximum Entropy (MaxEnt) inference algorithm that passively translates physical assumptions into macroscopic predictions, as applied to non-equilibrium systems. MaxEnt itself has no physical content; disagreement between MaxEnt predictions and experiment falsifies the physical assumptions, not MaxEnt. While it remains to be shown rigorously that MEP is indeed equivalent to MaxEnt for systems arbitrarily far from equilibrium, work in progress tentatively supports this conclusion. In terms of its role within non-equilibrium statistical mechanics, MEP might then be better understood as Messenger of Essential Physics. Full article
(This article belongs to the Special Issue What Is Maximum Entropy Production and How Should We Apply It?)
Open AccessArticle A Story and a Recommendation about the Principle of Maximum Entropy Production
Entropy 2009, 11(4), 945-948; doi:10.3390/e11040945
Received: 23 October 2009 / Accepted: 26 November 2009 / Published: 30 November 2009
Cited by 6 | PDF Full-text (70 KB) | HTML Full-text | XML Full-text
Abstract
The principle of maximum entropy production (MEP) is the subject of considerable academic study, but has yet to become remarkable for its practical applications. A tale is told of an instance in which a spin-off from consideration of an MEP-constrained climate model at least led to re-consideration of the very practical issue of water-vapour feedback in climate change. Further, and on a more-or-less unrelated matter, a recommendation is made for further research on whether there might exist a general "rule" whereby, for certain classes of complex non-linear systems, a state of maximum entropy production is equivalent to a state of minimum entropy. Full article
(This article belongs to the Special Issue What Is Maximum Entropy Production and How Should We Apply It?)
Open AccessArticle Calculation of Entropy for a Sinusoid with Beta-Distributed Phase
Entropy 2009, 11(4), 949-958; doi:10.3390/e11040949
Received: 17 September 2009 / Accepted: 13 November 2009 / Published: 2 December 2009
PDF Full-text (256 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, an analytical expression is developed for the differential entropy of a sinusoid with a Beta-distributed phase angle. This particular signal model is prevalent in optical communications; however, an expression for the associated differential entropy does not currently exist. The expression we derive is approximate, as it relies on a series expansion for one of the key terms needed in the derivation. However, we are able to show that the approximation is accurate (error ≤ 5%) for a wide variety of Beta parameter choices. Full article
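The paper's series expansion is not reproduced here, but the quantity it targets can be illustrated with a generic Monte Carlo histogram estimate (a sketch only; the phase scaling, sample size and bin count below are arbitrary choices, not taken from the paper):

```python
import math
import random

def sinusoid_entropy_mc(a, b, n=200_000, bins=200, seed=0):
    """Histogram estimate (in nats) of the differential entropy of
    X = sin(2*pi*Theta) with Theta ~ Beta(a, b).  A generic Monte Carlo
    cross-check, not the series expansion derived in the paper."""
    rng = random.Random(seed)
    xs = [math.sin(2 * math.pi * rng.betavariate(a, b)) for _ in range(n)]
    lo, width = -1.0, 2.0 / bins
    counts = [0] * bins
    for x in xs:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    # h ~= -sum p_i * ln(p_i / bin_width): plug-in density estimate
    return -sum(c / n * math.log(c / (n * width)) for c in counts if c)

# Beta(1, 1) is a uniform phase: X then has the arcsine density, whose
# exact entropy is ln(pi/2) ~= 0.45; the estimate lands nearby, with
# some bias from the bins at the diverging endpoints.
h = sinusoid_entropy_mc(1.0, 1.0)
```

For non-uniform Beta parameters the same estimator applies unchanged; only the sampling line differs.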
Open AccessArticle Equiprobability, Entropy, Gamma Distributions and Other Geometrical Questions in Multi-Agent Systems
Entropy 2009, 11(4), 959-971; doi:10.3390/e11040959
Received: 3 November 2009 / Accepted: 30 November 2009 / Published: 2 December 2009
Cited by 4 | PDF Full-text (187 KB) | HTML Full-text | XML Full-text
Abstract
A set of many identical interacting agents obeying a global additive constraint is considered. Under the hypothesis of equiprobability in the high-dimensional volume delimited in phase space by the constraint, the statistical behavior of a generic agent over the ensemble is worked out. The asymptotic distribution of that statistical behavior is derived from geometrical arguments. This distribution is related to the Gamma distributions found in several multi-agent economy models, and the parallelism with all these systems is established. As a collateral result, a formula for the volume of high-dimensional symmetrical bodies is proposed. Full article
(This article belongs to the Special Issue Information and Entropy)
Open AccessCommunication Dispersal (Entropy) and Recognition (Information) as Foundations of Emergence and Dissolvence
Entropy 2009, 11(4), 993-1000; doi:10.3390/e11040993
Received: 15 October 2009 / Accepted: 27 November 2009 / Published: 3 December 2009
Cited by 6 | PDF Full-text (178 KB) | HTML Full-text | XML Full-text
Abstract
The objective of this essay is to reflect on a possible relation between entropy and emergence. A qualitative, relational approach is followed. We begin by highlighting that entropy includes the concept of dispersal, relevant to our enquiry. Emergence in complex systems arises from the coordinated behavior of their parts. Coordination in turn necessitates recognition between parts, i.e., information exchange. What will be argued here is that the scope of recognition processes between parts is increased when preceded by their dispersal, which multiplies the number of encounters and creates a richer potential for recognition. A process intrinsic to emergence is dissolvence (aka submergence or top-down constraints), which participates in the information-entropy interplay underlying the creation, evolution and breakdown of higher-level entities. Full article
(This article belongs to the Special Issue Information and Entropy)
Open AccessArticle Best Probability Density Function for Random Sampled Data
Entropy 2009, 11(4), 1001-1024; doi:10.3390/e11041001
Received: 9 October 2009 / Accepted: 2 December 2009 / Published: 4 December 2009
PDF Full-text (1356 KB) | HTML Full-text | XML Full-text
Abstract
The maximum entropy method is a theoretically sound approach to construct an analytical form for the probability density function (pdf) given a sample of random events. In practice, the numerical methods employed to determine the appropriate Lagrange multipliers associated with a set of moments are generally unstable in the presence of noise due to limited sampling. A robust method is presented that always returns the best pdf, where the tradeoff involved in smoothing a highly varying function due to noise can be controlled. An unconventional adaptive simulated annealing technique, called funnel diffusion, determines expansion coefficients for Chebyshev polynomials in the exponential function. Full article
(This article belongs to the Special Issue Maximum Entropy)
Open AccessArticle On the Spectral Entropy of Thermodynamic Paths for Elementary Systems
Entropy 2009, 11(4), 1025-1041; doi:10.3390/e11041025
Received: 13 October 2009 / Accepted: 27 November 2009 / Published: 7 December 2009
PDF Full-text (425 KB) | HTML Full-text | XML Full-text
Abstract
Systems do not elect thermodynamic pathways on their own. They operate in tandem with their surroundings. Pathway selection and traversal require coordinated work and heat exchanges along with parallel tuning of the system variables. Previous research by the author (Reference [1]) focused on the information expressed in thermodynamic pathways. Examined here is how spectral entropy is a by-product of information that depends intricately on the pathway structure. The spectral entropy has proven to be a valuable tool in diverse fields. This paper illustrates the contact between spectral entropy and the properties which distinguish ideal from non-ideal gases. The role of spectral entropy in the first and second laws of thermodynamics and heat → work conversions is also discussed. Full article
(This article belongs to the Special Issue Information and Entropy)
Open AccessArticle Modeling Electric Discharges with Entropy Production Rate Principles
Entropy 2009, 11(4), 1042-1054; doi:10.3390/e11041042
Received: 29 October 2009 / Accepted: 1 December 2009 / Published: 8 December 2009
Cited by 13 | PDF Full-text (210 KB) | HTML Full-text | XML Full-text
Abstract
Under which circumstances are variational principles based on the entropy production rate useful tools for modeling steady states of electric (gas) discharge systems far from equilibrium? It is first shown how various approaches, such as Steenbeck’s minimum voltage principle and Prigogine’s minimum entropy production rate principle, are related to the maximum entropy production rate principle (MEPP). Secondly, three typical examples are discussed, which provide a certain insight into the structure of the models that are candidates for MEPP application. Thirdly, it is argued that MEPP, although not an exact physical law, may provide reasonable model parameter estimates, provided the constraints contain the relevant (nonlinear) physical effects and the parameters to be determined are related to disregarded weak constraints that affect mainly the global entropy production. Finally, it is conjectured that a further reason for the success of MEPP in certain far-from-equilibrium systems might be a hidden linearity of the underlying kinetic equation(s). Full article
(This article belongs to the Special Issue What Is Maximum Entropy Production and How Should We Apply It?)
Open AccessArticle Explaining Change in Language: A Cybersemiotic Perspective
Entropy 2009, 11(4), 1055-1072; doi:10.3390/e11041055
Received: 14 October 2009 / Accepted: 2 December 2009 / Published: 11 December 2009
Cited by 3 | PDF Full-text (484 KB) | HTML Full-text | XML Full-text
Abstract
One of the greatest conundrums in semiotics and linguistics is explaining why change occurs in communication systems. The descriptive apparatus of how change occurs has been developed in great detail since at least the nineteenth century, but a viable explanatory framework of why it occurs in the first place still seems to be clouded in vagueness. So far, only the so-called Principle of Least Effort has come forward to provide a suggestive psychobiological framework for understanding change in communication codes such as language. Extensive work in using this model has shown many fascinating things about language structure and how it evolves. However, the many findings need an integrative framework for shedding light on any generalities implicit in them. This paper argues that a new approach to the study of codes, called cybersemiotics, can be used to great advantage for assessing theoretical frameworks and notions such as the Principle of Least Effort. Amalgamating cybernetic and biosemiotic notions, this new science provides analysts with valuable insights on the raison d’être of phenomena such as linguistic change. Full article
Open AccessArticle Entropy-Based Wavelet De-noising Method for Time Series Analysis
Entropy 2009, 11(4), 1123-1147; doi:10.3390/e11041123
Received: 9 October 2009 / Accepted: 11 December 2009 / Published: 22 December 2009
Cited by 26 | PDF Full-text (657 KB) | HTML Full-text | XML Full-text
Abstract
The existence of noise has a great influence on the real features of observed time series, so noise reduction in time series data is a necessary and significant task in many practical applications. Traditional de-noising methods often cannot meet practical needs due to their inherent shortcomings. In the present paper, a set of key but difficult wavelet de-noising problems is first discussed. Then, by applying information entropy theories to the wavelet de-noising process, i.e., using the principle of maximum entropy (POME) to describe the random character of the noise and wavelet energy entropy to describe the degree of complexity of the main series in the original data, a new entropy-based wavelet de-noising method is proposed. Analysis of both several different synthetic series and typical observed time series data has verified the performance of the new method. A comprehensive discussion of the results indicates that, compared with traditional wavelet de-noising methods, the proposed method is more effective and universal. Furthermore, because information entropy theories are used to describe the clearly different characteristics of the noise and the main series before de-noising is carried out, the analysis process has a more reliable physical basis, and its results are more reasonable and globally optimal. The analysis process is also simple and easy to implement, making the method applicable and useful in the applied sciences and practical engineering. Full article
(This article belongs to the Special Issue Maximum Entropy)
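The paper's entropy-based threshold selection is not reproduced here, but the wavelet de-noising pipeline it builds on can be sketched with a one-level Haar transform and soft thresholding (the signal and threshold value below are invented for illustration, not entropy-derived):

```python
import math

def haar_dwt(x):
    """One-level Haar transform: (approximation, detail) coefficients."""
    a = [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def haar_idwt(a, d):
    """Inverse of haar_dwt."""
    x = []
    for ai, di in zip(a, d):
        x += [(ai + di) / math.sqrt(2), (ai - di) / math.sqrt(2)]
    return x

def denoise(x, thresh):
    """Soft-threshold the detail coefficients; `thresh` is hand-picked
    here, whereas the paper selects it from entropy considerations."""
    a, d = haar_dwt(x)
    d = [math.copysign(max(abs(c) - thresh, 0.0), c) for c in d]
    return haar_idwt(a, d)

# A piecewise-constant signal plus small alternating "noise":
noisy = [1.1, 0.9, 1.1, 0.9, 5.1, 4.9, 5.1, 4.9]
restored = denoise(noisy, thresh=0.2)  # ~= [1, 1, 1, 1, 5, 5, 5, 5]
```

In practice one would use a multi-level transform and a longer wavelet; the Haar case is chosen only because it fits in a few lines.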

Review

Jump to: Research, Other

Open AccessReview Optimal Thermodynamics—New Upperbounds
Entropy 2009, 11(4), 529-547; doi:10.3390/e11040529
Received: 31 August 2009 / Accepted: 23 September 2009 / Published: 28 September 2009
Cited by 41 | PDF Full-text (332 KB) | HTML Full-text | XML Full-text
Abstract
This paper reviews how ideas have evolved in this field from the pioneering work of Carnot right up to the present. The coupling of thermostatics with thermokinetics (heat and mass transfers) and entropy or exergy analysis is illustrated through the study of thermomechanical engines such as the Carnot heat engine and internal combustion engines. The benefits and importance of stagnation temperature and irreversibility parameters are underlined. The main situations of constrained (or unconstrained) optimization are defined, discussed and illustrated. The result of this study is a new branch of thermodynamics: Finite Dimensions Optimal Thermodynamics (FDOT). Full article
(This article belongs to the Special Issue Exergy: Analysis and Applications)
Open AccessReview The Maximum Entropy Formalism and the Prediction of Liquid Spray Drop-Size Distribution
Entropy 2009, 11(4), 713-747; doi:10.3390/e11040713
Received: 27 August 2009 / Accepted: 26 October 2009 / Published: 2 November 2009
Cited by 12 | PDF Full-text (367 KB) | HTML Full-text | XML Full-text
Abstract
The efficiency of any application involving a liquid spray is known to be highly dependent on the spray characteristics and, mainly, on the drop-diameter distribution. There is therefore a crucial need for models allowing the prediction of this distribution. However, atomization processes are only partially understood, and a universal model is not yet available. For almost thirty years, models based on the Maximum Entropy Formalism have been proposed to fulfill this task. This paper presents a review of these models, emphasizing their similarities and differences, and discusses expectations of the use of this formalism to model spray drop-size distributions. Full article
(This article belongs to the Special Issue Maximum Entropy)
Open AccessReview The Use of Ideas of Information Theory for Studying “Language” and Intelligence in Ants
Entropy 2009, 11(4), 836-853; doi:10.3390/e11040836
Received: 21 September 2009 / Accepted: 4 November 2009 / Published: 10 November 2009
Cited by 7 | PDF Full-text (1264 KB) | HTML Full-text | XML Full-text
Abstract
In this review we integrate the results of a long-term experimental study of ant “language” and intelligence which was fully based on fundamental ideas of Information Theory, such as the Shannon entropy, the Kolmogorov complexity, and Shannon’s equation connecting the length of a message (l) and its frequency (p), i.e., l = –log p for rational communication systems. This approach enabled us to obtain the following important results on ants’ communication and intelligence: (i) to reveal “distant homing” in ants, that is, their ability to transfer information about remote events; (ii) to estimate the rate of information transmission; (iii) to reveal that ants are able to grasp regularities and to use them for “compression” of information; (iv) to reveal that ants are able to transfer to each other information about the number of objects; (v) to discover that ants can add and subtract small numbers. The obtained results show that information theory is not only an excellent mathematical theory, but that many of its results may be considered laws of Nature. Full article
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
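Shannon's relation l = –log p quoted in the abstract above can be checked with a toy repertoire (the message names and frequencies here are invented purely for illustration):

```python
import math

# Hypothetical message frequencies (invented for illustration only).
freqs = {"danger": 0.5, "food": 0.25, "left": 0.125, "right": 0.125}

# Shannon's relation for rational communication systems: a message used
# with frequency p gets length l = -log2(p) in an optimal binary code.
lengths = {m: -math.log2(p) for m, p in freqs.items()}

# The expected message length then equals the Shannon entropy of the
# repertoire (exactly so here, since every frequency is a power of two).
avg_len = sum(freqs[m] * lengths[m] for m in freqs)
entropy = -sum(p * math.log2(p) for p in freqs.values())
print(avg_len, entropy)  # 1.75 1.75
```

When frequencies are not powers of two, the optimal integer-length code (Huffman) only approaches this bound, but the entropy remains the lower limit on the average message length.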
Open AccessReview Use of Maximum Entropy Modeling in Wildlife Research
Entropy 2009, 11(4), 854-866; doi:10.3390/e11040854
Received: 4 September 2009 / Accepted: 11 November 2009 / Published: 16 November 2009
Cited by 122 | PDF Full-text (149 KB) | HTML Full-text | XML Full-text
Abstract
Maximum entropy (Maxent) modeling has great potential for identifying distributions and habitat selection of wildlife given its reliance on only presence locations. Recent studies indicate Maxent is relatively insensitive to spatial errors associated with location data, requires few locations to construct useful models, and performs better than other presence-only modeling approaches. Further advances are needed to better define model thresholds, to test model significance, and to address model selection. Additionally, development of modeling approaches is needed when using repeated sampling of known individuals to assess habitat selection. These advancements would strengthen the utility of Maxent for wildlife research and management. Full article
(This article belongs to the Special Issue Maximum Entropy)
Open AccessReview The Variety of Information Transfer in Animal Sonic Communication: Review from a Physics Perspective
Entropy 2009, 11(4), 888-906; doi:10.3390/e11040888
Received: 13 October 2009 / Accepted: 17 November 2009 / Published: 18 November 2009
Cited by 3 | PDF Full-text (137 KB) | HTML Full-text | XML Full-text
Abstract
For many anatomical and physical reasons animals of different genera use widely different communication strategies. While some are chemical or visual, the most common involve sound or vibration and these signals can carry a large amount of information over long distances. The acoustic signal varies greatly from one genus to another depending upon animal size, anatomy, physiology, and habitat, as also does the way in which information is encoded in the signal, but some general principles can be elucidated showing the possibilities and limitations for information transfer. Cases discussed range from insects through song birds to humans. Full article
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
Open AccessReview Fisher Information and Semiclassical Treatments
Entropy 2009, 11(4), 972-992; doi:10.3390/e11040972
Received: 30 October 2009 / Accepted: 27 November 2009 / Published: 3 December 2009
Cited by 12 | PDF Full-text (266 KB) | HTML Full-text | XML Full-text
Abstract
We review here the difference between quantum statistical treatments and semiclassical ones, using as the main concomitant tool a semiclassical, shift-invariant Fisher information measure built up with Husimi distributions. Its semiclassical character notwithstanding, this measure also contains abundant information of a purely quantal nature. Such a tool allows us to refine the celebrated Lieb bound for Wehrl entropies and to discover thermodynamic-like relations that involve the degree of delocalization. Fisher-related thermal uncertainty relations are developed and the degree of purity of canonical distributions, regarded as mixed states, is connected to this Fisher measure as well. Full article
(This article belongs to the Special Issue Maximum Entropy)
Open AccessReview Processing Information in Quantum Decision Theory
Entropy 2009, 11(4), 1073-1120; doi:10.3390/e11041073
Received: 28 October 2009 / Accepted: 10 December 2009 / Published: 14 December 2009
Cited by 29 | PDF Full-text (328 KB) | HTML Full-text | XML Full-text
Abstract
A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has been recently advanced as a novel variant of decision making, based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, non-commutativity of subsequent decisions, and intention interference. The self-consistent procedure of decision making, in the frame of the quantum decision theory, takes into account both the available objective information as well as subjective contextual effects. This quantum approach avoids any paradox typical of classical decision theory. Conditional maximization of entropy, equivalent to the minimization of an information functional, makes it possible to connect the quantum and classical decision theories, showing that the latter is the limit of the former under vanishing interference terms. Full article
(This article belongs to the Special Issue Information and Entropy)

Other

Jump to: Research, Review

Open AccessComment Comment on “Equiprobability, Entropy, Gamma Distributions and Other Geometrical Questions in Multi-Agent Systems”, Entropy 2009, 11, 959-971
Entropy 2009, 11(4), 1121-1122; doi:10.3390/e11041121
Received: 11 December 2009 / Accepted: 18 December 2009 / Published: 22 December 2009
PDF Full-text (100 KB) | HTML Full-text | XML Full-text
Abstract The volume of the body enclosed by the n-dimensional Lamé curve defined by ∑ⁿᵢ₌₁ xᵢᵇ = E is computed. Full article
(This article belongs to the Special Issue Information and Entropy)
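For the symmetric version of such a body, ∑ᵢ |xᵢ|ᵇ ≤ E, a closed form follows from the classical Dirichlet integral; the sketch below (not necessarily the comment's own derivation) checks that formula against Monte Carlo integration:

```python
import math
import random

def lame_volume(n, b, E):
    """Closed form for the volume of the symmetric Lamé body
    sum_i |x_i|^b <= E (classical Dirichlet-integral result; divide
    by 2**n for the positive-orthant body with x_i >= 0)."""
    return (2 * math.gamma(1 + 1 / b)) ** n * E ** (n / b) / math.gamma(1 + n / b)

def lame_volume_mc(n, b, E, samples=100_000, seed=1):
    """Monte Carlo cross-check: hit rate inside the bounding cube [-R, R]^n."""
    rng = random.Random(seed)
    R = E ** (1 / b)
    hits = sum(
        sum(abs(rng.uniform(-R, R)) ** b for _ in range(n)) <= E
        for _ in range(samples)
    )
    return hits / samples * (2 * R) ** n

# b = 2, n = 2 recovers the unit disc, of area pi:
print(round(lame_volume(2, 2, 1.0), 4))  # 3.1416
```

For large n the rejection-sampling check becomes useless (almost no hits), which is precisely why a closed form is valuable in high dimension.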

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
entropy@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18
Editorial Board
Contact Details Submit to Entropy
Back to Top