Special Issue "Information and Entropy"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (31 October 2009)

Special Issue Editor

Guest Editor
Dr. Peter Harremoës *

Copenhagen Business College, Rønne Alle 1, st., DK-2860 Søborg, Denmark
Interests: symmetry; information divergence; cause and effect; Maxwell's demon; probability and statistics
* Dr. Harremoës also serves as the Editor-in-Chief of Entropy

Keywords

  • entropy
  • information
  • information theory

Published Papers (15 papers)


Research

Jump to: Review, Other

Open Access Article Recovering Matrices of Economic Flows from Incomplete Data and a Composite Prior
Entropy 2010, 12(3), 516-527; doi:10.3390/e12030516
Received: 3 December 2009 / Accepted: 1 March 2010 / Published: 12 March 2010
Cited by 1 | PDF Full-text (194 KB) | HTML Full-text | XML Full-text
Abstract
In several socioeconomic applications, matrices containing information on flows (trade, income or migration flows, for example) are usually not constructed from direct observation but are estimated, since compiling the required information is often extremely expensive and time-consuming. The estimation process takes as its point of departure another matrix, which is adjusted until it optimizes some divergence criterion and is simultaneously consistent with some partial information (row and column margins) of the target matrix. Among all the possible criteria, one of the most popular is the Kullback-Leibler divergence [1], leading to the well-known Cross-Entropy technique. This paper proposes a composite Cross-Entropy approach that allows for introducing a mixture of two types of a priori information, that is, two possible matrices to be included as the point of departure in the estimation process. By means of a Monte Carlo simulation experiment, we show that under some circumstances this approach outperforms other competing estimators. In addition, a real-world case with a matrix of interregional trade is included to show the applicability of the suggested technique. Full article
(This article belongs to the Special Issue Information and Entropy)
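For orientation, here is a minimal sketch of the Cross-Entropy adjustment the abstract above describes, using the classical alternating row/column scaling that solves the Kullback-Leibler minimization under margin constraints. The matrices, margins, and the 50/50 weight of the composite prior are hypothetical illustrations; the paper's composite estimator is more general than this fixed mixture.

    import numpy as np

    def cross_entropy_fit(prior, row_sums, col_sums, tol=1e-10, max_iter=1000):
        """Rescale `prior` until it matches the target margins.

        Alternating row/column scaling solves
            min_X  sum_ij X_ij * log(X_ij / prior_ij)
        subject to the row and column sums, i.e. the Cross-Entropy
        (Kullback-Leibler) criterion mentioned in the abstract.
        """
        X = prior.astype(float).copy()
        for _ in range(max_iter):
            X *= (row_sums / X.sum(axis=1))[:, None]   # enforce row margins
            X *= (col_sums / X.sum(axis=0))[None, :]   # enforce column margins
            if np.allclose(X.sum(axis=1), row_sums, atol=tol):
                break
        return X

    # Hypothetical composite prior: a 50/50 mixture of two candidate matrices.
    A = np.array([[4.0, 1.0], [2.0, 3.0]])
    B = np.array([[2.0, 3.0], [3.0, 2.0]])
    prior = 0.5 * A + 0.5 * B
    estimate = cross_entropy_fit(prior, row_sums=np.array([6.0, 4.0]),
                                 col_sums=np.array([5.0, 5.0]))
    print(estimate.round(3))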
Open Access Article The Quantum-Classical Transition as an Information Flow
Entropy 2010, 12(1), 148-160; doi:10.3390/e12010148
Received: 21 October 2009 / Revised: 9 December 2009 / Accepted: 11 December 2009 / Published: 26 January 2010
Cited by 2 | PDF Full-text (620 KB) | HTML Full-text | XML Full-text
Abstract
We investigate the classical limit of the semiclassical evolution with reference to a well-known model that represents the interaction between matter and a given field. This is done by recourse to a special statistical quantifier called the “symbolic transfer entropy”. We find that the quantum-classical transition is thereby described as a sign-reversal of the dominant direction of the information flow between classical and quantal variables. Full article
(This article belongs to the Special Issue Information and Entropy)
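A minimal sketch of a symbolic transfer entropy computation in the style usually attributed to Staniek and Lehnertz: symbolize each series by ordinal patterns, then estimate transfer entropy on the symbol sequences with plug-in probabilities. The embedding length m = 3, the toy coupled series, and all names are illustrative assumptions, not the paper's code. The sign of the difference between the two directed estimates indicates the dominant direction of information flow, which is the quantity the abstract tracks through the quantum-classical transition.

    import numpy as np
    from collections import Counter

    def ordinal_symbols(x, m=3):
        """Symbolize a series by the ordinal pattern of each length-m window."""
        return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

    def transfer_entropy(target, source):
        """Plug-in estimate of T_{source -> target} on symbol sequences (bits)."""
        n = len(target) - 1
        c_xxy = Counter((target[t + 1], target[t], source[t]) for t in range(n))
        c_xy = Counter((target[t], source[t]) for t in range(n))
        c_xx = Counter((target[t + 1], target[t]) for t in range(n))
        c_x = Counter(target[t] for t in range(n))
        te = 0.0
        for (x1, x0, y0), c in c_xxy.items():
            te += (c / n) * np.log2((c / c_xy[(x0, y0)])
                                    / (c_xx[(x1, x0)] / c_x[x0]))
        return te

    # Toy pair where x is driven by the past of y, so information flows y -> x
    # and the difference below should come out positive.
    rng = np.random.default_rng(0)
    y = rng.normal(size=2000)
    x = np.roll(y, 1) + 0.3 * rng.normal(size=2000)
    sx, sy = ordinal_symbols(x), ordinal_symbols(y)
    print(transfer_entropy(sx, sy) - transfer_entropy(sy, sx))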
Open Access Article A Dynamic Model of Information and Entropy
Entropy 2010, 12(1), 80-88; doi:10.3390/e12010080
Received: 29 October 2009 / Accepted: 14 December 2009 / Published: 7 January 2010
Cited by 2 | PDF Full-text (158 KB) | HTML Full-text | XML Full-text
Abstract
We discuss the possibility of a relativistic relationship between information and entropy, closely analogous to the classical Maxwell electromagnetic wave equations. Inherent to the analysis is the description of information as residing in points of non-analyticity, yet ultimately also exhibiting a distributed characteristic: analogous, therefore, to the wave-particle duality of light. At cosmological scales our vector differential equations predict conservation of information in black holes, whereas regular- and Z-DNA molecules correspond to helical solutions at microscopic levels. We further propose that regular- and Z-DNA are equivalent to the alternative words chosen from an alphabet to maintain the equilibrium of an information transmission system. Full article
(This article belongs to the Special Issue Information and Entropy)
Open Access Article Redundancy in Systems Which Entertain a Model of Themselves: Interaction Information and the Self-Organization of Anticipation
Entropy 2010, 12(1), 63-79; doi:10.3390/e12010063
Received: 1 December 2009 / Accepted: 28 December 2009 / Published: 6 January 2010
Cited by 16 | PDF Full-text (637 KB)
Abstract
Mutual information among three or more dimensions (μ* = –Q) has been considered as interaction information. However, Krippendorff [1,2] has shown that this measure cannot be interpreted as a unique property of the interactions and has proposed an alternative measure of interaction information based on iterative approximation of maximum entropies. Q can then be considered as a measure of the difference between interaction information and redundancy generated in a model entertained by an observer. I argue that this provides us with a measure of the imprint of a second-order observing system—a model entertained by the system itself—on the underlying information processing. The second-order system communicates meaning hyper-incursively; an observation instantiates this meaning-processing within the information processing. The net results may add to or reduce the prevailing uncertainty. The model is tested empirically for the case where textual organization can be expected to contain intellectual organization in terms of distributions of title words, author names, and cited references. Full article
(This article belongs to the Special Issue Information and Entropy)
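Because Q is built from joint Shannon entropies, a short plug-in computation makes its sign behavior concrete. Sign conventions for Q and μ* differ between authors, so the sketch below only evaluates the co-information combination H(X) + H(Y) + H(Z) − H(X,Y) − H(X,Z) − H(Y,Z) + H(X,Y,Z) on an illustrative XOR triple, a standard case where the three-way term is negative.

    import numpy as np
    from collections import Counter

    def H(*series):
        """Plug-in Shannon entropy (bits) of the joint distribution."""
        counts = np.array(list(Counter(zip(*series)).values()), dtype=float)
        p = counts / counts.sum()
        return -(p * np.log2(p)).sum()

    def co_information(x, y, z):
        return (H(x) + H(y) + H(z)
                - H(x, y) - H(x, z) - H(y, z)
                + H(x, y, z))

    # XOR triple: pairwise independent but jointly determined.
    rng = np.random.default_rng(1)
    x = rng.integers(0, 2, size=10_000)
    y = rng.integers(0, 2, size=10_000)
    z = x ^ y
    print(co_information(x, y, z))  # close to -1 bit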
Open Access Article On the Spectral Entropy of Thermodynamic Paths for Elementary Systems
Entropy 2009, 11(4), 1025-1041; doi:10.3390/e11041025
Received: 13 October 2009 / Accepted: 27 November 2009 / Published: 7 December 2009
PDF Full-text (425 KB) | HTML Full-text | XML Full-text
Abstract
Systems do not elect thermodynamic pathways on their own. They operate in tandem with their surroundings. Pathway selection and traversal require coordinated work and heat exchanges along with parallel tuning of the system variables. Previous research by the author (Reference [1]) focused on the information expressed in thermodynamic pathways. Examined here is how spectral entropy is a by-product of information that depends intricately on the pathway structure. The spectral entropy has proven to be a valuable tool in diverse fields. This paper illustrates the contact between spectral entropy and the properties which distinguish ideal from non-ideal gases. The role of spectral entropy in the first and second laws of thermodynamics and heat → work conversions is also discussed. Full article
(This article belongs to the Special Issue Information and Entropy)
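For reference, the generic spectral entropy is the Shannon entropy of a normalized power spectrum; the sketch below uses this textbook definition on toy signals and is not the paper's specific construction for thermodynamic pathways.

    import numpy as np

    def spectral_entropy(signal, base=2):
        """Shannon entropy of the normalized power spectrum."""
        psd = np.abs(np.fft.rfft(signal)) ** 2
        p = psd / psd.sum()
        p = p[p > 0]                      # drop empty bins
        return float(-(p * np.log(p)).sum() / np.log(base))

    # A pure tone concentrates the spectrum (low entropy); white noise
    # spreads it nearly uniformly (entropy near the maximum).
    t = np.linspace(0.0, 1.0, 1024, endpoint=False)
    print(spectral_entropy(np.sin(2 * np.pi * 32 * t)))
    print(spectral_entropy(np.random.default_rng(0).normal(size=1024)))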
Open Access Communication Dispersal (Entropy) and Recognition (Information) as Foundations of Emergence and Dissolvence
Entropy 2009, 11(4), 993-1000; doi:10.3390/e11040993
Received: 15 October 2009 / Accepted: 27 November 2009 / Published: 3 December 2009
Cited by 6 | PDF Full-text (178 KB) | HTML Full-text | XML Full-text
Abstract
The objective of this essay is to reflect on a possible relation between entropy and emergence. A qualitative, relational approach is followed. We begin by highlighting that entropy includes the concept of dispersal, relevant to our enquiry. Emergence in complex systems arises from the coordinated behavior of their parts. Coordination in turn necessitates recognition between parts, i.e., information exchange. What will be argued here is that the scope of recognition processes between parts is increased when preceded by their dispersal, which multiplies the number of encounters and creates a richer potential for recognition. A process intrinsic to emergence is dissolvence (aka submergence or top-down constraints), which participates in the information-entropy interplay underlying the creation, evolution and breakdown of higher-level entities. Full article
(This article belongs to the Special Issue Information and Entropy)
Open Access Article Equiprobability, Entropy, Gamma Distributions and Other Geometrical Questions in Multi-Agent Systems
Entropy 2009, 11(4), 959-971; doi:10.3390/e11040959
Received: 3 November 2009 / Accepted: 30 November 2009 / Published: 2 December 2009
Cited by 4 | PDF Full-text (187 KB) | HTML Full-text | XML Full-text
Abstract
A set of many identical interacting agents obeying a global additive constraint is considered. Under the hypothesis of equiprobability in the high-dimensional volume delimited in phase space by the constraint, the statistical behavior of a generic agent over the ensemble is worked out. The asymptotic distribution of that statistical behavior is derived from geometrical arguments. This distribution is related to the Gamma distributions found in several multi-agent economy models, and the parallelism with these systems is established. As a collateral result, a formula for the volume of high-dimensional symmetrical bodies is proposed. Full article
(This article belongs to the Special Issue Information and Entropy)
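A small Monte Carlo illustration under the simplest version of the abstract's setting, an additive constraint on total "wealth" (all numbers hypothetical): sampling configurations uniformly on the simplex makes a single agent's marginal approach an exponential, the shape-1 member of the Gamma family; other constraint exponents yield other Gamma shapes.

    import numpy as np

    # Uniform sampling on the simplex sum(x) = E via normalized exponentials.
    rng = np.random.default_rng(0)
    N, E, runs = 1000, 1000.0, 200
    samples = []
    for _ in range(runs):
        w = rng.exponential(size=N)
        samples.append(E * w / w.sum())   # one equiprobable configuration
    wealth = np.concatenate(samples)
    print(wealth.mean())                  # ~ E / N = 1
    print(np.quantile(wealth, 0.5))       # ~ ln 2, the exponential median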
Open Access Article A Lower-Bound for the Maximin Redundancy in Pattern Coding
Entropy 2009, 11(4), 634-642; doi:10.3390/e11040634
Received: 1 September 2009 / Accepted: 20 October 2009 / Published: 22 October 2009
Cited by 5 | PDF Full-text (143 KB) | HTML Full-text | XML Full-text
Abstract
We show that the maximin average redundancy in pattern coding is eventually larger than 1.84 (n/log n)^(1/3) for messages of length n. This improves recent results on pattern redundancy, although it does not fill the gap between known lower and upper bounds. The pattern of a string is obtained by replacing each symbol by the index of its first occurrence. The problem of pattern coding is of interest because strongly universal codes have been proved to exist for patterns, while universal message coding is impossible for memoryless sources on an infinite alphabet. The proof uses fine combinatorial results on partitions with small summands. Full article
(This article belongs to the Special Issue Information and Entropy)
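The pattern construction named in the abstract is a one-pass algorithm; a minimal sketch follows (1-based indices, one of the conventions in use).

    def pattern(message):
        """Replace each symbol by the order of its first occurrence."""
        first_seen = {}
        out = []
        for symbol in message:
            if symbol not in first_seen:
                first_seen[symbol] = len(first_seen) + 1
            out.append(first_seen[symbol])
        return out

    print(pattern("abracadabra"))  # [1, 2, 3, 1, 4, 1, 5, 1, 2, 3, 1]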
Open Access Article Landauer’s Principle and Divergenceless Dynamical Systems
Entropy 2009, 11(4), 586-597; doi:10.3390/e11040586
Received: 17 August 2009 / Accepted: 15 September 2009 / Published: 13 October 2009
Cited by 4 | PDF Full-text (175 KB) | HTML Full-text | XML Full-text
Abstract
Landauer’s principle is one of the pillars of the physics of information. It constitutes one of the foundations behind the idea that “information is physical”. Landauer’s principle establishes the smallest amount of energy that has to be dissipated when one bit of information is erased from a computing device. Here we explore an extended Landauer-like principle valid for general dynamical systems (not necessarily Hamiltonian) governed by divergenceless phase-space flows. Full article
(This article belongs to the Special Issue Information and Entropy)
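For concreteness, the bound in Landauer's principle is k_B T ln 2 of dissipated energy per erased bit; a two-line computation gives its room-temperature value.

    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

    def landauer_bound(temperature_kelvin, bits=1):
        """Minimum heat dissipated when erasing `bits` of information."""
        return bits * K_B * temperature_kelvin * math.log(2)

    print(landauer_bound(300.0))  # about 2.87e-21 J per bit at 300 K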
Open Access Article Scale-Based Gaussian Coverings: Combining Intra and Inter Mixture Models in Image Segmentation
Entropy 2009, 11(3), 513-528; doi:10.3390/e11030513
Received: 1 September 2009 / Accepted: 14 September 2009 / Published: 24 September 2009
Cited by 2 | PDF Full-text (387 KB) | HTML Full-text | XML Full-text
Abstract
By a “covering” we mean a Gaussian mixture model fit to observed data. Approximations of the Bayes factor can be used to judge model fit to the data within a given Gaussian mixture model. Between families of Gaussian mixture models, we propose the Rényi quadratic entropy as an excellent and tractable model comparison framework. We exemplify this using the segmentation of an MRI image volume, based on (1) a direct Gaussian mixture model applied to the marginal distribution function, and (2) a Gaussian mixture fit through k-means applied to the 4D multivalued image volume furnished by the wavelet transform. Visual preference for one model over another is not immediate. The Rényi quadratic entropy allows us to show clearly that one of these modelings is superior to the other. Full article
(This article belongs to the Special Issue Information and Entropy)
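What makes the Rényi quadratic entropy tractable here is that H2 = −log ∫ p(x)² dx has a closed form for Gaussian mixtures, via the identity ∫ N(x; m_i, C_i) N(x; m_j, C_j) dx = N(m_i; m_j, C_i + C_j). A minimal sketch with hypothetical two-component parameters (the function name and example values are illustrative, not the paper's pipeline):

    import numpy as np
    from scipy.stats import multivariate_normal

    def renyi_quadratic_entropy(weights, means, covs):
        """H2 = -log integral p(x)^2 dx for a Gaussian mixture (closed form)."""
        total = 0.0
        for wi, mi, ci in zip(weights, means, covs):
            for wj, mj, cj in zip(weights, means, covs):
                total += wi * wj * multivariate_normal.pdf(mi, mean=mj, cov=ci + cj)
        return -np.log(total)

    # Two separated components spread the density more than one tight blob,
    # so they yield a larger H2.
    w = [0.5, 0.5]
    means = [np.zeros(2), np.array([4.0, 0.0])]
    covs = [np.eye(2), np.eye(2)]
    print(renyi_quadratic_entropy(w, means, covs))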
Open Access Article Information, Deformed қ-Wehrl Entropies and Semiclassical Delocalization
Entropy 2009, 11(1), 32-41; doi:10.3390/e11010032
Received: 5 November 2008 / Accepted: 20 January 2009 / Published: 27 January 2009
Cited by 4 | PDF Full-text (221 KB)
Abstract
Semiclassical delocalization in phase space constitutes a manifestation of the Uncertainty Principle, an indispensable part of the present understanding of Nature, and the Wehrl entropy is widely regarded as the foremost localization indicator. We readdress the matter here within the framework of the celebrated semiclassical Husimi distributions and their associated Wehrl entropies, suitably қ-deformed. We show that it is possible to significantly improve on the extant phase-space classical-localization power. Full article
(This article belongs to the Special Issue Information and Entropy)
Open Access Article Generalized Measure of Departure from No Three-Factor Interaction Model for 2 × 2 × K Contingency Tables
Entropy 2008, 10(4), 776-785; doi:10.3390/e10040776
Received: 31 October 2008 / Accepted: 16 December 2008 / Published: 22 December 2008
PDF Full-text (181 KB) | HTML Full-text | XML Full-text
Abstract
For 2 × 2 × K contingency tables, Tomizawa considered a Shannon-entropy-type measure to represent the degree of departure from a log-linear model of no three-factor interaction (the NOTFI model). This paper proposes a generalization of Tomizawa's measure for 2 × 2 × K tables. The proposed measure is expressed using the Patil-Taillie diversity index or the Cressie-Read power divergence. A special case of the proposed measure includes Tomizawa's measure. The proposed measure would be useful for comparing the degrees of departure from the NOTFI model in several tables. Full article
(This article belongs to the Special Issue Information and Entropy)
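For reference, here is the Cressie-Read power-divergence family through which the proposed measure is expressed, with its Shannon-type limit (lambda → 0, the likelihood-ratio statistic, matching Tomizawa's original entropy-type measure) and the Pearson chi-square case (lambda = 1). The observed and expected counts are hypothetical, and this is the underlying family only, not the paper's NOTFI-specific measure.

    import numpy as np

    def cressie_read(observed, expected, lam):
        """Power divergence 2/(lam*(lam+1)) * sum o*((o/e)**lam - 1)."""
        o = np.asarray(observed, dtype=float)
        e = np.asarray(expected, dtype=float)
        if abs(lam) < 1e-12:              # limiting Shannon / KL-type case
            return 2.0 * np.sum(o * np.log(o / e))
        return 2.0 / (lam * (lam + 1.0)) * np.sum(o * ((o / e) ** lam - 1.0))

    obs = np.array([30.0, 20.0, 25.0, 25.0])
    exp = np.array([25.0, 25.0, 25.0, 25.0])
    print(cressie_read(obs, exp, 1.0))    # Pearson chi-square: 2.0
    print(cressie_read(obs, exp, 0.0))    # likelihood-ratio G^2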

Review

Jump to: Research, Other

Open Access Review Quantum Entropy and Its Applications to Quantum Communication and Statistical Physics
Entropy 2010, 12(5), 1194-1245; doi:10.3390/e12051194
Received: 10 February 2010 / Accepted: 30 April 2010 / Published: 7 May 2010
Cited by 6 | PDF Full-text (525 KB)
Abstract
Quantum entropy is a fundamental concept of quantum information that has recently been developed in various directions. We review the mathematical aspects of quantum entropy (entropies) and discuss some applications to quantum communication and statistical physics. All topics taken up here are related in some way to the quantum entropy that the present authors have studied. Many other fields recently developed in quantum information theory, such as quantum algorithms, quantum teleportation and quantum cryptography, are discussed in the book cited as reference 60. Full article
(This article belongs to the Special Issue Information and Entropy)
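The basic object running through such a review is the von Neumann entropy S(ρ) = −Tr ρ log ρ; a minimal numerical sketch follows (the example states are illustrative).

    import numpy as np

    def von_neumann_entropy(rho, base=2):
        """S(rho) = -Tr(rho log rho), computed from the eigenvalues of rho."""
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]      # discard numerical zeros
        return float(-(evals * np.log(evals)).sum() / np.log(base))

    pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state: S = 0
    mixed = np.eye(2) / 2.0                     # maximally mixed qubit: S = 1 bit
    print(von_neumann_entropy(pure), von_neumann_entropy(mixed))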
Open Access Review Processing Information in Quantum Decision Theory
Entropy 2009, 11(4), 1073-1120; doi:10.3390/e11041073
Received: 28 October 2009 / Accepted: 10 December 2009 / Published: 14 December 2009
Cited by 27 | PDF Full-text (328 KB) | HTML Full-text | XML Full-text
Abstract
A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has recently been advanced as a novel variant of decision making, based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, non-commutativity of subsequent decisions, and intention interference. The self-consistent procedure of decision making, in the framework of quantum decision theory, takes into account both the available objective information and subjective contextual effects. This quantum approach avoids the paradoxes typical of classical decision theory. Conditional maximization of entropy, equivalent to the minimization of an information functional, makes it possible to connect the quantum and classical decision theories, showing that the latter is the limit of the former under vanishing interference terms. Full article
(This article belongs to the Special Issue Information and Entropy)

Other

Jump to: Research, Review

Open Access Comment Comment on “Equiprobability, Entropy, Gamma Distributions and Other Geometrical Questions in Multi-Agent Systems”, Entropy 2009, 11, 959-971
Entropy 2009, 11(4), 1121-1122; doi:10.3390/e11041121
Received: 11 December 2009 / Accepted: 18 December 2009 / Published: 22 December 2009
PDF Full-text (100 KB) | HTML Full-text | XML Full-text
Abstract
The volume of the body enclosed by the n-dimensional Lamé curve defined by Σ_{i=1}^{n} x_i^b = E is computed. Full article
(This article belongs to the Special Issue Information and Entropy)
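The closed form behind such a computation is the classical Dirichlet-integral result for the volume of {x in R^n : Σ |x_i|^b ≤ E}. The sketch below evaluates it and cross-checks the b = 2 ball case; conventions (full space versus positive orthant, which drops a factor 2^n) may differ from the comment's, so treat this as an orientation rather than a restatement of its result.

    from math import gamma, pi

    def lame_volume(n, b, E):
        """Volume of {x in R^n : sum |x_i|**b <= E} via Gamma functions."""
        return (2.0 * gamma(1.0 / b + 1.0)) ** n / gamma(n / b + 1.0) * E ** (n / b)

    print(lame_volume(3, 2.0, 1.0))   # unit 3-ball
    print(4.0 * pi / 3.0)             # 4*pi/3, the known value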
