Table of Contents

Entropy, Volume 13, Issue 6 (June 2011), Pages 1055-1211

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.

Research

Open Access Article: Distances in Probability Space and the Statistical Complexity Setup
Entropy 2011, 13(6), 1055-1075; doi:10.3390/e13061055
Received: 11 April 2011 / Accepted: 27 May 2011 / Published: 3 June 2011
Cited by 13 | PDF Full-text (532 KB) | HTML Full-text | XML Full-text
Abstract
Statistical complexity measures (SCM) are the composition of two ingredients: (i) entropies and (ii) distances in probability space. Consequently, SCMs provide a simultaneous quantification of the randomness and the correlational structures present in the system under study. In this review we address important topics underlying the SCM structure, viz., (a) a good choice of probability metric space and (b) how to assess the best distance choice, which in this context is called a "disequilibrium" and is denoted by the letter Q. Q, indeed the crucial SCM ingredient, is cast in terms of an associated distance D. Since our input data consist of time series, we also discuss the best way of extracting a probability distribution P from a time series. As an illustration, we show just how these issues affect the description of the classical limit of quantum mechanics. Full article
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
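The abstract describes a statistical complexity C[P] built from a normalized entropy H and a disequilibrium Q derived from a distance D. A minimal sketch of that composition is given below, using the Jensen-Shannon divergence as an assumed (though common) choice for D; the function names and normalization are illustrative, not taken from the paper.

```python
import numpy as np

def shannon(p):
    """Shannon entropy of a distribution (natural log)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def jensen_shannon(p, q):
    """Jensen-Shannon divergence, one possible distance D."""
    m = 0.5 * (p + q)
    return shannon(m) - 0.5 * shannon(p) - 0.5 * shannon(q)

def statistical_complexity(p):
    """C[P] = H[P] * Q[P], with Q the (normalized) disequilibrium
    between P and the uniform distribution."""
    n = len(p)
    uniform = np.full(n, 1.0 / n)
    h = shannon(p) / np.log(n)                 # normalized entropy in [0, 1]
    q0 = jensen_shannon(np.eye(n)[0], uniform) # max distance, for normalization
    q = jensen_shannon(p, uniform) / q0
    return h * q

delta = np.array([1.0, 0.0, 0.0, 0.0])   # fully ordered
unif  = np.full(4, 0.25)                  # fully random
mixed = np.array([0.7, 0.1, 0.1, 0.1])
# Both extremes have zero complexity; intermediate distributions do not.
assert abs(statistical_complexity(delta)) < 1e-12
assert abs(statistical_complexity(unif)) < 1e-12
assert statistical_complexity(mixed) > 0
```

The distribution P would in practice be extracted from a time series (e.g., by histogramming or ordinal patterns), which is precisely the extraction problem the review discusses.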
Open Access Article: A Philosophical Treatise of Universal Induction
Entropy 2011, 13(6), 1076-1136; doi:10.3390/e13061076
Received: 20 April 2011 / Revised: 24 May 2011 / Accepted: 27 May 2011 / Published: 3 June 2011
Cited by 24 | PDF Full-text (425 KB)
Abstract
Understanding inductive reasoning is a problem that has engaged mankind for thousands of years. It is relevant to a wide range of fields and is integral to the philosophy of science. It has been tackled by many great minds, from philosophers to scientists to mathematicians and, more recently, computer scientists. In this article we argue the case for Solomonoff induction, a formal inductive framework that combines algorithmic information theory with the Bayesian framework. Although it achieves excellent theoretical results and is based on solid philosophical foundations, the technical knowledge required to understand this framework has caused it to remain largely unknown and unappreciated in the wider scientific community. The main contribution of this article is to convey Solomonoff induction and its related concepts in a generally accessible form, with the aim of bridging this technical gap. In the process we examine the major historical contributions that led to the formulation of Solomonoff induction, as well as criticisms of Solomonoff and of induction in general. In particular, we examine how Solomonoff induction addresses many issues that have plagued other inductive systems, such as the black ravens paradox and the confirmation problem, and compare this approach with other recent approaches. Full article
(This article belongs to the Special Issue Kolmogorov Complexity)
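Solomonoff's universal prior weights each program p for a universal machine by 2^(-|p|) and is uncomputable. The toy sketch below replaces the space of all programs with three hand-made hypotheses whose "description lengths" are simply stipulated, purely to illustrate the Bayes-plus-Occam flavour the abstract refers to; every name here is invented for the sketch.

```python
from fractions import Fraction

# Toy hypotheses about a binary sequence. The integer is a stipulated
# "description length" |p|; Solomonoff's prior weights a program by 2^-|p|.
HYPOTHESES = {
    "all zeros": (lambda prev: 0, 1),
    "all ones":  (lambda prev: 1, 1),
    "alternate": (lambda prev: (1 - prev) if prev is not None else 0, 2),
}

def consistent(rule, seq):
    """True if the deterministic rule reproduces the observed sequence."""
    prev = None
    for bit in seq:
        if rule(prev) != bit:
            return False
        prev = bit
    return True

def predict_next(seq):
    """Probability that the next bit is 1 under the 2^-|p| prior,
    restricted (a crude Bayesian update) to unrefuted hypotheses."""
    prev = seq[-1] if seq else None
    weights = {name: Fraction(1, 2 ** length)
               for name, (rule, length) in HYPOTHESES.items()
               if consistent(rule, seq)}
    total = sum(weights.values())
    return sum(w for name, w in weights.items()
               if HYPOTHESES[name][0](prev) == 1) / total

# After a single 0, the shorter "all zeros" hypothesis dominates:
print(predict_next([0]))        # 1/3
# After 0,1,0 only "alternate" survives, so the next bit is surely 1:
print(predict_next([0, 1, 0]))  # 1
```

The simplicity bias is visible in the first prediction: "all zeros" and "alternate" both fit the data, but the shorter description gets twice the weight.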

Open Access Article: Maximum Profit Configurations of Commercial Engines
Entropy 2011, 13(6), 1137-1151; doi:10.3390/e13061137
Received: 19 April 2011 / Revised: 20 May 2011 / Accepted: 31 May 2011 / Published: 7 June 2011
Cited by 1 | PDF Full-text (127 KB) | HTML Full-text | XML Full-text
Abstract
An investigation of commercial engines with finite capacity low- and high-price economic subsystems and a generalized commodity transfer law [n ∝ Δ(P^m)] in commodity flow processes, in which the effects of the price elasticities of supply and demand are introduced, is presented in this paper. Optimal cycle configurations of commercial engines for maximum profit are obtained by applying optimal control theory. In some special cases the eventual state, market equilibrium, is determined solely by the initial conditions and the inherent characteristics of the two subsystems, while the different transfer laws affect the optimal configuration through the specific forms of the price paths and the instantaneous commodity flow. Full article
Open Access Article: EA/G-GA for Single Machine Scheduling Problems with Earliness/Tardiness Costs
Entropy 2011, 13(6), 1152-1169; doi:10.3390/e13061152
Received: 10 May 2011 / Revised: 30 May 2011 / Accepted: 10 June 2011 / Published: 14 June 2011
Cited by 6 | PDF Full-text (3212 KB) | HTML Full-text | XML Full-text
Abstract
An Estimation of Distribution Algorithm (EDA) relies on an explicit sampling mechanism, based on probabilistic models built from information extracted from the parental solutions, to generate new solutions; EDAs have become one of the major research areas in evolutionary computation. The fact that no genetic operators are used is a major characteristic distinguishing EDAs from other genetic algorithms (GAs). This advantage, however, can lead to premature convergence once the probabilistic models no longer generate diversified solutions. In our previous research [1], we presented evidence that EDAs suffer from premature convergence and provided several important guidelines for the design of effective EDAs. In this paper, we validate one of these guidelines: incorporating other meta-heuristics into EDAs. An algorithm named "EA/G-GA" is proposed by selecting a well-known EDA, EA/G, to work with GAs. The proposed algorithm was tested on NP-hard single machine scheduling problems with total weighted earliness/tardiness cost in a just-in-time environment. The experimental results indicate that EA/G-GA outperforms the compared algorithms with statistical significance across different stopping criteria, demonstrating the robustness of the proposed algorithm. Consequently, this paper is of interest to the field of EDAs. Full article
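The core mechanism the abstract describes, a probabilistic model that is sampled and then updated from the best parents, can be sketched with the simplest univariate EDA (a PBIL-style model on a toy OneMax problem). EA/G's actual model and the GA hybridization are more elaborate; every name and parameter below is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def onemax(x):
    """Toy fitness: number of ones in the bit string."""
    return x.sum()

def univariate_eda(n_bits=20, pop=50, elite=10, lr=0.2, gens=60):
    """Minimal univariate EDA: sample a population from a probability
    vector, then shift the vector toward the elite solutions."""
    p = np.full(n_bits, 0.5)                            # probabilistic model
    for _ in range(gens):
        X = (rng.random((pop, n_bits)) < p).astype(int)  # sampling step
        best = X[np.argsort([-onemax(x) for x in X])[:elite]]
        p = (1 - lr) * p + lr * best.mean(axis=0)        # model update
        # Without diversity preservation, p drifts toward 0/1 and the
        # model stops generating new solutions: premature convergence,
        # the drawback the hybridization with a GA is meant to address.
    return p

p = univariate_eda()
print(p.round(2))   # most entries near 1.0 on OneMax
```

Injecting genetic operators (as in the proposed EA/G-GA) is one way to keep producing diversified offspring after the model has nearly converged.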
Open Access Article: Geometry of q-Exponential Family of Probability Distributions
Entropy 2011, 13(6), 1170-1185; doi:10.3390/e13061170
Received: 11 February 2011 / Revised: 1 June 2011 / Accepted: 2 June 2011 / Published: 14 June 2011
Cited by 21 | PDF Full-text (157 KB) | HTML Full-text | XML Full-text
Abstract
The Gibbs distribution of statistical physics is an exponential family of probability distributions, whose mathematical basis is the duality given by the Legendre transformation. Recent studies of complex systems have found many distributions obeying a power law rather than the standard Gibbs-type distributions. The Tsallis q-entropy is a typical example capturing such phenomena. We treat the q-Gibbs distribution, or the q-exponential family, by generalizing the exponential function to the q-family of power functions, which is useful for studying various complex or non-standard physical phenomena. We give the q-exponential family a new mathematical structure, different from those previously given: it has a dually flat geometrical structure derived from the Legendre transformation, and conformal geometry is useful for understanding it. The q-version of the maximum entropy theorem is naturally induced from the q-Pythagorean theorem. We also show that the maximizer of the q-escort distribution is a Bayesian MAP (Maximum A Posteriori Probability) estimator. Full article
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
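The generalization the abstract refers to replaces exp and log by their q-deformed counterparts. A small sketch of the standard definitions (the function names are ours):

```python
import numpy as np

def exp_q(x, q):
    """q-exponential: [1 + (1-q) x]_+ ** (1/(1-q)); tends to exp(x) as q -> 1."""
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * x, 0.0)
    return base ** (1.0 / (1.0 - q))

def ln_q(x, q):
    """q-logarithm, the inverse of exp_q on its range."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

x = 0.7
for q in (0.5, 0.9, 1.5):
    assert np.isclose(ln_q(exp_q(x, q), q), x)     # inverse pair
assert np.isclose(exp_q(x, 1.0), np.exp(x))        # q -> 1 recovers exp
```

A q-exponential family then takes densities of the form exp_q(θ·x − ψ(θ)), with the power-law tails that make it suitable for the complex systems mentioned above.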

Open Access Article: On the Thermodynamics of Classical Micro-Canonical Systems
Entropy 2011, 13(6), 1186-1199; doi:10.3390/e13061186
Received: 18 May 2011 / Revised: 15 June 2011 / Accepted: 16 June 2011 / Published: 21 June 2011
Cited by 6 | PDF Full-text (140 KB) | HTML Full-text | XML Full-text
Abstract
We give two arguments why the thermodynamic entropy of non-extensive systems involves Rényi's entropy function rather than that of Tsallis. The first argument is that the temperature of the configurational subsystem of a mono-atomic gas is equal to that of the kinetic subsystem. The second argument is that the instability of the pendulum, which occurs for energies close to the rotation threshold, is correctly reproduced. Full article
(This article belongs to the Special Issue Tsallis Entropy)
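The paper's arguments are thermodynamic, but the formal distinction between the two entropy functions can be checked directly: for independent subsystems the Rényi entropy is additive while the Tsallis entropy is not. A short numerical check using the standard definitions (variable names are ours):

```python
import numpy as np

def renyi(p, a):
    """Rényi entropy S_a = ln(sum p^a) / (1 - a), for a != 1."""
    return np.log(np.sum(p ** a)) / (1.0 - a)

def tsallis(p, a):
    """Tsallis entropy S_a = (1 - sum p^a) / (a - 1), for a != 1."""
    return (1.0 - np.sum(p ** a)) / (a - 1.0)

a = 2.0
p = np.array([0.6, 0.4])
q = np.array([0.3, 0.7])
joint = np.outer(p, q).ravel()   # product distribution of independent subsystems

# Rényi entropy is additive over independent subsystems...
assert np.isclose(renyi(joint, a), renyi(p, a) + renyi(q, a))
# ...while Tsallis entropy is not (it is non-extensive):
assert not np.isclose(tsallis(joint, a), tsallis(p, a) + tsallis(q, a))
```

Additivity holds because sum((p_i q_j)^a) factorizes into (sum p^a)(sum q^a), which the logarithm in Rényi's definition turns into a sum; Tsallis's linear form does not.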
Open Access Article: Effective Complexity of Stationary Process Realizations
Entropy 2011, 13(6), 1200-1211; doi:10.3390/e13061200
Received: 8 May 2011 / Revised: 15 June 2011 / Accepted: 17 June 2011 / Published: 22 June 2011
Cited by 2 | PDF Full-text (113 KB) | HTML Full-text | XML Full-text
Abstract
The concept of the effective complexity of an object as the minimal description length of its regularities was initiated by Gell-Mann and Lloyd. The regularities are modeled by means of ensembles, that is, probability distributions on finite binary strings. In our previous paper [1] we proposed a definition of effective complexity in precise terms of algorithmic information theory. Here we investigate the effective complexity of binary strings generated by stationary, in general not computable, processes. We show that, under not too strong conditions, long typical process realizations are effectively simple. Our results become most transparent in the context of coarse effective complexity, a modification of the original notion of effective complexity that needs fewer parameters in its definition. A similar modification of the related concept of sophistication has been suggested by Antunes and Fortnow. Full article

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
entropy@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18