Table of Contents

Entropy, Volume 19, Issue 5 (May 2017)

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
Cover Story: Any system can be described at a multitude of spatial and temporal scales. One long-standing [...]
Displaying articles 1-54

Research

Open Access Article: Effects of Movable-Baffle on Heat Transfer and Entropy Generation in a Cavity Saturated by CNT Suspensions: Three-Dimensional Modeling
Entropy 2017, 19(5), 200; doi:10.3390/e19050200
Received: 3 March 2017 / Revised: 21 April 2017 / Accepted: 26 April 2017 / Published: 29 April 2017
PDF Full-text (8165 KB) | HTML Full-text | XML Full-text
Abstract
Convective heat transfer and entropy generation in a 3D closed cavity, equipped with an adiabatic driven baffle and filled with CNT (carbon nanotube)–water nanofluid, are numerically investigated for a range of Rayleigh numbers from 10³ to 10⁵. This research is conducted for three configurations: a fixed baffle (V = 0), a baffle rotating clockwise (V+) and a baffle rotating counterclockwise (V−), and for a range of CNT concentrations from 0 to 15%. Governing equations are formulated using the vector potential–vorticity formulation in its three-dimensional form, then solved by the finite volume method. The effects of the motion direction of the inserted driven baffle and of the CNT concentration on heat transfer and entropy generation are studied. It was observed that for low Rayleigh numbers, the motion of the driven baffle enhances heat transfer regardless of its direction, and the effect of CNT concentration is negligible. However, with increasing Rayleigh number, adding a driven baffle increases heat transfer only when it moves in the direction of the decreasing temperature gradient; otherwise, convective heat transfer cannot be enhanced, due to flow blockage at the corners of the baffle. Full article
(This article belongs to the Special Issue Entropy Generation in Nanofluid Flows)

Open Access Article: Utility, Revealed Preferences Theory, and Strategic Ambiguity in Iterated Games
Entropy 2017, 19(5), 201; doi:10.3390/e19050201
Received: 28 February 2017 / Revised: 10 April 2017 / Accepted: 26 April 2017 / Published: 29 April 2017
PDF Full-text (458 KB) | HTML Full-text | XML Full-text
Abstract
Iterated games, in which the same economic interaction is repeatedly played between the same agents, are an important framework for understanding the effectiveness of strategic choices over time. To date, very little work has applied information theory to the information sets used by agents in order to decide what action to take next in such strategic situations. This article looks at the mutual information between previous game states and an agent’s next action by introducing two new classes of games: “invertible games” and “cyclical games”. By explicitly expanding out the mutual information between past states and the next action we show under what circumstances the explicit values of the utility are irrelevant for iterated games and this is then related to revealed preferences theory of classical economics. These information measures are then applied to the Traveler’s Dilemma game and the Prisoner’s Dilemma game, the Prisoner’s Dilemma being invertible, to illustrate their use. In the Prisoner’s Dilemma, a novel connection is made between the computational principles of logic gates and both the structure of games and the agents’ decision strategies. This approach is applied to the cyclical game Matching Pennies to analyse the foundations of a behavioural ambiguity between two well studied strategies: “Tit-for-Tat” and “Win-Stay, Lose-Switch”. Full article
(This article belongs to the Special Issue Complexity, Criticality and Computation (C³))
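The central quantity in the abstract above, the mutual information between past game states and an agent’s next action, can be estimated directly from joint frequency counts. A minimal sketch (the `mutual_information` helper and the Tit-for-Tat-style example are illustrative assumptions, not code from the paper):

```python
import numpy as np

def mutual_information(xs, ys):
    """Empirical mutual information I(X;Y) in bits from paired samples."""
    xs, ys = np.asarray(xs), np.asarray(ys)
    x_vals, x_idx = np.unique(xs, return_inverse=True)
    y_vals, y_idx = np.unique(ys, return_inverse=True)
    joint = np.zeros((len(x_vals), len(y_vals)))
    for i, j in zip(x_idx, y_idx):
        joint[i, j] += 1.0
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal of X
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

# A deterministic Tit-for-Tat-like relation: the next action copies the
# previous state, so all the entropy of the past (1 bit here) is transferred.
past = [0, 1, 0, 1] * 250
action = past[:]
print(round(mutual_information(past, action), 6))  # 1.0
```

A deterministic strategy makes the next action a function of the past state, so the mutual information saturates at the entropy of the past states; an action independent of the past would instead give zero.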

Open Access Article: Kinetics of Interactions of Matter, Antimatter and Radiation Consistent with Antisymmetric (CPT-Invariant) Thermodynamics
Entropy 2017, 19(5), 202; doi:10.3390/e19050202
Received: 12 March 2017 / Revised: 25 April 2017 / Accepted: 26 April 2017 / Published: 2 May 2017
PDF Full-text (406 KB) | HTML Full-text | XML Full-text
Abstract
This work investigates the influence of directional properties of decoherence on kinetics rate equations. The physical reality is understood as a chain of unitary and decoherence events. The former are quantum-deterministic, while the latter introduce uncertainty and increase entropy. For interactions of matter and antimatter, two approaches are considered: symmetric decoherence, which corresponds to conventional symmetric (CP-invariant) thermodynamics, and antisymmetric decoherence, which corresponds to antisymmetric (CPT-invariant) thermodynamics. Radiation, in its interactions with matter and antimatter, is shown to be decoherence-neutral. The symmetric and antisymmetric assumptions result in different interactions of radiation with matter and antimatter. The theoretical predictions for these differences are testable by comparing absorption (emission) of light by thermodynamic systems made of matter and antimatter. Canonical typicality for quantum mixtures is briefly discussed in Appendix A. Full article
(This article belongs to the Special Issue Quantum Thermodynamics)

Open Access Article: Fractional Diffusion in a Solid with Mass Absorption
Entropy 2017, 19(5), 203; doi:10.3390/e19050203
Received: 28 March 2017 / Revised: 24 April 2017 / Accepted: 29 April 2017 / Published: 2 May 2017
PDF Full-text (396 KB) | HTML Full-text | XML Full-text
Abstract
The space-time-fractional diffusion equation with the Caputo time-fractional derivative and Riesz fractional Laplacian is considered in the case of axial symmetry. Mass absorption (mass release) is described by a source term proportional to concentration. The integral transform technique is used. Different particular cases of the solution are studied. The numerical results are illustrated graphically. Full article
(This article belongs to the Special Issue Complex Systems, Non-Equilibrium Dynamics and Self-Organisation)

Open Access Article: Measures of Qualitative Variation in the Case of Maximum Entropy
Entropy 2017, 19(5), 204; doi:10.3390/e19050204
Received: 13 March 2017 / Revised: 21 April 2017 / Accepted: 27 April 2017 / Published: 4 May 2017
PDF Full-text (906 KB) | HTML Full-text | XML Full-text
Abstract
Asymptotic behavior of qualitative variation statistics, including entropy measures, can be modeled well by normal distributions. In this study, we test the normality of various qualitative variation measures in general. We find that almost all indices tend to normality as the sample size increases, and they are highly correlated. However, for all of these qualitative variation statistics, maximum uncertainty is a serious factor that prevents normality. We therefore study the properties of two qualitative variation statistics, VarNC and StDev, in the case of maximum uncertainty, since these two statistics show lower sampling variability and utilize all sample information. We derive the probability distribution functions of these statistics and prove that they are consistent. We also discuss the relationship between VarNC and the normalized form of Tsallis (α = 2) entropy in the case of maximum uncertainty. Full article
(This article belongs to the Special Issue Maximum Entropy and Its Application II)

Open Access Article: Divergence and Sufficiency for Convex Optimization
Entropy 2017, 19(5), 206; doi:10.3390/e19050206
Received: 30 December 2016 / Revised: 11 April 2017 / Accepted: 2 May 2017 / Published: 3 May 2017
PDF Full-text (631 KB) | HTML Full-text | XML Full-text
Abstract
Logarithmic score and information divergence appear in information theory, statistics, statistical mechanics, and portfolio theory. We demonstrate that all these topics involve some kind of optimization that leads directly to regret functions and such regret functions are often given by Bregman divergences. If a regret function also fulfills a sufficiency condition it must be proportional to information divergence. We will demonstrate that sufficiency is equivalent to the apparently weaker notion of locality and it is also equivalent to the apparently stronger notion of monotonicity. These sufficiency conditions have quite different relevance in the different areas of application, and often they are not fulfilled. Therefore sufficiency conditions can be used to explain when results from one area can be transferred directly to another and when one will experience differences. Full article
(This article belongs to the Special Issue Convex Optimization and Entropy)

Open Access Article: Coarse Graining Shannon and von Neumann Entropies
Entropy 2017, 19(5), 207; doi:10.3390/e19050207
Received: 4 April 2017 / Revised: 27 April 2017 / Accepted: 28 April 2017 / Published: 3 May 2017
PDF Full-text (292 KB) | HTML Full-text | XML Full-text
Abstract
The nature of coarse graining is intuitively “obvious”, but it is rather difficult to find explicit and calculable models of the coarse graining process (and the resulting entropy flow) discussed in the literature. What we would like to have at hand is some explicit and calculable process that takes an arbitrary system, with specified initial entropy S, and that monotonically and controllably drives the entropy to its maximum value. This does not have to be a physical process, in fact for some purposes it is better to deal with a gedanken-process, since then it is more obvious how the “hidden information” is hiding in the fine-grain correlations that one is simply agreeing not to look at. We shall present several simple mathematically well-defined and easy to work with conceptual models for coarse graining. We shall consider both the classical Shannon and quantum von Neumann entropies, including models based on quantum decoherence, and analyse the entropy flow in some detail. When coarse graining the quantum von Neumann entropy, we find it extremely useful to introduce an adaptation of Hawking’s super-scattering matrix. These explicit models that we shall construct allow us to quantify and keep clear track of the entropy that appears when coarse graining the system and the information that can be hidden in unobserved correlations (while not the focus of the current article, in the long run, these considerations are of interest when addressing the black hole information puzzle). Full article
(This article belongs to the Special Issue Black Hole Thermodynamics II)
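For the classical Shannon case, one explicitly calculable, controllable process of the kind the abstract describes is mixing the state with the uniform distribution, which drives the entropy monotonically to its maximum. A generic demonstration (this particular mixing step is an assumption for illustration, not necessarily one of the paper's models):

```python
import numpy as np

def shannon(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Interpolate from an ordered state p toward the uniform distribution.
# Concavity of the entropy guarantees a monotone rise along this path.
p = np.array([0.7, 0.2, 0.06, 0.04])
uniform = np.full(p.size, 1.0 / p.size)
entropies = [shannon((1 - lam) * p + lam * uniform)
             for lam in np.linspace(0.0, 1.0, 6)]
print(all(a < b for a, b in zip(entropies, entropies[1:])))  # True
print(round(entropies[-1], 6))  # 2.0  (the maximum, log2(4) bits)
```

The "hidden information" picture corresponds to the fact that the mixing is lossy only if one agrees not to track the fine-grained correlations discarded at each step.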
Open Access Article: Objective Bayesian Entropy Inference for Two-Parameter Logistic Distribution Using Upper Record Values
Entropy 2017, 19(5), 208; doi:10.3390/e19050208
Received: 24 March 2017 / Revised: 28 April 2017 / Accepted: 29 April 2017 / Published: 3 May 2017
PDF Full-text (524 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we provide an entropy inference method that is based on an objective Bayesian approach for upper record values having a two-parameter logistic distribution. We derive the entropy that is based on the i-th upper record value and the joint entropy that is based on the upper record values. Moreover, we examine their properties. For objective Bayesian analysis, we obtain objective priors, namely, the Jeffreys and reference priors, for the unknown parameters of the logistic distribution. The priors are based on upper record values. Then, we develop an entropy inference method that is based on these objective priors. In real data analysis, we assess the quality of the proposed models under the objective priors and compare them with the model under the informative prior. Full article

Open Access Article: Ensemble Averages, Soliton Dynamics and Influence of Haptotaxis in a Model of Tumor-Induced Angiogenesis
Entropy 2017, 19(5), 209; doi:10.3390/e19050209
Received: 7 April 2017 / Revised: 27 April 2017 / Accepted: 2 May 2017 / Published: 4 May 2017
PDF Full-text (675 KB) | HTML Full-text | XML Full-text
Abstract
In this work, we present a numerical study of the influence of matrix degrading enzyme (MDE) dynamics and haptotaxis on the development of vessel networks in tumor-induced angiogenesis. Avascular tumors produce growth factors that induce nearby blood vessels to emit sprouts formed by endothelial cells. These capillary sprouts advance toward the tumor by chemotaxis (gradients of growth factor) and haptotaxis (adhesion to the tissue matrix outside blood vessels). The motion of the capillaries in this constrained space is modelled by stochastic processes (Langevin equations, branching and merging of sprouts) coupled to continuum equations for concentrations of involved substances. There is a complementary deterministic description in terms of the density of actively moving tips of vessel sprouts. The latter forms a stable soliton-like wave whose motion is influenced by the different taxis mechanisms. We show the delaying effect of haptotaxis on the advance of the angiogenic vessel network by direct numerical simulations of the stochastic process and by a study of the soliton motion. Full article
(This article belongs to the Special Issue Statistical Mechanics of Complex and Disordered Systems)

Open Access Article: Information Content Based Optimal Radar Waveform Design: LPI’s Purpose
Entropy 2017, 19(5), 210; doi:10.3390/e19050210
Received: 26 March 2017 / Revised: 30 April 2017 / Accepted: 3 May 2017 / Published: 6 May 2017
PDF Full-text (1144 KB) | HTML Full-text | XML Full-text
Abstract
This paper presents a low probability of interception (LPI) radar waveform design method with a fixed average power constraint based on information theory. The Kullback–Leibler divergence (KLD) between the intercept signal and background noise is presented as a practical metric to evaluate the performance of the adversary intercept receiver. Combining it with the radar performance metric, the mutual information (MI), yields a multi-objective optimization model of LPI waveform design that trades off the performance of the radar against that of the enemy intercept receiver. After being transformed into a single-objective optimization problem, it can be solved by using an interior point method and a sequential quadratic programming (SQP) method. Simulation results verify the correctness and effectiveness of the proposed LPI radar waveform design method. Full article
(This article belongs to the Section Information Theory)

Open Access Article: Meromorphic Non-Integrability of Several 3D Dynamical Systems
Entropy 2017, 19(5), 211; doi:10.3390/e19050211
Received: 4 March 2017 / Revised: 17 April 2017 / Accepted: 29 April 2017 / Published: 10 May 2017
PDF Full-text (279 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we apply the differential Galoisian approach to investigate the meromorphic non-integrability of a class of 3D equations in mathematical physics, including the Nosé–Hoover equations, the Lü system, the Rikitake-like system and the Rucklidge equations, which are well known in the fields of molecular dynamics, chaos theory and fluid mechanics, respectively. Our main results show that all of these systems are, in fact, non-integrable for nearly all parameter values. Full article
(This article belongs to the Special Issue Complex Systems, Non-Equilibrium Dynamics and Self-Organisation)
Open Access Article: Boltzmann Entropy of a Newtonian Universe
Entropy 2017, 19(5), 212; doi:10.3390/e19050212
Received: 5 April 2017 / Revised: 3 May 2017 / Accepted: 4 May 2017 / Published: 6 May 2017
PDF Full-text (256 KB) | HTML Full-text | XML Full-text
Abstract
A dynamical estimate is given for the Boltzmann entropy of the Universe, under the simplifying assumptions provided by Newtonian cosmology. We first model the cosmological fluid as the probability fluid of a quantum-mechanical system. Next, following current ideas about the emergence of spacetime, we regard gravitational equipotentials as isoentropic surfaces. Therefore, gravitational entropy is proportional to the vacuum expectation value of the gravitational potential in a certain quantum state describing the matter contents of the Universe. The entropy of the matter sector can also be computed. While providing values of the entropy that turn out to be somewhat higher than existing estimates, our results are in perfect compliance with the upper bound set by the holographic principle. Full article
(This article belongs to the Section Astrophysics and Cosmology)
Open Access Article: Cockroach Swarm Optimization Algorithm for Travel Planning
Entropy 2017, 19(5), 213; doi:10.3390/e19050213
Received: 27 February 2017 / Revised: 29 April 2017 / Accepted: 3 May 2017 / Published: 6 May 2017
PDF Full-text (1219 KB) | HTML Full-text | XML Full-text
Abstract
In transport planning, one should allow passengers to travel through the complicated transportation scheme with efficient use of different modes of transport. In this paper, we propose the use of a cockroach swarm optimization algorithm for determining paths with the shortest travel time. In our approach, this algorithm has been modified to work with the time-expanded model. Therefore, we present how the algorithm has to be adapted to this model, including correctly creating solutions and defining steps and movement in the search space. By introducing the proposed modifications, we are able to solve journey planning. The results have shown that the performance of our approach, in terms of converging to the best solutions, is satisfactory. Moreover, we have compared our results with Dijkstra’s algorithm and a particle swarm optimization algorithm. Full article
(This article belongs to the Special Issue Complexity, Criticality and Computation (C³))
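The Dijkstra baseline used in the comparison above can be sketched in a few lines on a graph whose edge weights are travel times; the stop network and the times below are made up for illustration:

```python
import heapq

def dijkstra(graph, source):
    """Shortest travel times from source; graph maps node -> [(neighbor, time), ...]."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Stops A..D with travel times in minutes on each leg (illustrative).
network = {"A": [("B", 5), ("C", 2)], "C": [("B", 1), ("D", 7)], "B": [("D", 3)]}
print(dijkstra(network, "A")["D"])  # 6  (A -> C -> B -> D: 2 + 1 + 3)
```

A time-expanded model, as used in the paper, would replicate each stop once per departure event, after which the same shortest-path machinery applies.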

Open Access Article: The Solution of Modified Fractional Bergman’s Minimal Blood Glucose-Insulin Model
Entropy 2017, 19(5), 114; doi:10.3390/e19050114
Received: 21 January 2017 / Revised: 4 March 2017 / Accepted: 9 March 2017 / Published: 2 May 2017
PDF Full-text (283 KB) | HTML Full-text | XML Full-text
Abstract
In the present paper, we use analytical techniques to solve fractional nonlinear differential equations systems that arise in Bergman’s minimal model, used to describe blood glucose and insulin metabolism, after intravenous tolerance testing. We also discuss the stability and uniqueness of the solution. Full article
(This article belongs to the Special Issue Wavelets, Fractals and Information Theory II)

Open Access Article: Anatomy of a Spin: The Information-Theoretic Structure of Classical Spin Systems
Entropy 2017, 19(5), 214; doi:10.3390/e19050214
Received: 22 January 2017 / Revised: 22 March 2017 / Accepted: 3 May 2017 / Published: 8 May 2017
PDF Full-text (1143 KB) | HTML Full-text | XML Full-text
Abstract
Collective organization in matter plays a significant role in its expressed physical properties. Typically, it is detected via an order parameter, appropriately defined for each given system’s observed emergent patterns. Recent developments in information theory, however, suggest quantifying collective organization in a system- and phenomenon-agnostic way: decomposing the system’s thermodynamic entropy density into a localized entropy, that is solely contained in the dynamics at a single location, and a bound entropy, that is stored in space as domains, clusters, excitations, or other emergent structures. As a concrete demonstration, we compute this decomposition and related quantities explicitly for the nearest-neighbor Ising model on the 1D chain, on the Bethe lattice with coordination number k = 3, and on the 2D square lattice, illustrating its generality and the functional insights it gives near and away from phase transitions. In particular, we consider the roles that different spin motifs play (in cluster bulk, cluster edges, and the like) and how these affect the dependencies between spins. Full article
(This article belongs to the Section Statistical Mechanics)

Open Access Article: Cauchy Principal Value Contour Integral with Applications
Entropy 2017, 19(5), 215; doi:10.3390/e19050215
Received: 28 March 2017 / Revised: 28 April 2017 / Accepted: 3 May 2017 / Published: 10 May 2017
PDF Full-text (278 KB) | HTML Full-text | XML Full-text
Abstract
The Cauchy principal value is a standard method, applied in mathematical applications, by which an improper, and possibly divergent, integral is measured in a balanced way around singularities or at infinity. On the other hand, entropy prediction of system behavior from a thermodynamic perspective commonly involves contour integrals. With the aim of facilitating the calculus of such integrals in this entropic scenario, we revisit the generalization of the Cauchy principal value to complex contour integrals, formalize its definition and, using residue theory techniques, provide a useful way to evaluate them. Full article
(This article belongs to the Special Issue Wavelets, Fractals and Information Theory III)
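On the real line, the prototype of this balanced evaluation pairs points symmetrically about the pole so that the divergent contributions cancel. A small numerical sketch under that standard definition (the `principal_value` helper and the 1/x example are illustrative, not from the paper):

```python
import numpy as np

def principal_value(f, a, b, pole, n=200_000):
    """Cauchy principal value of f over [a, b] with a simple pole strictly inside:
    the symmetric window around the pole is integrated in cancelling pairs
    f(pole + t) + f(pole - t), which stays finite as t -> 0."""
    d = min(pole - a, b - pole)
    t = (np.arange(n) + 0.5) * d / n                 # midpoints in (0, d)
    sym = np.sum(f(pole + t) + f(pole - t)) * d / n
    # leftover regular piece outside the symmetric window
    lo, hi = (pole + d, b) if b - pole > pole - a else (a, pole - d)
    x = lo + (np.arange(n) + 0.5) * (hi - lo) / n
    return sym + np.sum(f(x)) * (hi - lo) / n

# PV of 1/x over [-1, 2]: the divergences on either side of x = 0 cancel,
# leaving the integral of 1/x over [1, 2], i.e. log 2.
val = principal_value(lambda x: 1.0 / x, -1.0, 2.0, pole=0.0)
print(round(val, 6))  # 0.693147
```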
Open Access Article: Designing Labeled Graph Classifiers by Exploiting the Rényi Entropy of the Dissimilarity Representation
Entropy 2017, 19(5), 216; doi:10.3390/e19050216
Received: 6 April 2017 / Revised: 26 April 2017 / Accepted: 6 May 2017 / Published: 9 May 2017
PDF Full-text (459 KB) | HTML Full-text | XML Full-text
Abstract
Representing patterns as labeled graphs is becoming increasingly common in the broad field of computational intelligence. Accordingly, a wide repertoire of pattern recognition tools, such as classifiers and knowledge discovery procedures, are nowadays available and tested for various datasets of labeled graphs. However, the design of effective learning procedures operating in the space of labeled graphs is still a challenging problem, especially from the computational complexity viewpoint. In this paper, we present a major improvement of a general-purpose classifier for graphs, which is conceived as an interplay between dissimilarity representation, clustering, information-theoretic techniques, and evolutionary optimization algorithms. The improvement focuses on a specific key subroutine devised to compress the input data. We prove different theorems which are fundamental to the setting of the parameters controlling such a compression operation. We demonstrate the effectiveness of the resulting classifier by benchmarking the developed variants on well-known datasets of labeled graphs, considering as distinct performance indicators the classification accuracy, computing time, and parsimony in terms of structural complexity of the synthesized classification models. The results show state-of-the-art standards in terms of test set accuracy and a considerable speed-up in computing time. Full article
(This article belongs to the Special Issue Entropy in Signal Analysis)

Open Access Article: On the Convergence and Law of Large Numbers for the Non-Euclidean Lp-Means
Entropy 2017, 19(5), 217; doi:10.3390/e19050217
Received: 18 January 2017 / Revised: 13 April 2017 / Accepted: 9 May 2017 / Published: 11 May 2017
PDF Full-text (975 KB) | HTML Full-text | XML Full-text
Abstract
This paper describes and proves two important theorems that compose the Law of Large Numbers for the non-Euclidean Lp-means, known to be true for the Euclidean L2-means. Let the Lp-mean estimator be the specific functional that estimates the Lp-mean of N independent and identically distributed random variables; then, (i) the expectation value of the Lp-mean estimator equals the mean of the distributions of the random variables; and (ii) the limit N → ∞ of the Lp-mean estimator also equals the mean of the distributions. Full article
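One common concrete realization of an Lp-mean, assumed here for illustration (the paper defines its estimator as a specific functional), is the value μ minimizing Σᵢ |xᵢ − μ|^p, which reduces to the sample mean for p = 2 and the sample median for p = 1:

```python
import numpy as np

def lp_mean(x, p, iters=200):
    """Lp-mean: the mu minimizing sum_i |x_i - mu|**p (convex for p >= 1),
    found by ternary search over [min(x), max(x)]."""
    x = np.asarray(x, dtype=float)
    cost = lambda m: np.sum(np.abs(x - m) ** p)
    lo, hi = x.min(), x.max()
    for _ in range(iters):
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if cost(m1) < cost(m2):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)

x = np.array([0.0, 1.0, 2.0, 3.0, 10.0])
print(round(lp_mean(x, 2), 6))  # 3.2  (the L2-mean is the sample mean)
print(round(lp_mean(x, 1), 6))  # 2.0  (the L1-mean is the sample median)
```

Varying p interpolates between these familiar location estimators, which is what makes the law-of-large-numbers statements above non-trivial outside the Euclidean p = 2 case.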

Open Access Article: Minimum Entropy Active Fault Tolerant Control of the Non-Gaussian Stochastic Distribution System Subjected to Mean Constraint
Entropy 2017, 19(5), 218; doi:10.3390/e19050218
Received: 23 March 2017 / Revised: 24 April 2017 / Accepted: 3 May 2017 / Published: 11 May 2017
PDF Full-text (2297 KB) | HTML Full-text | XML Full-text
Abstract
Stochastic distribution control (SDC) systems are a group of systems in which the output considered is the measured probability density function (PDF) of the system output, while the system is subjected to a normal crisp input. The purpose of active fault tolerant control of such systems is to use the fault estimation information and other measured information to make the output PDF still track the given distribution when the objective PDF is known. However, if the target PDF is unavailable, PDF tracking is impossible, and minimum entropy control of the system output can be considered as an alternative strategy. Since the mean represents the center location of the stochastic variable, it is reasonable to design the minimum entropy fault tolerant controller subject to a mean constraint. In this paper, using the rational square-root B-spline model for the shape control of the system output PDF, a nonlinear adaptive observer based fault diagnosis algorithm is proposed to diagnose the fault. Through controller reconfiguration, the system entropy subject to the mean constraint can still be minimized when a fault occurs. An illustrative example is utilized to demonstrate the use of the minimum entropy fault tolerant control algorithms. Full article

Open Access Article: Calculating Iso-Committor Surfaces as Optimal Reaction Coordinates with Milestoning
Entropy 2017, 19(5), 219; doi:10.3390/e19050219
Received: 17 February 2017 / Revised: 24 April 2017 / Accepted: 8 May 2017 / Published: 11 May 2017
PDF Full-text (2983 KB) | HTML Full-text | XML Full-text
Abstract
Reaction coordinates are vital tools for qualitative and quantitative analysis of molecular processes. They provide a simple picture of reaction progress and essential input for calculations of free energies and rates. Iso-committor surfaces are considered the optimal reaction coordinate. We present an algorithm to compute a sequence of iso-committor surfaces efficiently. The algorithm analyzes Milestoning results to determine the committor function. It requires only the transition probabilities between the milestones, not the transition times. We discuss the following numerical examples: (i) a transition in the Mueller potential; (ii) a conformational change of a solvated peptide; and (iii) cholesterol aggregation in membranes. Full article
(This article belongs to the Special Issue Understanding Molecular Dynamics via Stochastic Processes)
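The committor function described above can be recovered from milestone transition probabilities alone by solving a linear fixed-point equation with absorbing boundaries. A minimal sketch, assuming a row-stochastic transition matrix K between milestones (an illustration of the idea, not the authors' Milestoning implementation):

```python
def committor(K, reactant, product, iters=10000, tol=1e-12):
    """Committor q[i] = P(reach `product` before `reactant` from milestone i),
    found by fixed-point iteration of q = K q with absorbing boundaries."""
    n = len(K)
    q = [0.0] * n
    q[product] = 1.0
    for _ in range(iters):
        new = []
        for i in range(n):
            if i == reactant:
                new.append(0.0)
            elif i == product:
                new.append(1.0)
            else:
                new.append(sum(K[i][j] * q[j] for j in range(n)))
        if max(abs(a - b) for a, b in zip(new, q)) < tol:
            q = new
            break
        q = new
    return q

# Toy example: five milestones on a line, unbiased nearest-neighbour hopping.
K = [[0.0] * 5 for _ in range(5)]
for i in range(1, 4):
    K[i][i - 1] = K[i][i + 1] = 0.5
q = committor(K, reactant=0, product=4)
```

For the unbiased chain the committor is linear in the milestone index, so q rises from 0 at the reactant to 1 at the product in equal steps.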
Open AccessArticle A Functorial Construction of Quantum Subtheories
Entropy 2017, 19(5), 220; doi:10.3390/e19050220
Received: 2 March 2017 / Revised: 3 May 2017 / Accepted: 4 May 2017 / Published: 11 May 2017
PDF Full-text (379 KB) | HTML Full-text | XML Full-text
Abstract
We apply the geometric quantization procedure via symplectic groupoids to the setting of epistemically-restricted toy theories formalized by Spekkens (Spekkens, 2016). In the continuous degrees of freedom, this produces the algebraic structure of quadrature quantum subtheories. In the odd-prime finite degrees of freedom, we obtain a functor from the Frobenius algebra of the toy theories to the Frobenius algebra of stabilizer quantum mechanics. Full article
(This article belongs to the Special Issue Quantum Mechanics: From Foundations to Information Technologies)
Open AccessArticle Muscle Fatigue Analysis of the Deltoid during Three Head-Related Static Isometric Contraction Tasks
Entropy 2017, 19(5), 221; doi:10.3390/e19050221
Received: 27 March 2017 / Revised: 5 May 2017 / Accepted: 9 May 2017 / Published: 11 May 2017
PDF Full-text (2200 KB) | HTML Full-text | XML Full-text
Abstract
This study aimed to investigate the fatiguing characteristics of muscle-tendon units (MTUs) within skeletal muscles during static isometric contraction tasks. The deltoid was selected as the target muscle, and three head-related static isometric contraction tasks were designed to activate the three heads of the deltoid in different modes. Nine male subjects participated in this study. Surface electromyography (SEMG) signals were collected synchronously from the three heads of the deltoid. The performance of five SEMG parameters in quantifying fatigue, namely root mean square (RMS), mean power frequency (MPF), the first coefficient of an autoregressive model (ARC1), sample entropy (SE) and Higuchi's fractal dimension (HFD), was first evaluated in terms of the sensitivity-to-variability ratio (SVR) and consistency. The HFD parameter was then selected as the fatigue index for further muscle fatigue analysis. The experimental results demonstrated that the three deltoid heads presented different activation modes during the three head-related fatiguing contractions. The fatiguing characteristics of the three heads were found to be task-dependent, and heads kept at a relatively high activation level were more prone to fatigue. In addition, the differences in fatiguing rate between heads increased with load. The findings of this study can be helpful in better understanding the underlying neuromuscular control strategies of the central nervous system (CNS). Based on the results of this study, the CNS is thought to control the contraction of the deltoid by taking the three heads as functional units, though a certain synergy among heads might also exist to accomplish a contraction task. Full article
(This article belongs to the Special Issue Information Theory Applied to Physiological Signals)
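Higuchi's fractal dimension, used above as the fatigue index, can be estimated directly from a raw signal. A hedged sketch of the standard Higuchi (1988) procedure (the kmax value is an illustrative choice, not the authors' setting):

```python
import math

def higuchi_fd(x, kmax=8):
    """Higuchi's fractal dimension of a 1-D series: the slope of
    log L(k) versus log(1/k), where L(k) is the mean curve length
    of the series subsampled at interval k."""
    N = len(x)
    logk, logL = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            n_steps = (N - 1 - m) // k
            if n_steps < 1:
                continue
            dist = sum(abs(x[m + i * k] - x[m + (i - 1) * k])
                       for i in range(1, n_steps + 1))
            # Normalisation maps the subsampled length back to the
            # scale of the full series.
            lengths.append(dist * (N - 1) / (n_steps * k * k))
        logk.append(math.log(1.0 / k))
        logL.append(math.log(sum(lengths) / len(lengths)))
    # Least-squares slope of log L(k) against log(1/k).
    n = len(logk)
    mx, my = sum(logk) / n, sum(logL) / n
    return (sum((a - mx) * (b - my) for a, b in zip(logk, logL))
            / sum((a - mx) ** 2 for a in logk))

line = [float(i) for i in range(200)]   # a straight line has dimension 1
```

A smooth, regular signal yields a dimension near 1; increasingly irregular (fatigue-related) signals push the estimate toward 2.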
Open AccessArticle Prediction and Evaluation of Zero Order Entropy Changes in Grammar-Based Codes
Entropy 2017, 19(5), 223; doi:10.3390/e19050223
Received: 30 January 2017 / Revised: 9 May 2017 / Accepted: 10 May 2017 / Published: 13 May 2017
PDF Full-text (398 KB) | HTML Full-text | XML Full-text
Abstract
The change of zero-order entropy is studied over different strategies of grammar production rule selection. Two major types of rules are distinguished: transformations that leave the message size intact and substitution functions that change the message size. Relations for the change of zero-order entropy are derived for both cases, and conditions under which the entropy decreases are described. In this article, several greedy strategies that reduce zero-order entropy as well as message size are summarized, and the new strategy MinEnt is proposed. The resulting evolution of the zero-order entropy is compared with the strategy of selecting the most frequent digram, as used in the Re-Pair algorithm. Full article
(This article belongs to the Section Information Theory)
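The digram-substitution step and the zero-order entropy it changes can be sketched as follows. This is a simplified illustration (overlapping digram counting and a single left-to-right substitution pass), not the MinEnt strategy itself:

```python
import math
from collections import Counter

def zero_order_entropy(msg):
    """Zero-order (per-symbol) entropy of a sequence, in bits."""
    counts = Counter(msg)
    n = len(msg)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def most_frequent_digram(msg):
    """A simplified version of the Re-Pair selection rule:
    the most frequent adjacent pair (overlapping count)."""
    return Counter(zip(msg, msg[1:])).most_common(1)[0][0]

def substitute(msg, digram, new_symbol):
    """Replace non-overlapping occurrences of `digram` with a fresh
    nonterminal, shrinking the message."""
    out, i = [], 0
    while i < len(msg):
        if i + 1 < len(msg) and (msg[i], msg[i + 1]) == digram:
            out.append(new_symbol)
            i += 2
        else:
            out.append(msg[i])
            i += 1
    return out

msg = list("abababcabab")
d = most_frequent_digram(msg)        # the pair ('a', 'b')
reduced = substitute(msg, d, "R")    # one grammar production: R -> ab
```

One substitution pass both shrinks the message and, here, lowers its zero-order entropy, which is exactly the trade-off the selection strategies above navigate.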
Open AccessArticle Classification of Fractal Signals Using Two-Parameter Non-Extensive Wavelet Entropy
Entropy 2017, 19(5), 224; doi:10.3390/e19050224
Received: 3 March 2017 / Revised: 28 April 2017 / Accepted: 9 May 2017 / Published: 15 May 2017
PDF Full-text (1217 KB) | HTML Full-text | XML Full-text
Abstract
This article proposes a methodology for the classification of fractal signals as stationary or nonstationary. The methodology is based on the theoretical behavior of two-parameter wavelet entropy of fractal signals. The wavelet (q, q′)-entropy is a wavelet-based extension of the (q, q′)-entropy of Borges and is based on the entropy planes for various q and q′; it is theoretically shown that it constitutes an efficient and effective technique for fractal signal classification. Moreover, the second parameter q′ provides further analysis flexibility and robustness in the sense that different (q, q′) pairs can analyze the same phenomena and increase the range of dispersion of entropies. A comparison study against the standard signal summation conversion technique shows that the proposed methodology is not only comparable in accuracy but also more computationally efficient. The application of the proposed methodology to physiological and financial time series is also presented along with the classification of these as stationary or nonstationary. Full article
(This article belongs to the Special Issue Wavelets, Fractals and Information Theory II)
Open AccessArticle Entropy Information of Cardiorespiratory Dynamics in Neonates during Sleep
Entropy 2017, 19(5), 225; doi:10.3390/e19050225
Received: 30 March 2017 / Revised: 11 May 2017 / Accepted: 12 May 2017 / Published: 15 May 2017
PDF Full-text (724 KB) | HTML Full-text | XML Full-text
Abstract
Sleep is a central activity in human adults and characterizes most of newborn infant life. During sleep, autonomic control acts to modulate heart rate variability (HRV) and respiration. Mechanisms underlying cardiorespiratory interactions in different sleep states have been studied but are not yet fully understood. This manuscript proposes to analyze heart rate (HR), respiratory variability and their interrelationship in newborn infants to characterize cardiorespiratory interactions in different sleep states (active vs. quiet). We are searching for indices that could detect regulation alteration or malfunction, potentially leading to infant distress. We have analyzed inter-beat (RR) interval series and respiration in a population of 151 newborns, and followed up with 33 at 1 month of age. RR interval series were obtained by recognizing peaks of the QRS complex in the electrocardiogram (ECG), corresponding to ventricular depolarization. Univariate time domain, frequency domain and entropy measures were applied. In addition, Transfer Entropy was considered as a bivariate approach able to quantify the bidirectional information flow from one signal (respiration) to another (RR series). Results confirm the validity of the proposed approach. Overall, HRV is higher in active sleep, while high frequency (HF) power characterizes quiet sleep more. Entropy analysis provides higher indices for SampEn and Quadratic Sample Entropy (QSE) in quiet sleep. Transfer Entropy values were higher in quiet sleep and point to a major influence of respiration on the RR series. At 1 month of age, time domain parameters show an increase in HR and a decrease in variability. No entropy differences were found across ages. The parameters employed in this study help to quantify the potential for infants to adapt their cardiorespiratory responses as they mature. Thus, they could be useful as early markers of risk for infant cardiorespiratory vulnerabilities. Full article
(This article belongs to the Special Issue Entropy and Sleep Disorders)
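Sample entropy, one of the univariate measures applied above, can be sketched as follows. The parameters m and r are illustrative (r is an absolute tolerance here, whereas in practice it is usually set relative to the signal's standard deviation), and this is not the authors' exact implementation:

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B): B counts pairs of length-m templates that
    match within tolerance r (Chebyshev distance), A the same for length
    m+1. Self-matches are excluded."""
    n = len(x)

    def count_matches(length):
        templates = [x[i:i + length] for i in range(n - length + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits

    B = count_matches(m)
    A = count_matches(m + 1)
    return -math.log(A / B)

regular = [0.0, 1.0] * 40                       # perfectly predictable
irregular = [math.sin(i * i) for i in range(80)]  # aperiodic, noise-like
```

A regular, predictable series yields SampEn near zero, while an irregular one yields a markedly higher value, which is the contrast exploited to separate sleep states.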
Open AccessArticle Information Entropy and Measures of Market Risk
Entropy 2017, 19(5), 226; doi:10.3390/e19050226
Received: 29 March 2017 / Revised: 8 May 2017 / Accepted: 11 May 2017 / Published: 16 May 2017
PDF Full-text (1818 KB) | HTML Full-text | XML Full-text
Abstract
In this paper we investigate the relationship between the information entropy of the distribution of intraday returns and intraday and daily measures of market risk. Using data on the EUR/JPY exchange rate, we find a negative relationship between entropy and intraday Value-at-Risk, and also between entropy and intraday Expected Shortfall. This relationship is then used to forecast daily Value-at-Risk, using the entropy of the distribution of intraday returns as a predictor. Full article
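The two quantities related above, the entropy of the return distribution and Value-at-Risk, can each be estimated empirically. A minimal sketch using a histogram entropy and an empirical-quantile VaR (illustrative only, not the authors' estimation procedure):

```python
import math

def shannon_entropy(returns, bins=10):
    """Shannon entropy (bits) of the histogram of returns."""
    lo, hi = min(returns), max(returns)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for r in returns:
        counts[min(int((r - lo) / width), bins - 1)] += 1
    n = len(returns)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def value_at_risk(returns, alpha=0.05):
    """Empirical VaR: the loss at the alpha-quantile of the returns."""
    s = sorted(returns)
    k = max(int(alpha * len(s)) - 1, 0)
    return -s[k]

returns = [i / 1000 for i in range(-10, 10)]   # toy intraday returns
H = shannon_entropy(returns)
var95 = value_at_risk(returns, alpha=0.05)
```

For the evenly spread toy returns the histogram is uniform, so the entropy reaches its maximum log2(bins), and the 95% VaR is the worst loss in the sample.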
Open AccessArticle Ion Hopping and Constrained Li Diffusion Pathways in the Superionic State of Antifluorite Li2O
Entropy 2017, 19(5), 227; doi:10.3390/e19050227
Received: 21 March 2017 / Revised: 25 April 2017 / Accepted: 15 May 2017 / Published: 18 May 2017
PDF Full-text (2600 KB) | HTML Full-text | XML Full-text
Abstract
Li2O belongs to the family of antifluorites that show superionic behavior at high temperatures. While some of the superionic characteristics of Li2O are well-known, the mechanistic details of the ionic conduction processes are somewhat nebulous. In this work, we first establish an onset of superionic conduction that is emblematic of a gradual disordering process among the Li ions at a characteristic temperature Tα (~1000 K), using reported neutron diffraction data and atomistic simulations. In the superionic state, the Li ions are observed to exhibit dynamic disorder by hopping between the tetrahedral lattice sites. We then show that string-like ionic diffusion pathways are established among the Li ions in the superionic state. The diffusivity of these dynamical string-like structures, which have a finite lifetime, shows a remarkable correlation to the bulk diffusivity of the system. Full article
(This article belongs to the Special Issue Understanding Molecular Dynamics via Stochastic Processes)
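Bulk diffusivity of the kind correlated above is commonly obtained from the mean-squared displacement via the Einstein relation, D = MSD/(2dt) in d dimensions. A toy 1-D sketch with a lattice random walk standing in for Li-ion site hops (illustrative only, not the simulation protocol of the paper):

```python
import random

def mean_squared_displacement(trajectories, t):
    """Ensemble-averaged MSD at time t for 1-D trajectories."""
    return sum((traj[t] - traj[0]) ** 2 for traj in trajectories) / len(trajectories)

random.seed(7)
walks = []
for _ in range(2000):
    pos, traj = 0, [0]
    for _ in range(100):
        pos += random.choice((-1, 1))   # unit hop, stand-in for a Li site hop
        traj.append(pos)
    walks.append(traj)

t = 100
msd = mean_squared_displacement(walks, t)
D = msd / (2 * t)   # Einstein relation in one dimension: D = MSD / (2 t)
```

For an unbiased unit-step walk the MSD grows as t, so the recovered diffusivity is close to 0.5 in these lattice units.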
Open AccessArticle Face Verification with Multi-Task and Multi-Scale Feature Fusion
Entropy 2017, 19(5), 228; doi:10.3390/e19050228
Received: 18 March 2017 / Revised: 5 May 2017 / Accepted: 13 May 2017 / Published: 17 May 2017
PDF Full-text (2416 KB) | HTML Full-text | XML Full-text
Abstract
Face verification for unrestricted faces in the wild is a challenging task. This paper proposes a method based on two deep convolutional neural networks (CNNs) for face verification. In this work, we explore using identification signals to supervise one CNN and the combination of semi-verification and identification to train the other. In order to estimate the semi-verification loss at a low computational cost, a circle composed of all faces is used for selecting face pairs from pairwise samples. In the process of face normalization, we propose using different landmarks of faces to solve the problems caused by poses. In addition, the final face representation is formed by concatenating the features of each deep CNN after principal component analysis (PCA) reduction. Furthermore, each feature is a combination of multi-scale representations obtained by making use of auxiliary classifiers. For the final verification, we adopt only the face representation of one region and one resolution of a face, combined with a Joint Bayesian classifier. Experiments show that our method can extract effective face representations with a small training dataset, and our algorithm achieves 99.71% verification accuracy on the Labeled Faces in the Wild (LFW) dataset. Full article
(This article belongs to the Special Issue Information Theory in Machine Learning and Data Science)
Open AccessArticle Investigation of the Intra- and Inter-Limb Muscle Coordination of Hands-and-Knees Crawling in Human Adults by Means of Muscle Synergy Analysis
Entropy 2017, 19(5), 229; doi:10.3390/e19050229
Received: 27 March 2017 / Revised: 25 April 2017 / Accepted: 15 May 2017 / Published: 17 May 2017
PDF Full-text (3987 KB) | HTML Full-text | XML Full-text
Abstract
To investigate the intra- and inter-limb muscle coordination mechanism of human hands-and-knees crawling by means of muscle synergy analysis, surface electromyographic (sEMG) signals of 20 human adults were collected bilaterally from 32 limb-related muscles during crawling with hands and knees at different speeds. The nonnegative matrix factorization (NMF) algorithm was applied to the data of each limb to extract muscle synergies. The results showed that intra-limb coordination was relatively stable during human hands-and-knees crawling. Two synergies, one relating to the stance phase and the other relating to the swing phase, could be extracted from each limb during a crawling cycle. Synergy structures at different speeds showed good consistency, but the recruitment levels, durations, and phases of muscle synergies were adjusted to adapt to changes in crawling speed. Furthermore, the ipsilateral phase lag (IPL) value, which was used to depict inter-limb coordination, changed with crawling speed for most subjects, and subjects using the no-limb-pairing mode at low speed tended to adopt the trot-like mode or pace-like mode at high speed. The research results can be well explained by the two-level central pattern generator (CPG) model consisting of a half-center rhythm generator (RG) and a pattern formation (PF) circuit. This study sheds light on the underlying control mechanism of human crawling. Full article
(This article belongs to the Special Issue Information Theory Applied to Physiological Signals)
Open AccessArticle Specific and Complete Local Integration of Patterns in Bayesian Networks
Entropy 2017, 19(5), 230; doi:10.3390/e19050230
Received: 19 March 2017 / Revised: 11 May 2017 / Accepted: 12 May 2017 / Published: 18 May 2017
PDF Full-text (1119 KB) | HTML Full-text | XML Full-text
Abstract
We present a first formal analysis of specific and complete local integration. Complete local integration was previously proposed as a criterion for detecting entities or wholes in distributed dynamical systems. Such entities in turn were conceived to form the basis of a theory of emergence of agents within dynamical systems. Here, we give a more thorough account of the underlying formal measures. The main contribution is the disintegration theorem which reveals a special role of completely locally integrated patterns (what we call ι-entities) within the trajectories they occur in. Apart from proving this theorem we introduce the disintegration hierarchy and its refinement-free version as a way to structure the patterns in a trajectory. Furthermore, we construct the least upper bound and provide a candidate for the greatest lower bound of specific local integration. Finally, we calculate the ι-entities in small example systems as a first sanity check and find that ι-entities largely fulfil simple expectations. Full article
(This article belongs to the Special Issue Complexity, Criticality and Computation (C³))
Open AccessArticle A Novel Faults Diagnosis Method for Rolling Element Bearings Based on EWT and Ambiguity Correlation Classifiers
Entropy 2017, 19(5), 231; doi:10.3390/e19050231
Received: 20 March 2017 / Revised: 6 May 2017 / Accepted: 15 May 2017 / Published: 18 May 2017
PDF Full-text (3529 KB) | HTML Full-text | XML Full-text
Abstract
Owing to the non-stationary characteristics of the acoustic emission signals of rolling element bearings, a novel fault diagnosis method based on empirical wavelet transform (EWT) and ambiguity correlation classification (ACC) is proposed. In the proposed method, the acoustic emission signal acquired from a one-channel sensor is first decomposed using the EWT method; the mutual information between the decomposed components and the original signal is then computed and used to extract the noiseless components and obtain the reconstructed signal. Afterwards, the ambiguity correlation classifier, which combines the advantages of ambiguity functions in processing non-stationary signals with those of correlation coefficients, is applied. Finally, multiple datasets of reconstructed signals for different operative conditions are fed to the ambiguity correlation classifier for training and testing. The proposed method was verified by experiments, and the results show that it can effectively diagnose three different operative conditions of rolling element bearings with higher detection rates than support vector machine and back-propagation (BP) neural network algorithms. Full article
(This article belongs to the Section Information Theory)
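The mutual-information ranking of decomposed components can be sketched with a simple 2-D histogram estimator. The bin count and test signals below are illustrative assumptions, not the authors' setup:

```python
import math

def mutual_information(x, y, bins=8):
    """Histogram estimate of I(X;Y) in bits for two equal-length series."""
    def bin_index(v, lo, hi):
        w = (hi - lo) / bins or 1.0
        return min(int((v - lo) / w), bins - 1)

    lx, hx, ly, hy = min(x), max(x), min(y), max(y)
    n = len(x)
    joint = {}
    for a, b in zip(x, y):
        key = (bin_index(a, lx, hx), bin_index(b, ly, hy))
        joint[key] = joint.get(key, 0) + 1
    # Marginal counts from the joint histogram.
    px, py = {}, {}
    for (i, j), c in joint.items():
        px[i] = px.get(i, 0) + c
        py[j] = py.get(j, 0) + c
    return sum(c / n * math.log2(c * n / (px[i] * py[j]))
               for (i, j), c in joint.items())

x = [math.sin(0.1 * i) for i in range(500)]
noise = [math.sin(i * i) for i in range(500)]   # effectively unrelated to x
mi_self = mutual_information(x, x)
mi_noise = mutual_information(x, noise)
```

A component identical to the original signal attains the maximum (its own binned entropy), while an unrelated component scores near zero, which is exactly the criterion used to keep or discard EWT components.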
Open AccessArticle A Kullback–Leibler View of Maximum Entropy and Maximum Log-Probability Methods
Entropy 2017, 19(5), 232; doi:10.3390/e19050232
Received: 2 March 2017 / Revised: 30 April 2017 / Accepted: 15 May 2017 / Published: 19 May 2017
PDF Full-text (940 KB) | HTML Full-text | XML Full-text
Abstract
Entropy methods enable a convenient general approach to constructing a probability distribution from partial information. The minimum cross-entropy principle selects the distribution that minimizes the Kullback–Leibler divergence subject to the given constraints. This general principle encompasses a wide variety of distributions, and generalizes other methods that have been proposed independently. There remains, however, some confusion about the breadth of entropy methods in the literature. In particular, the asymmetry of the Kullback–Leibler divergence provides two important special cases when the target distribution is uniform: the maximum entropy method and the maximum log-probability method. This paper compares the performance of both methods under a variety of conditions. We also examine a generalized maximum log-probability method as a further demonstration of the generality of the entropy approach. Full article
(This article belongs to the Section Information Theory)
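With a uniform target u over n outcomes, D(p‖u) = log n − H(p), so minimizing the divergence is exactly maximizing entropy. A minimal sketch of the maximum-entropy distribution under a mean constraint (the exponential family p_i ∝ exp(λ v_i), with the multiplier λ found by bisection), using Jaynes' classic die example; this is a generic illustration, not the paper's experimental setup:

```python
import math

def maxent_with_mean(values, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution over `values` with a prescribed mean:
    p_i proportional to exp(lam * v_i), with lam found by bisection
    (the constrained mean is monotone increasing in lam)."""
    def mean_for(lam):
        w = [math.exp(lam * v) for v in values]
        z = sum(w)
        return sum(v * wi for v, wi in zip(values, w)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * v) for v in values]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes' die: faces 1..6 constrained to have mean 4.5.
p = maxent_with_mean(range(1, 7), 4.5)
```

The resulting probabilities increase monotonically toward the high faces, tilting the uniform die just enough to satisfy the mean constraint.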
Open AccessArticle On Linear Coding over Finite Rings and Applications to Computing
Entropy 2017, 19(5), 233; doi:10.3390/e19050233
Received: 6 January 2017 / Revised: 24 April 2017 / Accepted: 15 May 2017 / Published: 20 May 2017
PDF Full-text (436 KB) | HTML Full-text | XML Full-text
Abstract
This paper presents a coding theorem for linear coding over finite rings, in the setting of the Slepian–Wolf source coding problem. This theorem covers corresponding achievability theorems of Elias (IRE Conv. Rec. 1955, 3, 37–46) and Csiszár (IEEE Trans. Inf. Theory 1982, 28, 585–592) for linear coding over finite fields as special cases. In addition, it is shown that, for any set of finite correlated discrete memoryless sources, there always exists a sequence of linear encoders over some finite non-field rings which achieves the data compression limit, the Slepian–Wolf region. Hence, the optimality problem regarding linear coding over finite non-field rings for data compression is settled in the affirmative with respect to existence. As an application, we address the problem of source coding for computing, where the decoder is interested in recovering a discrete function of the data generated and independently encoded by several correlated i.i.d. random sources. We propose linear coding over finite rings as an alternative solution to this problem. Results in Körner–Marton (IEEE Trans. Inf. Theory 1979, 25, 219–221) and Ahlswede–Han (IEEE Trans. Inf. Theory 1983, 29, 396–411, Theorem 10) are generalized to cases for encoding (pseudo) nomographic functions (over rings). Since a discrete function with a finite domain always admits a nomographic presentation, we conclude that both generalizations universally apply for encoding all discrete functions of finite domains. Based on these, we demonstrate that linear coding over finite rings strictly outperforms its field counterpart in terms of achieving better coding rates and reducing the required alphabet sizes of the encoders for encoding infinitely many discrete functions. Full article
(This article belongs to the Special Issue Network Information Theory)
Open AccessArticle The Particle as a Statistical Ensemble of Events in Stueckelberg–Horwitz–Piron Electrodynamics
Entropy 2017, 19(5), 234; doi:10.3390/e19050234
Received: 8 March 2017 / Revised: 15 May 2017 / Accepted: 17 May 2017 / Published: 19 May 2017
PDF Full-text (263 KB) | HTML Full-text | XML Full-text
Abstract
In classical Maxwell electrodynamics, charged particles following deterministic trajectories are described by currents that induce fields, mediating interactions with other particles. Statistical methods are used when needed to treat complex particle and/or field configurations. In Stueckelberg–Horwitz–Piron (SHP) electrodynamics, the classical trajectories are traced out dynamically, through the evolution of a 4D spacetime event xμ(τ) as τ grows monotonically. Stueckelberg proposed to formalize the distinction between coordinate time x0 = ct (measured by laboratory clocks) and chronology τ (the temporal ordering of event occurrence) in order to describe antiparticles and resolve problems of irreversibility such as grandfather paradoxes. Consequently, in SHP theory, the elementary object is not a particle (a 4D curve in spacetime) but rather an event (a single point along the dynamically evolving curve). Following standard deterministic methods in classical relativistic field theory, one is led to Maxwell-like field equations that are τ-dependent and sourced by a current that represents a statistical ensemble of instantaneous events distributed along the trajectory. The width λ of this distribution defines a correlation time for the interactions and a mass spectrum for the photons emitted by particles. As λ becomes very large, the photon mass goes to zero and the field equations become the τ-independent Maxwell's equations. Maxwell theory thus emerges as an equilibrium limit of SHP, in which λ is larger than any other relevant time scale. Thus, statistical mechanics is a fundamental ingredient in SHP electrodynamics, and its insights are required to give meaning to the concept of a particle. Full article
(This article belongs to the Section Statistical Mechanics)
Open AccessArticle Entropy in Investigation of Vasovagal Syndrome in Passive Head Up Tilt Test
Entropy 2017, 19(5), 236; doi:10.3390/e19050236
Received: 1 March 2017 / Revised: 15 May 2017 / Accepted: 16 May 2017 / Published: 20 May 2017
PDF Full-text (1984 KB) | HTML Full-text | XML Full-text
Abstract
This paper presents an application of Approximate Entropy (ApEn) and Sample Entropy (SampEn) in the analysis of heart rhythm, blood pressure and stroke volume for the diagnosis of vasovagal syndrome. The analyzed biosignals were recorded during positive passive tilt tests—HUTT(+). Signal changes and their entropy were compared in three main phases of the test: supine position, tilt, and pre-syncope, with special focus on the latter, which was analyzed in a sliding window of each signal. In some cases, ApEn and SampEn were equally useful for the assessment of signal complexity (p < 0.05 in corresponding calculations). The complexity of the signals was found to decrease in the pre-syncope phase (SampEn (RRI): 1.20–0.34, SampEn (sBP): 1.29–0.57, SampEn (dBP): 1.19–0.48, SampEn (SV): 1.62–0.91). The pattern of the SampEn (SV) decrease differs from the pattern of the SampEn (sBP), SampEn (dBP) and SampEn (RRI) decrease. For all signals, the lowest entropy values in the pre-syncope phase were observed at the moment when loss of consciousness occurred. Full article
(This article belongs to the Special Issue Entropy and Cardiac Physics II)
Open AccessArticle Can a Robot Have Free Will?
Entropy 2017, 19(5), 237; doi:10.3390/e19050237
Received: 27 February 2017 / Revised: 28 April 2017 / Accepted: 15 May 2017 / Published: 20 May 2017
PDF Full-text (552 KB) | HTML Full-text | XML Full-text
Abstract
Using insights from cybernetics and an information-based understanding of biological systems, a precise, scientifically inspired definition of free-will is offered and the essential requirements for an agent to possess it in principle are set out. These are: (a) there must be a self to self-determine; (b) there must be a non-zero probability of more than one option being enacted; (c) there must be an internal means of choosing among options (which is not merely random, since randomness is not a choice). For (a) to be fulfilled, the agent of self-determination must be organisationally closed (a "Kantian whole"). For (c) to be fulfilled: (d) options must be generated from an internal model of the self which can calculate future states contingent on possible responses; (e) choosing among these options requires their evaluation using an internally generated goal defined on an objective function representing the overall "master function" of the agent; and (f) for "deep free-will", at least two nested levels of choice and goal (d–e) must be enacted by the agent. The agent must also be able to enact its choice in physical reality. The only systems known to meet all these criteria are living organisms: not just humans, but a wide range of organisms. The main impediment to free-will in present-day artificial robots is their lack of being a Kantian whole. Consciousness does not seem to be a requirement, and the minimum complexity for a free-will system may be quite low, including relatively simple life-forms that are at least able to learn. Full article
(This article belongs to the Special Issue Complexity, Criticality and Computation (C³))

Open AccessArticle Lyapunov Spectra of Coulombic and Gravitational Periodic Systems
Entropy 2017, 19(5), 238; doi:10.3390/e19050238
Received: 2 April 2017 / Revised: 15 May 2017 / Accepted: 15 May 2017 / Published: 20 May 2017
PDF Full-text (545 KB) | HTML Full-text | XML Full-text
Abstract
An open question in nonlinear dynamics is the relation between the Kolmogorov entropy and the largest Lyapunov exponent of a given orbit. Both have been shown to have diagnostic capability for phase transitions in thermodynamic systems. For systems with long-range interactions, the choice of boundary plays a critical role and appropriate boundary conditions must be invoked. In this work, we compute Lyapunov spectra for Coulombic and gravitational versions of the one-dimensional systems of parallel sheets with periodic boundary conditions. Exact expressions for the time evolution of the tangent-space vectors are derived and are utilized toward computing Lyapunov characteristic exponents using an event-driven algorithm. The results indicate that the energy dependence of the largest Lyapunov exponent emulates that of Kolmogorov entropy for each system for a given system size. Our approach forms an effective and approximation-free instrument for studying the dynamical properties exhibited by the Coulombic and gravitational systems and finds applications in investigating indications of thermodynamic transitions in small as well as large versions of the spatially periodic systems. When a phase transition exists, we find that the largest Lyapunov exponent serves as a precursor of the transition that becomes more pronounced as the system size increases. Full article
(This article belongs to the Special Issue Thermodynamics and Statistical Mechanics of Small Systems)

Open AccessArticle Assessing Catchment Resilience Using Entropy Associated with Mean Annual Runoff for the Upper Vaal Catchment in South Africa
Entropy 2017, 19(5), 147; doi:10.3390/e19050147
Received: 19 October 2016 / Revised: 2 March 2017 / Accepted: 20 March 2017 / Published: 27 April 2017
PDF Full-text (2665 KB) | HTML Full-text | XML Full-text
Abstract
Mean annual runoff (MAR) is a hydrological variable of paramount importance for catchment planning, development and management. MAR depicts the amount of uncertainty or chaos (implicitly, information content) of the catchment. The uncertainty associated with the MAR of quaternary catchments (QCs) in the Upper Vaal catchment of South Africa has been quantified through Shannon entropy. As a result of chaos over a period of time, the hydrological catchment behavior/response in terms of MAR could be characterized by its resilience. Uncertainty (chaos) in QCs was used as a surrogate measure of catchment resilience. MAR data on surface water resources (WR) of South Africa of 1990 (i.e., WR90), 2005 (WR2005) and 2012 (WR2012) were used in this study. A linear zoning for catchment resilience in terms of water resources sustainability was defined. Regression models (with high correlation) between the relative changes/variations in MAR data sets and relative changes in entropy were established for WR2005 and WR2012. These models were compared with similar relationships for WR90 and WR2005, previously reported. The MAR pseudo-elasticity of the uncertainty associated with MAR was derived from the regression models to characterize the resilience state of QCs. The MAR pseudo-elasticity values were relatively small, indicating an acceptable level of catchment resilience in the Upper Vaal catchment. Within the resilience zone, it was also shown that the effect of mean annual evaporation (MAE) on MAR pseudo-elasticity was negative and significant, while the effect of mean annual precipitation (MAP) was positive but insignificant. Full article
(This article belongs to the Special Issue Entropy for Sustainable and Resilient Urban Future)
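The entropy computation at the heart of this approach is standard Shannon entropy over normalised runoff shares. A minimal sketch (the catchment values below are hypothetical, not taken from the WR data sets):

```python
import math

def shannon_entropy(runoffs):
    """Shannon entropy (bits) of the runoff distribution across catchments.

    The MAR values are normalised into shares first; an even spread of runoff
    gives maximal entropy, concentration in one catchment gives low entropy.
    """
    total = sum(runoffs)
    shares = [r / total for r in runoffs if r > 0]
    return sum(-p * math.log2(p) for p in shares)

# Hypothetical MAR values (million m^3/a) for four quaternary catchments.
even = shannon_entropy([25.0, 25.0, 25.0, 25.0])
skewed = shannon_entropy([97.0, 1.0, 1.0, 1.0])
print(even)           # 2.0, the maximum for four catchments
print(even > skewed)  # True: concentration lowers the entropy
```

Comparing the entropy of successive data sets (WR90, WR2005, WR2012) then gives the relative entropy changes that the regression models relate to relative changes in MAR.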

Open AccessArticle En-LDA: An Novel Approach to Automatic Bug Report Assignment with Entropy Optimized Latent Dirichlet Allocation
Entropy 2017, 19(5), 173; doi:10.3390/e19050173
Received: 6 February 2017 / Revised: 13 April 2017 / Accepted: 14 April 2017 / Published: 25 April 2017
PDF Full-text (362 KB) | HTML Full-text | XML Full-text
Abstract
With the increasing number of bug reports coming into the open bug repository, it is impossible to triage bug reports manually by software managers. This paper proposes a novel approach called En-LDA (Entropy optimized Latent Dirichlet Allocation (LDA)) for automatic bug report assignment. Specifically, we propose entropy to optimize the number of topics of the LDA model and further use the entropy optimized LDA to capture the expertise and interest of developers in bug resolution. A developer’s interest in a topic is modeled by the number of the developer’s comments on bug reports of the topic divided by the number of all the developer’s comments. A developer’s expertise in a topic is modeled by the number of the developer’s comments on bug reports of the topic divided by the number of all developers’ comments on the topic. Given a new bug report, En-LDA recommends a ranked list of developers who are potentially adequate to resolve the new bug. Experiments on Eclipse JDT and Mozilla Firefox projects show that En-LDA can achieve high recall up to 84% and 58%, and precision up to 28% and 41%, respectively, which indicates promising aspects of the proposed approach. Full article
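The interest and expertise measures described in the abstract are simple ratios of comment counts, which can be sketched directly. This is an illustrative sketch on hypothetical data; the LDA step that assigns bug reports to topics is assumed to have been done already:

```python
from collections import Counter

def developer_profiles(comments):
    """Interest and expertise scores from (developer, topic) comment pairs.

    interest(d, t)  = comments by d on topic t / all comments by d
    expertise(d, t) = comments by d on topic t / all comments on topic t
    """
    by_dev_topic = Counter(comments)
    by_dev = Counter(dev for dev, _ in comments)
    by_topic = Counter(topic for _, topic in comments)
    interest = {(d, t): c / by_dev[d] for (d, t), c in by_dev_topic.items()}
    expertise = {(d, t): c / by_topic[t] for (d, t), c in by_dev_topic.items()}
    return interest, expertise

# Hypothetical comment log: one entry per comment on a bug report of a topic.
log = [("alice", "ui"), ("alice", "ui"), ("alice", "net"), ("bob", "ui")]
interest, expertise = developer_profiles(log)
print(interest[("alice", "ui")])   # 2 of alice's 3 comments concern "ui"
print(expertise[("bob", "ui")])    # 1 of the 3 "ui" comments is bob's
```

Ranking developers for a new bug report then amounts to scoring each developer against the topic distribution LDA infers for the report.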

Open AccessArticle Recognition of Traveling Surges in HVDC with Wavelet Entropy
Entropy 2017, 19(5), 184; doi:10.3390/e19050184
Received: 4 February 2017 / Revised: 5 April 2017 / Accepted: 21 April 2017 / Published: 26 April 2017
PDF Full-text (1338 KB) | HTML Full-text | XML Full-text
Abstract
Traveling surges are commonly adopted in protection devices of high-voltage direct current (HVDC) transmission systems. Lightning strikes can also produce large-amplitude traveling surges that lead to the malfunction of relays. To ensure the reliable operation of protection devices, recognition of traveling surges must be considered. Wavelet entropy, which can reveal time-frequency distribution features, is a potential tool for traveling surge recognition. In this paper, the effectiveness of wavelet entropy in characterizing traveling surges is demonstrated by comparing its representations of different kinds of surges and discussing its stability with respect to the effects of propagation distance and fault resistance. A wavelet entropy-based recognition method is proposed and tested on simulated traveling surges. The results show that wavelet entropy can discriminate fault traveling surges with a good recognition rate. Full article
(This article belongs to the Special Issue Entropy in Signal Analysis)

Open AccessArticle A Quantum Description of the Stern–Gerlach Experiment
Entropy 2017, 19(5), 186; doi:10.3390/e19050186
Received: 16 February 2017 / Revised: 19 April 2017 / Accepted: 20 April 2017 / Published: 25 April 2017
PDF Full-text (1002 KB) | HTML Full-text | XML Full-text
Abstract
A detailed analysis of the classic Stern–Gerlach experiment is presented. A simple analytical solution is given for the quantum description of the translational and spin dynamics of a silver atom in a magnetic field with a gradient along a single z-direction. This description is then used to obtain an approximate quantum description of the more realistic case with a magnetic field gradient also in a second y-direction. An explicit relation is derived for how an initial off-center deviation in the y-direction affects the final result observed at the detector. This shows that the “mouth shape” pattern at the detector observed in the original Stern–Gerlach experiment is a generic consequence of the gradient in the y-direction. This is followed by a discussion of the spin dynamics during the entry of the silver atom into the magnet. An analytical relation is derived for a simplified case of a field only along the z-direction. A central question for the conceptual understanding of the Stern–Gerlach experiment has been how an initially unpolarized spin ends up in a polarized state at the detector. It is argued that this can be understood with the use of the adiabatic approximation. When the atoms first experience the magnetic field outside the magnet, there is in general a change in the spin state, which transforms from a degenerate eigenstate in the absence of a field into one of two possible non-degenerate states in the field. If the direction of the field changes during the passage through the device, there is a corresponding adiabatic change of the spin state. It is shown that an application of the adiabatic approximation in this way is consistent with the previously derived exact relations. Full article
(This article belongs to the Special Issue Foundations of Quantum Mechanics)

Open AccessArticle Multicomponent and Longitudinal Imaging Seen as a Communication Channel—An Application to Stroke
Entropy 2017, 19(5), 187; doi:10.3390/e19050187
Received: 10 March 2017 / Revised: 18 April 2017 / Accepted: 24 April 2017 / Published: 26 April 2017
PDF Full-text (1140 KB) | HTML Full-text | XML Full-text
Abstract
In longitudinal medical studies, multicomponent images of the tissues, acquired at a given stage of a disease, are used to provide information on the fate of the tissues. We propose a quantification of the predictive value of multicomponent images using information theory. To this end, we revisit the predictive information introduced for monodimensional time series and extend it to multicomponent images. The interest of this theoretical approach is illustrated on multicomponent magnetic resonance images acquired on stroke patients at acute and late stages, for which we propose an original and realistic model of noise together with a spatial encoding for the images. We thereby address very practical questions such as the impact of noise on predictability, the optimal choice of an observation scale and the predictability gain brought by the addition of imaging components. Full article
(This article belongs to the Section Information Theory)

Open AccessArticle When the Map Is Better Than the Territory
Entropy 2017, 19(5), 188; doi:10.3390/e19050188
Received: 13 March 2017 / Revised: 17 April 2017 / Accepted: 21 April 2017 / Published: 26 April 2017
PDF Full-text (1815 KB) | HTML Full-text | XML Full-text
Abstract
The causal structure of any system can be analyzed at a multitude of spatial and temporal scales. It has long been thought that while higher scale (macro) descriptions may be useful to observers, they are at best a compressed description and at worst leave out critical information and causal relationships. However, recent research applying information theory to causal analysis has shown that the causal structure of some systems can actually come into focus and be more informative at a macroscale. That is, a macroscale description of a system (a map) can be more informative than a fully detailed microscale description of the system (the territory). This has been called “causal emergence.” While causal emergence may at first seem counterintuitive, this paper grounds the phenomenon in a classic concept from information theory: Shannon’s discovery of the channel capacity. I argue that systems have a particular causal capacity, and that different descriptions of those systems take advantage of that capacity to various degrees. For some systems, only macroscale descriptions use the full causal capacity. These macroscales can either be coarse-grains, or may leave variables and states out of the model (exogenous, or “black boxed”) in various ways, which can improve the efficacy and informativeness via the same mathematical principles by which error-correcting codes take advantage of an information channel’s capacity. The causal capacity of a system can approach the channel capacity as more and different kinds of macroscales are considered. Ultimately, this provides a general framework for understanding how the causal structure of some systems cannot be fully captured by even the most detailed microscale description. Full article
(This article belongs to the Section Complexity)

Open AccessArticle Entropy-Based Parameter Estimation for the Four-Parameter Exponential Gamma Distribution
Entropy 2017, 19(5), 189; doi:10.3390/e19050189
Received: 6 March 2017 / Revised: 4 April 2017 / Accepted: 21 April 2017 / Published: 26 April 2017
PDF Full-text (287 KB) | HTML Full-text | XML Full-text
Abstract
Two methods based on the principle of maximum entropy (POME), the ordinary entropy method (ENT) and the parameter space expansion method (PSEM), are developed for estimating the parameters of a four-parameter exponential gamma distribution. Using six data sets for annual precipitation at the Weihe River basin in China, the PSEM was applied for estimating parameters for the four-parameter exponential gamma distribution and was compared to the methods of moments (MOM) and of maximum likelihood estimation (MLE). It is shown that PSEM enables the four-parameter exponential gamma distribution to fit the data well, and can further improve the estimation. Full article
(This article belongs to the Special Issue Entropy Applications in Environmental and Water Engineering)
Open AccessArticle On the Anonymity Risk of Time-Varying User Profiles
Entropy 2017, 19(5), 190; doi:10.3390/e19050190
Received: 3 March 2017 / Revised: 12 April 2017 / Accepted: 24 April 2017 / Published: 26 April 2017
PDF Full-text (1055 KB) | HTML Full-text | XML Full-text
Abstract
Websites and applications use personalisation services to profile their users, collect their patterns and activities and eventually use this data to provide tailored suggestions. User preferences and social interactions are therefore aggregated and analysed. Every time a user publishes a new post or creates a link with another entity, either another user, or some online resource, new information is added to the user profile. Exposing private data not only reveals information about single users’ preferences, increasing their privacy risk, but can expose more about their network than single actors intended. This mechanism is self-evident in social networks where users receive suggestions based on their friends’ activities. We propose an information-theoretic approach to measure the differential update of the anonymity risk of time-varying user profiles. This expresses how privacy is affected when new content is posted and how much third-party services get to know about the users when a new activity is shared. We use actual Facebook data to show how our model can be applied to a real-world scenario. Full article
(This article belongs to the Section Information Theory)

Open AccessArticle Image Bi-Level Thresholding Based on Gray Level-Local Variance Histogram
Entropy 2017, 19(5), 191; doi:10.3390/e19050191
Received: 22 March 2017 / Revised: 21 April 2017 / Accepted: 24 April 2017 / Published: 26 April 2017
PDF Full-text (955 KB) | HTML Full-text | XML Full-text
Abstract
Thresholding is a popular method of image segmentation. Many thresholding methods utilize only the gray level information of pixels in the image, which may lead to poor segmentation performance because the spatial correlation information between pixels is ignored. To improve the performance of thresholding methods, a novel two-dimensional histogram—called the gray level-local variance (GLLV) histogram—is proposed in this paper as an entropic thresholding method to segment images with bimodal histograms. The GLLV histogram is constructed by using the gray level information of pixels and its local variance in a neighborhood. Local variance measures the dispersion of the gray level distribution of pixels in a neighborhood. If a pixel’s gray level is close to its neighboring pixels, its local variance is small, and vice versa. Therefore, local variance can reflect the spatial information between pixels. The GLLV histogram takes not only the gray level, but also the spatial information into consideration. Experimental results show that an entropic thresholding method based on the GLLV histogram can achieve better segmentation performance. Full article
(This article belongs to the Section Information Theory)
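The GLLV construction can be sketched as a naive two-pass computation: estimate each pixel's local variance over a square neighbourhood, then bin the (gray level, local variance) pairs. This is an illustrative sketch, not the authors' implementation, and it omits the entropic threshold selection performed on the histogram:

```python
import numpy as np

def gllv_histogram(img, radius=1, bins=16):
    """Joint (gray level, local variance) histogram, normalised to probabilities.

    Local variance is computed over a (2*radius+1) x (2*radius+1) neighbourhood,
    with edge padding so that every pixel receives a value.
    """
    img = np.asarray(img, dtype=float)
    padded = np.pad(img, radius, mode="edge")
    h, w = img.shape
    local_var = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            local_var[i, j] = patch.var()
    hist, _, _ = np.histogram2d(img.ravel(), local_var.ravel(), bins=bins)
    return hist / hist.sum()

# Toy image: a flat region next to a sharp edge yields two variance groups.
img = [[10, 10, 200], [10, 10, 200], [10, 10, 200]]
p = gllv_histogram(img, bins=4)
print(p.shape, round(p.sum(), 6))  # (4, 4) 1.0
```

A thresholding method would then search this 2D histogram for the threshold pair maximising an entropy criterion, rather than thresholding on gray level alone.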

Open AccessArticle Entropy in the Tangled Nature Model of Evolution
Entropy 2017, 19(5), 192; doi:10.3390/e19050192
Received: 25 February 2017 / Revised: 11 April 2017 / Accepted: 19 April 2017 / Published: 27 April 2017
PDF Full-text (848 KB) | HTML Full-text | XML Full-text
Abstract
Applications of entropy principles to evolution and ecology are of paramount importance given the central role spatiotemporal structuring plays in both evolution and ecological succession. We obtain here a qualitative interpretation of the role of entropy in evolving ecological systems. Our interpretation is supported by mathematical arguments using simulation data generated by the Tangled Nature Model (TNM), a stochastic model of evolving ecologies. We define two types of configurational entropy and study their empirical time dependence obtained from the data. Both entropy measures increase logarithmically with time, while the entropy per individual decreases in time, in parallel with the growth of emergent structures visible from other aspects of the simulation. We discuss the biological relevance of these entropies to describe niche space and functional space of ecosystems, as well as their use in characterizing the number of taxonomic configurations compatible with different niche partitioning and functionality. The TNM serves as an illustrative example of how to calculate and interpret these entropies, which are, however, also relevant to real ecosystems, where they can be used to calculate the number of functional and taxonomic configurations that an ecosystem can realize. Full article
(This article belongs to the Special Issue Entropy in Landscape Ecology)

Open AccessArticle Stochastic Stirling Engine Operating in Contact with Active Baths
Entropy 2017, 19(5), 193; doi:10.3390/e19050193
Received: 17 March 2017 / Revised: 10 April 2017 / Accepted: 21 April 2017 / Published: 27 April 2017
PDF Full-text (301 KB) | HTML Full-text | XML Full-text
Abstract
A Stirling engine made of a colloidal particle in contact with a nonequilibrium bath is considered and analyzed with the tools of stochastic energetics. We model the bath by non-Gaussian persistent noise acting on the colloidal particle. Depending on the chosen definition of an isothermal transformation in this nonequilibrium setting, we find that either the energetics of the engine parallels that of its equilibrium counterpart or, in the simplest case, that it ends up being less efficient. Persistence, more than non-Gaussian effects, is responsible for this result. Full article
(This article belongs to the Special Issue Thermodynamics and Statistical Mechanics of Small Systems)

Open AccessArticle Criticality and Information Dynamics in Epidemiological Models
Entropy 2017, 19(5), 194; doi:10.3390/e19050194
Received: 2 March 2017 / Revised: 24 April 2017 / Accepted: 25 April 2017 / Published: 27 April 2017
PDF Full-text (310 KB) | HTML Full-text | XML Full-text
Abstract
Understanding epidemic dynamics has always been a challenge. As witnessed from the ongoing Zika or the seasonal Influenza epidemics, we still need to improve our analytical methods to better understand and control epidemics. While the emergence of the complex sciences at the turn of the millennium has led to their use in modelling epidemics, there is still a need to improve our understanding of critical dynamics in epidemics. In this study, using agent-based modelling, we simulate a Susceptible-Infected-Susceptible (SIS) epidemic on a homogeneous network. We use transfer entropy and active information storage from the information dynamics framework to characterise the critical transition in epidemiological models. Our study shows that both (bias-corrected) transfer entropy and active information storage maximise after the critical threshold (R0 = 1). This is the first step toward an information dynamics approach to epidemics. Understanding the dynamics around criticality in epidemiological models can provide insights into emergent diseases and disease control. Full article
(This article belongs to the Special Issue Complexity, Criticality and Computation (C³))
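The SIS setting can be illustrated with a minimal stochastic simulation of a well-mixed population. This is a hedged stand-in for the paper's agent-based network model (parameter values are hypothetical), showing the critical threshold at R0 = 1:

```python
import random

def sis_epidemic(n=1000, r0=2.0, recovery=0.1, steps=500, seed=1):
    """Stochastic SIS dynamics in a well-mixed population of size n.

    Per-step infection pressure on each susceptible is beta * (I/n), with
    beta chosen so that R0 = beta / recovery. Returns the final fraction
    infected: above R0 = 1 it settles near the endemic level 1 - 1/R0,
    below it the epidemic dies out.
    """
    rng = random.Random(seed)
    beta = r0 * recovery
    infected = max(1, n // 100)  # seed the epidemic with 1% infected
    for _ in range(steps):
        frac = infected / n
        new_infections = sum(rng.random() < beta * frac for _ in range(n - infected))
        recoveries = sum(rng.random() < recovery for _ in range(infected))
        infected = min(n, max(0, infected + new_infections - recoveries))
    return infected / n

print(sis_epidemic(r0=2.0))  # fluctuates near the endemic level 1 - 1/2 = 0.5
```

The information-dynamics measures in the paper are computed on the per-agent infection time series around this transition, which this sketch does not reproduce.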

Open AccessArticle Symbolic Analysis of Brain Dynamics Detects Negative Stress
Entropy 2017, 19(5), 196; doi:10.3390/e19050196
Received: 3 March 2017 / Revised: 23 April 2017 / Accepted: 26 April 2017 / Published: 28 April 2017
PDF Full-text (5023 KB) | HTML Full-text | XML Full-text
Abstract
The electroencephalogram (EEG) is the most common tool used to study mental disorders. In recent years, the use of this recording for recognition of negative stress has been receiving growing attention. However, precise identification of this emotional state is still an interesting unsolved challenge. Nowadays, stress presents a high prevalence in developed countries and, moreover, its chronic condition often leads to concomitant physical and mental health problems. Recently, a measure of time series irregularity, such as quadratic sample entropy (QSEn), has been suggested as a promising single index for discerning between emotions of calm and stress. Unfortunately, this index only considers repetitiveness of similar patterns and, hence, it is unable to successfully quantify dynamics associated with the data’s temporal structure. With the aim of extending QSEn’s ability for identification of stress from the EEG signal, permutation entropy (PEn) and its amplitude-aware modification (AAPEn) have been analyzed in the present work. These metrics assess repetitiveness of ordinal patterns, thus considering causal information within each one of them and obtaining improved estimates of predictability. Results have shown that PEn and AAPEn present a discriminant power between emotional states of calm and stress similar to QSEn, i.e., around 65%. Additionally, they have also revealed dynamics complementary to those quantified by QSEn, thus suggesting a synchronized behavior between frontal and parietal counterparts from both hemispheres of the brain. More precisely, increased stress levels have resulted in activation of the left frontal and right parietal regions and, simultaneously, in relaxation of the right frontal and left parietal areas. Taking advantage of this brain behavior, a discriminant model based only on AAPEn and QSEn computed from the EEG channels P3 and P4 has reached a diagnostic accuracy greater than 80%, which improves slightly on the current state of the art. Moreover, because this classification system is notably simpler than others previously proposed, it could be used for continuous monitoring of negative stress, as well as for its regulation towards more positive moods in controlled environments. Full article
(This article belongs to the Special Issue Symbolic Entropy Analysis and Its Applications)

Open AccessArticle A New Kind of Permutation Entropy Used to Classify Sleep Stages from Invisible EEG Microstructure
Entropy 2017, 19(5), 197; doi:10.3390/e19050197
Received: 31 March 2017 / Revised: 21 April 2017 / Accepted: 26 April 2017 / Published: 28 April 2017
PDF Full-text (607 KB) | HTML Full-text | XML Full-text
Abstract
Permutation entropy and order patterns in an EEG signal have been applied by several authors to study sleep, anesthesia, and epileptic absences. Here, we discuss a new version of permutation entropy, which is interpreted as distance to white noise. It has a scale similar to the well-known χ² distributions and can be supported by a statistical model. Critical values for significance are provided. Distance to white noise is used as a parameter which measures depth of sleep, where the vigilant awake state of the human EEG is interpreted as “almost white noise”. Classification of sleep stages from EEG data usually relies on delta waves and graphic elements, which can be seen on a macroscale of several seconds. The distance to white noise can anticipate such emerging waves before they become apparent, evaluating invisible tendencies of variations within 40 milliseconds. Data segments of 30 s of high-resolution EEG provide a reliable classification. Application to the diagnosis of sleep disorders is indicated. Full article
(This article belongs to the Special Issue Entropy and Sleep Disorders)
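The ordinal patterns underlying this work are those of classic permutation entropy, which the proposed distance-to-white-noise statistic modifies. A minimal sketch of the classic quantity only (not the paper's new parameter):

```python
import math

def permutation_entropy(x, order=3):
    """Classic permutation entropy in bits: Shannon entropy of ordinal patterns."""
    counts = {}
    for i in range(len(x) - order + 1):
        # Ordinal pattern: the ranking of `order` consecutive samples.
        pattern = tuple(sorted(range(order), key=lambda k: x[i + k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

print(permutation_entropy([1, 2, 3, 4, 5, 6]))     # 0.0: a single ordinal pattern
print(permutation_entropy([1, 2, 1, 2, 1, 2, 1, 2]))  # 1.0: two patterns, equally likely
```

White noise makes all order! (= 6 for order 3) patterns equally likely, which is why the awake EEG, being "almost white noise", sits near the maximum of this scale.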

Open AccessArticle Cooperative Particle Filtering for Tracking ERP Subcomponents from Multichannel EEG
Entropy 2017, 19(5), 199; doi:10.3390/e19050199
Received: 12 January 2017 / Revised: 12 April 2017 / Accepted: 23 April 2017 / Published: 29 April 2017
PDF Full-text (5512 KB) | HTML Full-text | XML Full-text
Abstract
In this study, we propose a novel method to investigate P300 variability over different trials. The method incorporates spatial correlation between EEG channels to form a cooperative coupled particle filtering method that tracks the P300 subcomponents, P3a and P3b, over trials. Using state space systems, the amplitude, latency, and width of each subcomponent are modeled as the main underlying parameters. With four electrodes, two coupled Rao-Blackwellised particle filter pairs are used to recursively estimate the system state over trials. A number of physiological constraints are also imposed to avoid generating invalid particles in the estimation process. Motivated by the bilateral symmetry of ERPs over the brain, the channels further share their estimates with their neighbors and combine the received information to obtain a more accurate and robust solution. The proposed algorithm is capable of estimating the P300 subcomponents in single trials and outperforms its non-cooperative counterpart. Full article
(This article belongs to the Special Issue Entropy and Electroencephalography II)

Other

Jump to: Research

Open AccessConcept Paper About the Concept of Quantum Chaos
Entropy 2017, 19(5), 205; doi:10.3390/e19050205
Received: 5 February 2017 / Revised: 22 April 2017 / Accepted: 23 April 2017 / Published: 3 May 2017
PDF Full-text (364 KB) | HTML Full-text | XML Full-text
Abstract
The research on quantum chaos finds its roots in the study of the spectrum of complex nuclei in the 1950s and the pioneering experiments in microwave billiards in the 1970s. Since then, a large number of new results has been produced. Nevertheless, the work on the subject is, even at present, a superposition of several approaches expressed in different mathematical formalisms and weakly linked to each other. The purpose of this paper is to supply a unified framework for describing quantum chaos using the quantum ergodic hierarchy. Using the factorization property of this framework, we characterize the dynamical aspects of quantum chaos by obtaining the Ehrenfest time. We also outline a generalization of the quantum mixing level of the kicked rotator in the context of impulsive differential equations. Full article
(This article belongs to the Special Issue Foundations of Quantum Mechanics)

Open AccessLetter Discovery of Kolmogorov Scaling in the Natural Language
Entropy 2017, 19(5), 198; doi:10.3390/e19050198
Received: 16 February 2017 / Revised: 25 April 2017 / Accepted: 26 April 2017 / Published: 2 May 2017
PDF Full-text (4928 KB) | HTML Full-text | XML Full-text
Abstract
We consider the rate R and variance σ² of Shannon information in snippets of text based on word frequencies in the natural language. We empirically identify Kolmogorov’s scaling law σ² ∝ k^(−1.66±0.12) (95% c.l.) as a function of k = 1/N measured by word count N. This result highlights a potential association of information flow in snippets, analogous to energy cascade in turbulent eddies in fluids at high Reynolds numbers. We propose R and σ² as robust utility functions for objective ranking of concordances in efficient search for maximal information seamlessly across different languages and as a starting point for artificial attention. Full article
(This article belongs to the Special Issue Information Theory in Machine Learning and Data Science)
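Extracting a scaling exponent such as the reported -1.66 reduces to a least-squares fit in log-log coordinates. A sketch on synthetic data (the data below are fabricated purely to demonstrate the fit; only the exponent value is taken from the abstract):

```python
import math

def fit_scaling_exponent(ks, variances):
    """Least-squares slope of log(variance) against log(k)."""
    xs = [math.log(k) for k in ks]
    ys = [math.log(v) for v in variances]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

# Synthetic check: variances generated with exponent -1.66 are recovered.
ks = [1 / n for n in (8, 16, 32, 64, 128)]
variances = [k ** -1.66 for k in ks]
print(round(fit_scaling_exponent(ks, variances), 2))  # -1.66
```

In the paper's setting, each variance would instead be estimated empirically from the Shannon information of many snippets of word count N = 1/k.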

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
E-Mail: 
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18
Editorial Board