Table of Contents

Entropy, Volume 15, Issue 6 (June 2013), Pages 1963-2463

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.

Research

Jump to: Review

Open Access Article Metric Structure of the Space of Two-Qubit Gates, Perfect Entanglers and Quantum Control
Entropy 2013, 15(6), 1963-1984; doi:10.3390/e15061963
Received: 7 April 2013 / Revised: 17 April 2013 / Accepted: 17 May 2013 / Published: 23 May 2013
Cited by 2 | PDF Full-text (1013 KB) | HTML Full-text | XML Full-text
Abstract
We derive expressions for the invariant length element and measure for the simple compact Lie group SU(4) in a coordinate system particularly suitable for treating entanglement in quantum information processing. Using this metric, we compute the invariant volume of the space of two-qubit perfect entanglers. We find that this volume corresponds to more than 84% of the total invariant volume of the space of two-qubit gates. This same metric is also used to determine the effective target sizes that selected gates will present in any quantum-control procedure designed to implement them. Full article
(This article belongs to the Special Issue Quantum Information 2012)
Open Access Article Inequality of Chances as a Symmetry Phase Transition
Entropy 2013, 15(6), 1985-1998; doi:10.3390/e15061985
Received: 7 February 2013 / Revised: 27 April 2013 / Accepted: 15 May 2013 / Published: 23 May 2013
PDF Full-text (360 KB) | HTML Full-text | XML Full-text
Abstract
We propose a model for Lorenz curves. It provides two-parameter fits to data on incomes, electric consumption, life expectation and rate of survival after cancer. Graphs result from the condition of maximum entropy and from the symmetry of statistical distributions. Differences in populations composing a binary system (poor and rich, young and old, etc.) bring about chance inequality. Symmetrical distributions ensure equality of chances, generate Gini coefficients Gi ≤ ⅓, and imply that nobody gets more than twice the per capita benefit. Graphs generated by different symmetric distributions, but having the same Gini coefficient, intersect an even number of times. The change toward asymmetric distributions follows the pattern set by second-order phase transitions in physics, in particular universality: Lorenz plots reduce to a single universal curve after normalisation and scaling. The order parameter is the difference between cumulated benefit fractions for equal and unequal chances. The model also introduces new parameters: a cohesion range describing the extent of apparent equality in an unequal society, a poor-rich asymmetry parameter, and a new Gini-like indicator that measures unequal-chance inequality and admits a theoretical expression in closed form. Full article
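The Gini coefficient discussed in this abstract can be computed directly from a sample of benefits. The sketch below uses the standard definition via the mean absolute difference (a generic textbook formula, not the authors' two-parameter model; the function name is hypothetical):

```python
def gini(values):
    """Gini coefficient of a sample, via the mean absolute difference.

    Returns 0 for perfect equality and approaches 1 for maximal
    inequality. Uses the equivalent O(n log n) sorted-sample form:
    G = 2 * sum(i * x_i) / (n * sum(x)) - (n + 1) / n, with i = 1..n.
    """
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0  # degenerate case: everyone has zero benefit
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return 2.0 * weighted / (n * total) - (n + 1.0) / n

print(gini([1, 1, 1, 1]))  # perfect equality -> 0.0
```

A symmetric distribution in the sense of the abstract would keep this value at or below ⅓.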
Open Access Article Bias Adjustment for a Nonparametric Entropy Estimator
Entropy 2013, 15(6), 1999-2011; doi:10.3390/e15061999
Received: 20 March 2013 / Revised: 8 May 2013 / Accepted: 17 May 2013 / Published: 23 May 2013
Cited by 2 | PDF Full-text (409 KB) | HTML Full-text | XML Full-text
Abstract
Zhang in 2012 introduced a nonparametric estimator of Shannon’s entropy, whose bias decays exponentially fast when the alphabet is finite. We propose a methodology to estimate the bias of this estimator. We then use it to construct a new estimator of entropy. Simulation results suggest that this bias adjusted estimator has a significantly lower bias than many other commonly used estimators. We consider both the case when the alphabet is finite and when it is countably infinite. Full article
(This article belongs to the Special Issue Estimating Information-Theoretic Quantities from Data)
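For background on why bias adjustment matters here: the naive plug-in entropy estimator is biased downward, and even simple corrections help. The sketch below shows the plug-in estimator together with the classical Miller-Madow correction (a standard illustration only; it is neither Zhang's estimator nor the bias-adjusted estimator of this article):

```python
import math
from collections import Counter

def entropy_plugin(sample):
    """Naive plug-in (maximum-likelihood) Shannon entropy estimate, in nats.

    Biased downward by roughly (K - 1) / (2n) for K observed symbols
    and sample size n.
    """
    n = len(sample)
    counts = Counter(sample)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def entropy_miller_madow(sample):
    """Plug-in estimate plus the classical Miller-Madow bias correction."""
    n = len(sample)
    k = len(set(sample))
    return entropy_plugin(sample) + (k - 1) / (2 * n)

print(entropy_plugin("aabb"))  # -> ln 2, about 0.693 nats
```

For a uniform two-symbol sample of size 4, the correction adds (2 - 1) / (2 * 4) = 0.125 nats to the plug-in value.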
Open Access Article The Dynamics of Concepts in a Homogeneous Community
Entropy 2013, 15(6), 2012-2022; doi:10.3390/e15062012
Received: 1 April 2013 / Revised: 13 May 2013 / Accepted: 21 May 2013 / Published: 23 May 2013
Cited by 1 | PDF Full-text (289 KB) | HTML Full-text | XML Full-text
Abstract
The paper addresses informational interactions in a community and considers the dynamics of concepts that represent distribution of knowledge among the individuals. The evolution of a set of concepts maintained by a community is derived by the use of the concepts’ significance in the communication between “cognoscenti” and “dilettanti” and of birth-death processes. The dynamics of concepts depend on the allocation of communication resources and can be governed by an informational principle that requires minimum self-information of the set of concepts over a time horizon. With respect to that principle, the introduction of a new concept into a community’s disposal is shown to lead to a steady-state self-information, which is smaller than that before the introduction of the new concept. Full article
Open Access Article Reliability of Inference of Directed Climate Networks Using Conditional Mutual Information
Entropy 2013, 15(6), 2023-2045; doi:10.3390/e15062023
Received: 30 January 2013 / Revised: 11 May 2013 / Accepted: 14 May 2013 / Published: 24 May 2013
Cited by 23 | PDF Full-text (746 KB) | HTML Full-text | XML Full-text
Abstract
Across geosciences, many investigated phenomena relate to specific complex systems consisting of intricately intertwined interacting subsystems. Such dynamical complex systems can be represented by a directed graph, where each link denotes an existence of a causal relation, or information exchange between the nodes. For geophysical systems such as global climate, these relations are commonly not theoretically known but estimated from recorded data using causality analysis methods. These include bivariate nonlinear methods based on information theory and their linear counterpart. The trade-off between the valuable sensitivity of nonlinear methods to more general interactions and the potentially higher numerical reliability of linear methods may affect inference regarding structure and variability of climate networks. We investigate the reliability of directed climate networks detected by selected methods and parameter settings, using a stationarized model of dimensionality-reduced surface air temperature data from reanalysis of 60-year global climate records. Overall, all studied bivariate causality methods provided reproducible estimates of climate causality networks, with the linear approximation showing higher reliability than the investigated nonlinear methods. On the example dataset, optimizing the investigated nonlinear methods with respect to reliability increased the similarity of the detected networks to their linear counterparts, supporting the particular hypothesis of the near-linearity of the surface air temperature reanalysis data. Full article
(This article belongs to the Special Issue Applications of Information Theory in the Geosciences)
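As background to the causality methods discussed above, conditional mutual information for discrete data can be estimated with a naive plug-in formula. This is a generic sketch, not the article's estimator; real applications need exactly the reliability analysis the article describes:

```python
import math
from collections import Counter

def conditional_mutual_information(triples):
    """Plug-in estimate of I(X; Y | Z) in nats from (x, y, z) samples.

    I(X;Y|Z) = sum over (x,y,z) of p(x,y,z) * log(p(z) p(x,y,z)
    / (p(x,z) p(y,z))). The sample-size factors cancel, so raw counts
    can be used inside the logarithm.
    """
    n = len(triples)
    cxyz = Counter(triples)
    cxz = Counter((x, z) for x, _, z in triples)
    cyz = Counter((y, z) for _, y, z in triples)
    cz = Counter(z for _, _, z in triples)
    return sum(
        (c / n) * math.log(cz[z] * c / (cxz[(x, z)] * cyz[(y, z)]))
        for (x, y, z), c in cxyz.items()
    )
```

When X and Y are independent given Z the estimate is zero on balanced samples; nonzero values indicate conditional dependence, which directed-network inference then interprets as a candidate link.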
Open Access Article Generalized Least Energy of Separation for Desalination and Other Chemical Separation Processes
Entropy 2013, 15(6), 2046-2080; doi:10.3390/e15062046
Received: 22 April 2013 / Revised: 14 May 2013 / Accepted: 22 May 2013 / Published: 27 May 2013
Cited by 11 | PDF Full-text (705 KB) | HTML Full-text | XML Full-text
Abstract
Increasing global demand for fresh water is driving the development and implementation of a wide variety of seawater desalination technologies driven by different combinations of heat, work, and chemical energy. This paper develops a consistent basis for comparing the energy consumption of such technologies using Second Law efficiency. The Second Law efficiency for a chemical separation process is defined in terms of the useful exergy output, which is the minimum least work of separation required to extract a unit of product from a feed stream of a given composition. For a desalination process, this is the minimum least work of separation for producing one kilogram of product water from feed of a given salinity. While definitions in terms of work and heat input have been proposed before, this work generalizes the Second Law efficiency to allow for systems that operate on a combination of energy inputs, including fuel. The generalized equation is then evaluated through a parametric study considering work input, heat inputs at various temperatures, and various chemical fuel inputs. Further, since most modern, large-scale desalination plants operate in cogeneration schemes, a methodology for correctly evaluating Second Law efficiency for the desalination plant based on primary energy inputs is demonstrated. It is shown that, from a strictly energetic point of view and based on currently available technology, cogeneration using electricity to power a reverse osmosis system is energetically superior to thermal systems such as multiple effect distillation and multistage flash distillation, despite the very low grade heat input normally applied in those systems. Full article
Open Access Article Analysis of Entropy Generation Rate in an Unsteady Porous Channel Flow with Navier Slip and Convective Cooling
Entropy 2013, 15(6), 2081-2099; doi:10.3390/e15062081
Received: 3 April 2013 / Revised: 6 May 2013 / Accepted: 21 May 2013 / Published: 28 May 2013
Cited by 9 | PDF Full-text (698 KB) | HTML Full-text | XML Full-text
Abstract
This study deals with the combined effects of Navier slip, convective cooling, variable viscosity and suction/injection on the entropy generation rate in an unsteady flow of an incompressible viscous fluid through a channel with permeable walls. The model equations for momentum and energy balance are solved numerically using semi-discretization finite difference techniques. Both the velocity and temperature profiles are obtained and utilized to compute the entropy generation number. The effects of key parameters on the fluid velocity, temperature, entropy generation rate and Bejan number are depicted graphically and analyzed in detail. Full article
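The Bejan number mentioned in this abstract has a standard definition as the fraction of entropy generation due to heat transfer. A minimal numerical sketch (the standard definition, not code from the paper; the function name is hypothetical):

```python
def bejan_number(s_heat, s_friction):
    """Bejan number Be = S_heat / (S_heat + S_friction).

    Be -> 1 when heat-transfer irreversibility dominates entropy
    generation; Be -> 0 when fluid-friction irreversibility dominates.
    """
    total = s_heat + s_friction
    if total <= 0:
        raise ValueError("total entropy generation must be positive")
    return s_heat / total

print(bejan_number(3.0, 1.0))  # -> 0.75
```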
Open Access Article Zero Delay Joint Source Channel Coding for Multivariate Gaussian Sources over Orthogonal Gaussian Channels
Entropy 2013, 15(6), 2129-2161; doi:10.3390/e15062129
Received: 26 April 2013 / Revised: 18 May 2013 / Accepted: 21 May 2013 / Published: 31 May 2013
Cited by 2 | PDF Full-text (1615 KB) | HTML Full-text | XML Full-text
Abstract
Communication of a multivariate Gaussian source transmitted over orthogonal additive white Gaussian noise channels using delay-free joint source channel codes (JSCC) is studied in this paper. Two scenarios are considered: (1) all components of the multivariate Gaussian are transmitted by one encoder as a vector or several ideally collaborating nodes in a network; (2) the multivariate Gaussian is transmitted through distributed nodes in a sensor network. In both scenarios, the goal is to recover all components of the multivariate Gaussian at the receiver. The paper investigates a subset of JSCC consisting of direct source-to-channel mappings that operate on a symbol-by-symbol basis to ensure zero coding delay. A theoretical analysis that helps explain and quantify distortion behavior for such JSCC is given. Relevant performance bounds for the network are also derived with no constraints on complexity and delay. Optimal linear schemes for both scenarios are presented. Results for Scenario 1 show that linear mappings perform well, except when correlation is high. In Scenario 2, linear mappings provide no gain from correlation when the channel signal-to-noise ratio (SNR) gets large. The gap to the performance upper bound is large for both scenarios, regardless of SNR, when the correlation is high. The main contribution of this paper is the investigation of nonlinear mappings for both scenarios. It is shown that nonlinear mappings can provide substantial gain compared to optimal linear schemes when correlation is high. Contrary to linear mappings for Scenario 2, carefully chosen nonlinear mappings provide a gain for all SNR, as long as the correlation is close to one. Both linear and nonlinear mappings are robust against variations in SNR. Full article
Open Access Article Thermoelectric System in Different Thermal and Electrical Configurations: Its Impact in the Figure of Merit
Entropy 2013, 15(6), 2162-2180; doi:10.3390/e15062162
Received: 1 April 2013 / Revised: 25 May 2013 / Accepted: 28 May 2013 / Published: 31 May 2013
Cited by 5 | PDF Full-text (1156 KB) | HTML Full-text | XML Full-text
Abstract
In this work, we analyze different configurations of a thermoelectric system (TES) composed of three thermoelectric generators (TEGs). We consider: (a) a TES thermally and electrically connected in series (SC); (b) a TES thermally and electrically connected in parallel (PSC); and (c) a parallel thermal and series electrical connection (SSC). We assume that the parameters of the TEGs are temperature-independent. As recent investigations have shown, the systems are characterized by three parameters, namely, the internal electrical resistance, R, the thermal conductance under open electrical circuit conditions, K, and the Seebeck coefficient, α. We derive the equivalent parameters for each of the configurations considered here and calculate the figure of merit Z for the equivalent system. We show the impact of the system configuration on Z and suggest the optimum configuration. To justify the effectiveness of the equivalent figure of merit, the corresponding efficiency is calculated for each configuration. Full article
(This article belongs to the Special Issue Entropy and Energy Extraction)
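To make the equivalent-parameter idea concrete, here is a hedged sketch for the full-series case (thermally and electrically in series), using standard circuit analogies rather than the paper's own derivation: resistances add, conductances combine reciprocally, and the equivalent Seebeck coefficient follows from the open-circuit voltage.

```python
def figure_of_merit(alpha, R, K):
    """Thermoelectric figure of merit Z = alpha**2 / (R * K), in 1/K."""
    return alpha ** 2 / (R * K)

def series_equivalent(modules):
    """Equivalent (alpha, R, K) of modules in thermal AND electrical series.

    Assumed combination rules (textbook circuit analogies, not taken
    from the paper): the same heat flow Q crosses every module, so
    dT_i = Q / K_i and K_eq = 1 / sum(1 / K_i); electrical resistances
    add; open-circuit voltages add, V = sum(alpha_i * dT_i), giving
    alpha_eq = K_eq * sum(alpha_i / K_i).
    """
    R_eq = sum(r for _, r, _ in modules)
    K_eq = 1.0 / sum(1.0 / k for _, _, k in modules)
    alpha_eq = K_eq * sum(a / k for a, _, k in modules)
    return alpha_eq, R_eq, K_eq

# Three identical modules, e.g. alpha = 200 uV/K, R = 1.5 ohm, K = 0.8 W/K:
mods = [(200e-6, 1.5, 0.8)] * 3
alpha_eq, R_eq, K_eq = series_equivalent(mods)
```

Under these assumed rules, identical modules in full series leave Z unchanged, which is one reason the choice of configuration matters mainly when the modules differ.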
Open Access Article An Automatic Multilevel Image Thresholding Using Relative Entropy and Meta-Heuristic Algorithms
Entropy 2013, 15(6), 2181-2209; doi:10.3390/e15062181
Received: 1 March 2013 / Revised: 3 May 2013 / Accepted: 23 May 2013 / Published: 3 June 2013
Cited by 1 | PDF Full-text (1974 KB) | HTML Full-text | XML Full-text
Abstract
Multilevel thresholding has long been considered one of the most popular techniques for image segmentation. Multilevel thresholding outputs a gray-scale image in which more details from the original picture can be kept, while binary thresholding can only analyze the image in two colors, usually black and white. However, the multilevel thresholding technique has two major problems: finding appropriate threshold values can take an exceptionally long computation time, and defining a proper number of thresholds or levels that will keep most of the relevant details from the original image is a difficult task. In this study, a new evaluation function based on the Kullback-Leibler information distance, also known as relative entropy, is proposed. This new function can help determine the number of thresholds automatically. To offset the expensive computational effort of traditional exhaustive search methods, this study establishes a procedure that combines the relative entropy with meta-heuristics. The experiments performed in this study show that the proposed procedure not only provides good segmentation results when compared with a well-known technique such as Otsu's method, but also constitutes a very efficient approach. Full article
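The relative entropy underlying the evaluation function above can be computed with a few lines of code. This is the generic Kullback-Leibler divergence between two discrete distributions (for instance, an image histogram and its thresholded approximation), not the authors' exact criterion:

```python
import math

def relative_entropy(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) in nats between two discrete
    distributions given as equal-length probability sequences.

    Bins where p is zero contribute nothing; q is floored at eps to
    avoid division by zero (a common practical convention).
    """
    return sum(pi * math.log(pi / max(qi, eps))
               for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
print(relative_entropy(p, p))  # -> 0.0; grows as q drifts away from p
```

In a thresholding context, a small divergence means the quantized histogram still represents the original image well, which is what lets the criterion pick the number of thresholds automatically.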
Open Access Article Entropy Harvesting
Entropy 2013, 15(6), 2210-2217; doi:10.3390/e15062210
Received: 16 February 2013 / Revised: 3 May 2013 / Accepted: 24 May 2013 / Published: 4 June 2013
PDF Full-text (253 KB) | HTML Full-text | XML Full-text
Abstract
The paper introduces the notion of “entropy harvesting” in physical and biological systems. Various physical and natural systems demonstrate the ability to decrease entropy under external stimuli. These systems, including stretched synthetic polymers, muscles, osmotic membranes and suspensions containing small hydrophobic particles, are called “entropic harvesters”. Entropic force acting in these systems increases with temperature. Harvested entropy may be released as mechanical work. The efficiency of entropy harvesting increases when the temperature is decreased. Natural and artificial energy harvesters are presented. Gravity as an entropic effect is discussed. Full article
Open Access Article Vessel Pattern Knowledge Discovery from AIS Data: A Framework for Anomaly Detection and Route Prediction
Entropy 2013, 15(6), 2218-2245; doi:10.3390/e15062218
Received: 1 March 2013 / Revised: 10 May 2013 / Accepted: 29 May 2013 / Published: 4 June 2013
Cited by 27 | PDF Full-text (3845 KB) | HTML Full-text | XML Full-text
Abstract
Understanding maritime traffic patterns is key to Maritime Situational Awareness applications, in particular, to classify and predict activities. Facilitated by the recent build-up of terrestrial networks and satellite constellations of Automatic Identification System (AIS) receivers, ship movement information is becoming increasingly available, both in coastal areas and open waters. The resulting amount of information is increasingly overwhelming to human operators, requiring the aid of automatic processing to synthesize the behaviors of interest in a clear and effective way. Although AIS data are only legally required for larger vessels, their use is growing, and they can be effectively used to infer different levels of contextual information, from the characterization of ports and off-shore platforms to spatial and temporal distributions of routes. An unsupervised and incremental learning approach to the extraction of maritime movement patterns is presented here to convert raw data into information supporting decisions. This is a basis for automatically detecting anomalies and projecting current trajectories and patterns into the future. The proposed methodology, called TREAD (Traffic Route Extraction and Anomaly Detection), was developed for different levels of intermittency (i.e., sensor coverage and performance), persistence (i.e., time lag between subsequent observations) and data sources (i.e., ground-based and space-based receivers). Full article
Open Access Article Bootstrap Methods for the Empirical Study of Decision-Making and Information Flows in Social Systems
Entropy 2013, 15(6), 2246-2276; doi:10.3390/e15062246
Received: 15 March 2013 / Revised: 21 May 2013 / Accepted: 30 May 2013 / Published: 5 June 2013
Cited by 4 | PDF Full-text (461 KB) | HTML Full-text | XML Full-text
Abstract
We characterize the statistical bootstrap for the estimation of information-theoretic quantities from data, with particular reference to its use in the study of large-scale social phenomena. Our methods allow one to preserve, approximately, the underlying axiomatic relationships of information theory—in particular, consistency under arbitrary coarse-graining—that motivate use of these quantities in the first place, while providing reliability comparable to the state of the art for Bayesian estimators. We show how information-theoretic quantities allow for rigorous empirical study of the decision-making capacities of rational agents, and the time-asymmetric flows of information in distributed systems. We provide illustrative examples by reference to ongoing collaborative work on the semantic structure of the British Criminal Court system and the conflict dynamics of the contemporary Afghanistan insurgency. Full article
(This article belongs to the Special Issue Estimating Information-Theoretic Quantities from Data)
Open Access Article Entropy-Based Fast Largest Coding Unit Partition Algorithm in High-Efficiency Video Coding
Entropy 2013, 15(6), 2277-2287; doi:10.3390/e15062277
Received: 3 April 2013 / Revised: 22 April 2013 / Accepted: 30 May 2013 / Published: 6 June 2013
Cited by 5 | PDF Full-text (602 KB) | HTML Full-text | XML Full-text
Abstract
High-efficiency video coding (HEVC) is a new video coding standard being developed by the Joint Collaborative Team on Video Coding. HEVC adopted numerous new tools, such as more flexible data structure representations, which include the coding unit (CU), prediction unit, and transform unit. In the partitioning of the largest coding unit (LCU) into CUs, rate distortion optimization (RDO) is applied. However, the computation complexity of RDO is too high for real-time application scenarios. Based on studies on the relationship between CUs and their entropy, this paper proposes a fast algorithm based on entropy to partition LCU as a substitute for RDO in HEVC. Experimental results show that the proposed entropy-based LCU partition algorithm can reduce coding time by 62.3% on average, with an acceptable loss of 3.82% using Bjøntegaard delta rate. Full article
Open Access Article Multi-Granulation Entropy and Its Applications
Entropy 2013, 15(6), 2288-2302; doi:10.3390/e15062288
Received: 2 April 2013 / Revised: 22 May 2013 / Accepted: 30 May 2013 / Published: 6 June 2013
Cited by 3 | PDF Full-text (248 KB) | HTML Full-text | XML Full-text
Abstract
From the viewpoint of granular computing, some general uncertainty measures have been proposed through single granulation by generalizing Shannon's entropy. However, in practical settings we often need to describe a target concept concurrently through multiple binary relations. In this paper, we extend the classical information entropy model to a multi-granulation entropy model (MGE) by using a series of general binary relations. Two types of MGE are discussed. Moreover, a number of theorems are obtained. It can be concluded that single-granulation entropy is a special instance of the MGE. We employ the proposed model to evaluate the significance of attributes for classification. A forward greedy search algorithm for feature selection is constructed. The experimental results show that the proposed method presents an effective solution for feature analysis. Full article
Open Access Article An Entropy-Based Weighted Concept Lattice for Merging Multi-Source Geo-Ontologies
Entropy 2013, 15(6), 2303-2318; doi:10.3390/e15062303
Received: 26 March 2013 / Revised: 14 May 2013 / Accepted: 1 June 2013 / Published: 7 June 2013
Cited by 8 | PDF Full-text (384 KB) | HTML Full-text | XML Full-text
Abstract
To deal with the complexities associated with the rapid growth in a merged concept lattice, a formal method based on an entropy-based weighted concept lattice (EWCL) is proposed as a mechanism for merging multi-source geographic ontologies (geo-ontologies). First, formal concept analysis (FCA) is used to formalize different term-based representations in relation to the geographic domain, and to construct a merged formal context. Second, a weighted concept lattice (WCL) is applied to reduce the merged concept lattice, based on information entropy and a deviance analysis. The entropy of the attribute set is exploited to acquire the intent weight value, and the standard deviation contributes to computing the intent importance deviance value, according to the user preferences and interests. Some nodes of the merged concept lattice are then removed if their intent weights are lower than the intent importance thresholds specified by the user. Finally, experiments were conducted by combining fundamental geographic information data and spatial data in the hydraulic engineering domain from China. The results indicate that the proposed method is feasible and valid for reducing the complexities associated with the merging of geo-ontologies. Although some problems remain in its application, the method offers a new approach for the merging of geo-ontologies. Full article
Open Access Article Characterization of Ecological Exergy Based on Benthic Macroinvertebrates in Lotic Ecosystems
Entropy 2013, 15(6), 2319-2339; doi:10.3390/e15062319
Received: 21 March 2013 / Revised: 30 May 2013 / Accepted: 1 June 2013 / Published: 7 June 2013
Cited by 4 | PDF Full-text (358 KB) | HTML Full-text | XML Full-text
Abstract
The evaluation of ecosystem health is a fundamental process for conducting effective ecosystem management. Ecological exergy is used primarily to summarize the complex dynamics of lotic ecosystems. In this study, we characterized the functional aspects of lotic ecosystems based on the exergy and specific exergy from headwaters to downstream regions in the river’s dimensions (i.e., river width and depth) and in parallel with the nutrient gradient. Data were extracted from the Ecologische Karakterisering van Oppervlaktewateren in Overijssel (EKOO) database, consisting of 249 lotic study sites (including springs, upper, middle and lower courses) and 690 species. Exergy values were calculated based on trophic groups (carnivores, detritivores, detriti-herbivores, herbivores and omnivores) of benthic macroinvertebrate communities. A Self-Organizing Map (SOM) was applied to characterize the different benthic macroinvertebrate communities in the lotic ecosystem, and the Random Forest model was used to predict the exergy and specific exergy based on environmental variables. The SOM classified the sampling sites into four clusters representing differences in the longitudinal distribution along the river, as well as along nutrient gradients. Exergy tended to increase with stream size, and specific exergy was lowest at sites with a high nutrient load. The Random Forest model results indicated that river width was the most important predictor of exergy followed by dissolved oxygen, ammonium and river depth. Orthophosphate was the most significant predictor for estimating specific exergy followed by nitrate and total phosphate. Exergy and specific exergy exhibited different responses to various environmental conditions. This result suggests that the combination of exergy and specific exergy, as complementary indicators, can be used reliably to evaluate the health condition of a lotic ecosystem. Full article
Open Access Article Quantum Contextuality with Stabilizer States
Entropy 2013, 15(6), 2340-2362; doi:10.3390/e15062340
Received: 1 April 2013 / Revised: 29 May 2013 / Accepted: 1 June 2013 / Published: 7 June 2013
Cited by 6 | PDF Full-text (468 KB) | HTML Full-text | XML Full-text
Abstract
The Pauli groups are ubiquitous in quantum information theory because of their usefulness in describing quantum states and operations and their readily understood symmetry properties. In addition, the most well-understood quantum error correcting codes—stabilizer codes—are built using Pauli operators. The eigenstates of these operators—stabilizer states—display a structure (e.g., mutual orthogonality relationships) that has made them useful in examples of multi-qubit non-locality and contextuality. Here, we apply the graph-theoretical contextuality formalism of Cabello, Severini and Winter to sets of stabilizer states, with particular attention to the effect of generalizing two-level qubit systems to odd prime d-level qudit systems. While state-independent contextuality using two-qubit states does not generalize to qudits, we show explicitly how state-dependent contextuality associated with a Bell inequality does generalize. Along the way we note various structural properties of stabilizer states, with respect to their orthogonality relationships, which may be of independent interest. Full article
(This article belongs to the Special Issue Quantum Information 2012)
Open AccessArticle Entropy Increase in Switching Systems
Entropy 2013, 15(6), 2363-2383; doi:10.3390/e15062363
Received: 10 May 2013 / Revised: 3 June 2013 / Accepted: 3 June 2013 / Published: 7 June 2013
Cited by 4 | PDF Full-text (156 KB) | HTML Full-text | XML Full-text
Abstract
The relation between the complexity of a time-switched dynamics and the complexity of its control sequence depends critically on the concept of a non-autonomous pullback attractor. For instance, the switched dynamics associated with scalar dissipative affine maps has a pullback attractor consisting [...] Read more.
The relation between the complexity of a time-switched dynamics and the complexity of its control sequence depends critically on the concept of a non-autonomous pullback attractor. For instance, the switched dynamics associated with scalar dissipative affine maps has a pullback attractor consisting of singleton component sets. This entails that the complexity of the control sequence and of the switched dynamics, as quantified by the topological entropy, coincide. In this paper we extend the previous framework to pullback attractors with nontrivial component sets in order to gain further insight into that relation. This calls, in particular, for distinguishing two distinct contributions to the complexity of the switched dynamics. One proceeds from trajectory segments connecting different component sets of the attractor; the other proceeds from trajectory segments within the component sets. We call them “macroscopic” and “microscopic” complexity, respectively, because only the first can be measured by our analytical tools. As a result of this picture, we obtain sufficient conditions for a switching system to be more complex than its unswitched subsystems, i.e., a complexity analogue of Parrondo’s paradox. Full article
(This article belongs to the Special Issue Dynamical Systems) Print Edition available
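The idea of quantifying a control sequence's complexity by its topological entropy can be illustrated with a simple block-counting estimator (a generic sketch, not the paper's pullback-attractor machinery): for a sequence over a finite alphabet, the entropy is the growth rate of the number of distinct length-n blocks.

```python
# Estimate the topological entropy of a switching/control sequence from
# the count of its distinct n-blocks: h ≈ log N(n) / n. Toy sketch only.
import math
import random

def block_entropy(seq, n):
    """log of the number of distinct length-n blocks, divided by n."""
    blocks = {tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)}
    return math.log(len(blocks)) / n

random.seed(0)
random_controls = [random.randint(0, 1) for _ in range(20000)]
periodic_controls = [0, 1] * 10000

# A long random binary sequence approaches log 2 ≈ 0.693,
# while a periodic switching law gives a small value (→ 0 as n grows).
print(block_entropy(random_controls, 8))
print(block_entropy(periodic_controls, 8))
```

In this toy picture, a switching system driven by the random sequence inherits at least this "macroscopic" complexity from its control law, whereas the periodic law contributes essentially none.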
Open AccessArticle Entropy of Shortest Distance (ESD) as Pore Detector and Pore-Shape Classifier
Entropy 2013, 15(6), 2384-2397; doi:10.3390/e15062384
Received: 25 March 2013 / Revised: 7 May 2013 / Accepted: 15 May 2013 / Published: 10 June 2013
Cited by 2 | PDF Full-text (2005 KB) | HTML Full-text | XML Full-text
Abstract
The entropy of shortest distance (ESD) between geographic elements (“elliptical intrusions”, “lineaments”, “points”) on a map, or between "vugs", "fractures" and "pores" in the macro- or microscopic images of triple porosity naturally fractured vuggy carbonates provides a powerful new tool for the [...] Read more.
The entropy of shortest distance (ESD) between geographic elements (“elliptical intrusions”, “lineaments”, “points”) on a map, or between “vugs”, “fractures” and “pores” in macro- or microscopic images of triple-porosity naturally fractured vuggy carbonates, provides a powerful new tool for the digital processing, analysis, classification and space/time distribution prognosis of mineral resources, as well as of the void space in carbonates and other rocks. The procedure is applicable at all scales, from outcrop photos and geophysical imaging techniques (FMI, UBI, USI) to micrographs, as we illustrate through several examples. Among the possible applications of the ESD concept, we discuss in detail sliding-window entropy filtering for nonlinear pore-boundary enhancement, and propose this procedure as an unbiased thresholding technique. Full article
(This article belongs to the Special Issue Applications of Information Theory in the Geosciences)
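The sliding-window entropy filtering mentioned in the abstract can be illustrated with a plain local Shannon-entropy filter (a generic sketch; the window size, quantization and data are illustrative assumptions, not the authors' exact ESD procedure):

```python
# Sliding-window (local) Shannon-entropy filter over a 2-D grayscale
# image, in the spirit of entropy-based pore-boundary enhancement.
import math

def local_entropy(img, win=3, levels=4):
    """Shannon entropy (bits) of quantized values in each win x win window."""
    h, w = len(img), len(img[0])
    r = win // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(r, h - r):
        for j in range(r, w - r):
            counts = {}
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    v = img[i + di][j + dj] * levels // 256  # quantize
                    counts[v] = counts.get(v, 0) + 1
            n = win * win
            out[i][j] = -sum(c / n * math.log2(c / n)
                             for c in counts.values())
    return out

# A uniform region yields zero local entropy; a sharp pore boundary
# (mixed values in the window) yields high local entropy.
flat = [[10] * 8 for _ in range(8)]
edge = [[10] * 4 + [200] * 4 for _ in range(8)]
print(local_entropy(flat, 3)[4][4], local_entropy(edge, 3)[4][3])
```

Thresholding such an entropy map highlights boundaries between voids and matrix, which is the role the nonlinear filtering plays in the authors' workflow.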
Open AccessArticle On the Use of Information Theory to Quantify Parameter Uncertainty in Groundwater Modeling
Entropy 2013, 15(6), 2398-2414; doi:10.3390/e15062398
Received: 16 February 2013 / Revised: 3 June 2013 / Accepted: 5 June 2013 / Published: 13 June 2013
PDF Full-text (1002 KB) | HTML Full-text | XML Full-text
Abstract
We applied information theory to quantify parameter uncertainty in a groundwater flow model. A number of parameters in groundwater modeling are often used with lack of knowledge of site conditions due to heterogeneity of hydrogeologic properties and limited access to complex geologic [...] Read more.
We applied information theory to quantify parameter uncertainty in a groundwater flow model. A number of parameters in groundwater modeling are often used with a lack of knowledge of site conditions, due to the heterogeneity of hydrogeologic properties and limited access to complex geologic structures. The present Information Theory-based (ITb) approach adopts entropy as a measure of uncertainty at the most probable state of hydrogeologic conditions. The most probable conditions are those at which the groundwater model is optimized with respect to the uncertain parameters. An analytical solution to estimate parameter uncertainty is derived by maximizing the entropy subject to constraints imposed by observation data. MODFLOW-2000 is implemented to simulate the groundwater system and to optimize the unknown parameters. The ITb approach is demonstrated with a three-dimensional synthetic model application and a case study of the Kansas City Plant. Hydraulic heads are the observations and hydraulic conductivities are assumed to be the unknown parameters. The applications show that ITb is capable of identifying which inputs of a groundwater model are the most uncertain and what statistical information can be used for site exploration. Full article
(This article belongs to the Special Issue Applications of Information Theory in the Geosciences)
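The core maximum-entropy step, maximizing entropy subject to constraints from observations, can be illustrated in a stripped-down discrete setting (a generic sketch with hypothetical parameter values; the MODFLOW coupling is omitted entirely). With a single mean constraint the optimal distribution takes the Boltzmann form p_i ∝ exp(−λx_i), and λ is fixed by the constraint:

```python
# Maximum entropy over discrete parameter values subject to a mean
# constraint. The Lagrange multiplier lam is found by bisection, since
# the constrained mean is monotonically decreasing in lam.
import math

def maxent_given_mean(values, target_mean, lo=-50.0, hi=50.0, tol=1e-10):
    def mean_at(lam):
        w = [math.exp(-lam * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_at(mid) > target_mean:
            lo = mid  # mean too high -> increase lam
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * x) for x in values]
    z = sum(w)
    p = [wi / z for wi in w]
    entropy = -sum(pi * math.log(pi) for pi in p)
    return p, entropy

# Hypothetical log-conductivity levels with an "observed" mean of 2.0
p, h = maxent_given_mean([1.0, 2.0, 3.0, 4.0], 2.0)
print([round(pi, 3) for pi in p], round(h, 3))
```

The resulting entropy is the uncertainty measure at the most probable state; in the paper's setting the constraints come from hydraulic head observations rather than a single prescribed mean.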
Open AccessArticle A Method for Choosing an Initial Time Eigenstate in Classical and Quantum Systems
Entropy 2013, 15(6), 2415-2430; doi:10.3390/e15062415
Received: 23 April 2013 / Revised: 29 May 2013 / Accepted: 3 June 2013 / Published: 17 June 2013
Cited by 2 | PDF Full-text (1229 KB) | HTML Full-text | XML Full-text
Abstract
A subject of interest in classical and quantum mechanics is the development of the appropriate treatment of the time variable. In this paper we introduce a method of choosing the initial time eigensurface and how this method can be used to generate [...] Read more.
A subject of interest in classical and quantum mechanics is the development of the appropriate treatment of the time variable. In this paper we introduce a method of choosing the initial time eigensurface and how this method can be used to generate time-energy coordinates and, consequently, time-energy representations for classical and quantum systems. Full article
(This article belongs to the Special Issue Dynamical Systems) Print Edition available
Open AccessArticle Effect of Prey Refuge on the Spatiotemporal Dynamics of a Modified Leslie-Gower Predator-Prey System with Holling Type III Schemes
Entropy 2013, 15(6), 2431-2447; doi:10.3390/e15062431
Received: 27 March 2013 / Revised: 5 June 2013 / Accepted: 6 June 2013 / Published: 19 June 2013
Cited by 3 | PDF Full-text (1087 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, the spatiotemporal dynamics of a diffusive Leslie-Gower predator-prey model with prey refuge are investigated analytically and numerically. Mathematical theoretical works have considered the existence of global solutions, population permanence and the stability of equilibrium points, which depict the threshold [...] Read more.
In this paper, the spatiotemporal dynamics of a diffusive Leslie-Gower predator-prey model with prey refuge are investigated analytically and numerically. Mathematical theoretical works have considered the existence of global solutions, population permanence and the stability of equilibrium points, which depict the threshold expressions of some critical parameters. Numerical simulations are performed to explore the pattern formation of species. These results show that the prey refuge has a profound effect on predator-prey interactions and they have the potential to be useful for the study of the entropy theory of bioinformatics. Full article
(This article belongs to the Special Issue Dynamical Systems) Print Edition available
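A Holling type III functional response with a constant-proportion prey refuge, one common way such models incorporate a refuge and assumed here purely for illustration, can be written as follows:

```python
# Holling type III predation rate with a constant-proportion prey
# refuge: a fraction m of the prey is protected, so only (1 - m) * x
# is exposed to predation. Parameter values are illustrative.
def holling_iii(x, a=1.0, h=1.0, m=0.0):
    """Predation rate at prey density x; m in [0, 1) is the refuge fraction."""
    u = (1.0 - m) * x                     # prey available to predators
    return a * u ** 2 / (1.0 + a * h * u ** 2)

# Increasing the refuge fraction lowers the predation rate at every
# prey density, which is the mechanism behind the abstract's finding
# that refuge strongly shapes predator-prey interactions.
print(holling_iii(2.0, m=0.0), holling_iii(2.0, m=0.5))
```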
Open AccessArticle A Maximum Entropy Approach to the Realizability of Spin Correlation Matrices
Entropy 2013, 15(6), 2448-2463; doi:10.3390/e15062448
Received: 26 February 2013 / Revised: 10 June 2013 / Accepted: 15 June 2013 / Published: 21 June 2013
PDF Full-text (287 KB) | HTML Full-text | XML Full-text
Abstract
Deriving the form of the optimal solution of a maximum entropy problem, we obtain an infinite family of linear inequalities characterizing the polytope of spin correlation matrices. For n ≤ 6, the facet description of such a polytope is provided through a [...] Read more.
Deriving the form of the optimal solution of a maximum entropy problem, we obtain an infinite family of linear inequalities characterizing the polytope of spin correlation matrices. For n ≤ 6, the facet description of such a polytope is provided through a minimal system of Bell-type inequalities. Full article

Review

Jump to: Research

Open AccessReview Quantum Thermodynamics: A Dynamical Viewpoint
Entropy 2013, 15(6), 2100-2128; doi:10.3390/e15062100
Received: 26 March 2013 / Revised: 21 May 2013 / Accepted: 23 May 2013 / Published: 29 May 2013
Cited by 74 | PDF Full-text (755 KB) | HTML Full-text | XML Full-text
Abstract
Quantum thermodynamics addresses the emergence of thermodynamic laws from quantum mechanics. The viewpoint advocated is based on the intimate connection of quantum thermodynamics with the theory of open quantum systems. Quantum mechanics inserts dynamics into thermodynamics, giving a sound foundation to finite-time-thermodynamics. [...] Read more.
Quantum thermodynamics addresses the emergence of thermodynamic laws from quantum mechanics. The viewpoint advocated is based on the intimate connection of quantum thermodynamics with the theory of open quantum systems. Quantum mechanics inserts dynamics into thermodynamics, giving a sound foundation to finite-time thermodynamics. The emergence of the zeroth, first, second and third laws of thermodynamics from quantum considerations is presented. The emphasis is on consistency between the two theories, which address the same subject from different foundations. We claim that inconsistency is the result of faulty analysis, pointing to flaws in approximations. Full article
(This article belongs to the Special Issue Quantum Information 2012)

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
entropy@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18