Entropy
http://mdpi.com/journal/entropy
Latest open access articles published in Entropy at http://mdpi.com/journal/entropy

Entropy, Vol. 17, Pages 3518-3551: Entropy vs. Energy Waveform Processing: A Comparison Based on the Heat Equation
http://mdpi.com/1099-4300/17/6/3518
Virtually all modern imaging devices collect electromagnetic or acoustic waves and use the energy carried by these waves to determine pixel values to create what is basically an “energy” picture. However, waves also carry “information”, as quantified by some form of entropy, and this may also be used to produce an “information” image. Numerous published studies have demonstrated the advantages of entropy, or “information imaging”, over conventional methods. The most sensitive information measure appears to be the joint entropy of the collected wave and a reference signal. The sensitivity of repeated experimental observations of a slowly-changing quantity may be defined as the mean variation (i.e., observed change) divided by mean variance (i.e., noise). Wiener integration permits computation of the required mean values and variances as solutions to the heat equation, permitting estimation of their relative magnitudes. There always exists a reference such that joint entropy has larger variation and smaller variance than the corresponding quantities for signal energy, matching the observations of several studies. Moreover, a general prescription for finding an “optimal” reference for the joint entropy emerges, which also has been validated in several studies.

Entropy 2015, 17(6), 3518–3551; Article; doi: 10.3390/e17063518; published 2015-05-25; ISSN 1099-4300. Authors: Michael Hughes, John McCarthy, Paul Bruillard, Jon Marsh, Samuel Wickline.

Entropy, Vol. 17, Pages 3501-3517: Information Decomposition and Synergy
http://mdpi.com/1099-4300/17/5/3501
Recently, a series of papers addressed the problem of decomposing the information of two random variables into shared information, unique information and synergistic information. Several measures were proposed, although no consensus has yet been reached. Here, we compare these proposals with an older approach to defining synergistic information, based on the projections on exponential families containing only up to k-th order interactions. We show that these measures are not compatible with a decomposition into unique, shared and synergistic information if one requires that all terms are always non-negative (local positivity). We illustrate the difference between the two measures for multivariate Gaussians.

Entropy 2015, 17(5), 3501–3517; Article; doi: 10.3390/e17053501; published 2015-05-22. Authors: Eckehard Olbrich, Nils Bertschinger, Johannes Rauh.

Entropy, Vol. 17, Pages 3479-3500: Operational Reliability Assessment of Compressor Gearboxes with Normalized Lifting Wavelet Entropy from Condition Monitoring Information
http://mdpi.com/1099-4300/17/5/3479
Classical reliability assessment methods rely predominantly on probability and statistical theories, which are insufficient for assessing the operational reliability of individual mechanical equipment with time-varying characteristics. A new approach is proposed that assesses machinery operational reliability with a normalized lifting wavelet entropy computed from condition monitoring information, rather than from probability and statistical analysis. The machinery vibration signals with time-varying operational characteristics are first decomposed and reconstructed by means of a lifting wavelet packet transform. The relative energy of every reconstructed signal is computed as the percentage of its energy in the whole signal energy. A normalized lifting wavelet entropy is then defined from the relative energies to reveal the machinery operational uncertainty. Finally, the operational reliability degree is defined as the quantitative value, lying in the range [0, 1], obtained from the normalized lifting wavelet entropy. The proposed method is applied to the operational reliability assessment of the gearbox in an oxy-generator compressor to validate its effectiveness.

Entropy 2015, 17(5), 3479–3500; Article; doi: 10.3390/e17053479; published 2015-05-20. Authors: Xiaoli Zhang, Baojian Wang, Hongrui Cao, Bing Li, Xuefeng Chen.

Entropy, Vol. 17, Pages 3461-3478: Nonparametric Denoising Methods Based on Contourlet Transform with Sharp Frequency Localization: Application to Low Exposure Time Electron Microscopy Images
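The relative-energy entropy computation summarized in the abstract above can be sketched in a few lines. This is a minimal illustration, assuming natural-log entropy normalized by log N and a hypothetical mapping from low uncertainty to high reliability; the function names and the exact mapping are illustrative assumptions, not the authors' formulation:

```python
import math

def normalized_wavelet_entropy(band_energies):
    # Relative energy of each reconstructed sub-band signal: the share
    # of that band's energy in the whole signal energy.
    total = sum(band_energies)
    p = [e / total for e in band_energies if e > 0]
    entropy = -sum(pi * math.log(pi) for pi in p)
    # Dividing by log(N) maps the entropy into [0, 1].
    return entropy / math.log(len(band_energies))

def reliability_degree(band_energies):
    # Hypothetical mapping: a concentrated energy spread (low
    # uncertainty) gives a degree near 1; a uniform spread (maximum
    # uncertainty) gives a degree near 0.
    return 1.0 - normalized_wavelet_entropy(band_energies)

print(reliability_degree([1.0, 1.0, 1.0, 1.0]))    # uniform spread: near 0
print(reliability_degree([100.0, 0.1, 0.1, 0.1]))  # concentrated: near 1
```

In practice the band energies would come from a lifting wavelet packet decomposition of the vibration signal (e.g. with a wavelet library), which is outside this sketch.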
http://mdpi.com/1099-4300/17/5/3461
Image denoising is an important step for cryo-transmission electron microscopy (cryo-TEM) and energy-filtering TEM images before 3D tomographic reconstruction, since the high noise level in these images leads to a loss of the information they contain and, in particular, hampers the alignment required for 3D tomographic reconstruction. This paper investigates the denoising of TEM images acquired with a very low exposure time, with the primary objectives of enhancing the quality of these low-exposure-time TEM images and improving the alignment process. We propose denoising structures that combine multiple noisy copies of the TEM images. The structures are based on Bayesian estimation in transform domains rather than the spatial domain, building novel feature-preserving image denoising structures in the wavelet domain, the contourlet transform domain and the contourlet transform with sharp frequency localization. Numerical image denoising experiments demonstrate the performance of the Bayesian approach in the contourlet transform domain in terms of improving the signal-to-noise ratio (SNR) and recovering fine details that may be hidden in the data. The SNR and the visual quality of the denoised images are considerably enhanced by these denoising structures that combine multiple noisy copies. The proposed methods also enable a reduction of the exposure time.

Entropy 2015, 17(5), 3461–3478; Article; doi: 10.3390/e17053461; published 2015-05-20. Authors: Soumia Ahmed, Zoubeida Messali, Abdeldjalil Ouahabi, Sylvain Trepout, Cedric Messaoudi, Sergio Marco.

Entropy, Vol. 17, Pages 3458-3460: Maximum Entropy Applied to Inductive Logic and Reasoning
http://mdpi.com/1099-4300/17/5/3458
This editorial explains the scope of the special issue and provides a thematic introduction to the contributed papers.

Entropy 2015, 17(5), 3458–3460; Editorial; doi: 10.3390/e17053458; published 2015-05-18. Authors: Jürgen Landes, Jon Williamson.

Entropy, Vol. 17, Pages 3438-3457: Heat Transfer and Pressure Drop Characteristics in Straight Microchannel of Printed Circuit Heat Exchangers
http://mdpi.com/1099-4300/17/5/3438
Performance tests were carried out for a microchannel printed circuit heat exchanger (PCHE), which was fabricated with micro photo-etching and diffusion bonding technologies. The microchannel PCHE was tested for Reynolds numbers in the range of 100‒850, varying the hot-side inlet temperature between 40 °C and 50 °C while keeping the cold-side temperature fixed at 20 °C. It was found that the average heat transfer rate and heat transfer performance of the countercurrent configuration were 6.8% and 10%‒15% higher, respectively, than those of the parallel flow. The average heat transfer rate, heat transfer performance and pressure drop increased with increasing Reynolds number in all experiments. Increasing the inlet temperature did not affect the heat transfer performance, while it slightly decreased the pressure drop in the experimental range considered. Empirical correlations have been developed for the heat transfer coefficient and pressure drop factor as functions of the Reynolds number.

Entropy 2015, 17(5), 3438–3457; Article; doi: 10.3390/e17053438; published 2015-05-18. Authors: Jang-Won Seo, Yoon-Ho Kim, Dongseon Kim, Young-Don Choi, Kyu-Jung Lee.

Entropy, Vol. 17, Pages 3419-3437: Minimum Error Entropy Algorithms with Sparsity Penalty Constraints
http://mdpi.com/1099-4300/17/5/3419
Recently, sparse adaptive learning algorithms have been developed to exploit system sparsity as well as to mitigate various noise disturbances in many applications. In particular, in sparse channel estimation, the parameter vector with sparsity characteristics can be well estimated from noisy measurements through a sparse adaptive filter. Most previous works use a mean square error (MSE) based cost to develop sparse filters, which is rational under the assumption of Gaussian distributions. However, the Gaussian assumption does not always hold in real-world environments. To address this issue, we incorporate in this work an l1-norm or a reweighted l1-norm into the minimum error entropy (MEE) criterion to develop new sparse adaptive filters, which may perform much better than the MSE-based methods, especially in heavy-tailed non-Gaussian situations, since the error entropy can capture higher-order statistics of the errors. In addition, a new approximator of the l0-norm, based on the correntropy induced metric (CIM), is also used as a sparsity penalty term (SPT). We analyze the mean square convergence of the proposed new sparse adaptive filters. An energy conservation relation is derived and a sufficient condition is obtained that ensures the mean square convergence. Simulation results confirm the superior performance of the new algorithms.

Entropy 2015, 17(5), 3419–3437; Article; doi: 10.3390/e17053419; published 2015-05-18. Authors: Zongze Wu, Siyuan Peng, Wentao Ma, Badong Chen, Jose Principe.

Entropy, Vol. 17, Pages 3400-3418: Entropy Approximation in Lossy Source Coding Problem
http://mdpi.com/1099-4300/17/5/3400
In this paper, we investigate a lossy source coding problem, where an upper limit on the permitted distortion is defined for every dataset element. It can be seen as an alternative approach to rate distortion theory, where a bound on the allowed average error is specified instead. In order to find the entropy, which gives a statistical length of source code compatible with a fixed distortion bound, a corresponding optimization problem has to be solved. First, we show how to simplify this general optimization by reducing the number of coding partitions that are irrelevant for the entropy calculation. In our main result, we present a fast greedy algorithm, feasible to implement, which allows one to approximate the entropy within an additive error term of log2 e. The proof is based on the minimum entropy set cover problem, for which a similar bound was obtained.

Entropy 2015, 17(5), 3400–3418; Article; doi: 10.3390/e17053400; published 2015-05-18. Authors: Marek Śmieja, Jacek Tabor.

Entropy, Vol. 17, Pages 3376-3399: Non-Abelian Topological Approach to Non-Locality of a Hypergraph State
http://mdpi.com/1099-4300/17/5/3376
We present a theoretical study of new families of stochastic complex information modules encoded in the hypergraph states which are defined by the fractional entropic descriptor. The essential connection between the Lyapunov exponents and the d-regular hypergraph fractal set is elucidated. To further resolve the divergence in the complexity of classical and quantum representations of a hypergraph, we have investigated the notion of non-amenability and its relation to the combinatorics of dynamical self-organization for the case of a fractal system of a free group on finite generators. The exact relation between the notion of hypergraph non-locality and quantum encoding through system sets of specified non-Abelian fractal geometric structures is presented. The obtained results give important impetus towards the design of approximation algorithms for chip-imprinted circuits in scalable quantum information systems.

Entropy 2015, 17(5), 3376–3399; Article; doi: 10.3390/e17053376; published 2015-05-15. Author: Vesna Berec.

Entropy, Vol. 17, Pages 3352-3375: Nonlinear Stochastic Control and Information Theoretic Dualities: Connections, Interdependencies and Thermodynamic Interpretations
http://mdpi.com/1099-4300/17/5/3352
In this paper, we present connections between recent developments on the linearly-solvable stochastic optimal control framework with early work in control theory based on the fundamental dualities between free energy and relative entropy. We extend these connections to nonlinear stochastic systems with non-affine controls by using the generalized version of the Feynman–Kac lemma. We present alternative formulations of the linearly-solvable stochastic optimal control framework and discuss information theoretic and thermodynamic interpretations. On the algorithmic side, we present iterative stochastic optimal control algorithms and applications to nonlinear stochastic systems. We conclude with an overview of the frameworks presented and discuss limitations, differences and future directions.

Entropy 2015, 17(5), 3352–3375; Article; doi: 10.3390/e17053352; published 2015-05-15. Author: Evangelos Theodorou.

Entropy, Vol. 17, Pages 3332-3351: An Information-Theoretic Perspective on Coarse-Graining, Including the Transition from Micro to Macro
http://mdpi.com/1099-4300/17/5/3332
An information-theoretic perspective on coarse-graining is presented. It starts with an information characterization of configurations at the micro-level using a local information quantity that has a spatial average equal to a microscopic entropy. With a reversible micro dynamics, this entropy is conserved. In the micro-macro transition, it is shown how this local information quantity is transformed into a macroscopic entropy, as the local states are aggregated into macroscopic concentration variables. The information loss in this transition is identified, and the connection to the irreversibility of the macro dynamics and the second law of thermodynamics is discussed. This is then connected to a process of further coarse-graining towards higher characteristic length scales in the context of chemical reaction-diffusion dynamics capable of pattern formation. On these higher levels of coarse-graining, information flows across length scales and across space are defined. These flows obey a continuity equation for information, and they are connected to the thermodynamic constraints of the system, via an outflow of information from macroscopic to microscopic levels in the form of entropy production, as well as an inflow of information, from an external free energy source, if a spatial chemical pattern is to be maintained.

Entropy 2015, 17(5), 3332–3351; Article; doi: 10.3390/e17053332; published 2015-05-14. Author: Kristian Lindgren.

Entropy, Vol. 17, Pages 3319-3331: A Mean-Variance Hybrid-Entropy Model for Portfolio Selection with Fuzzy Returns
http://mdpi.com/1099-4300/17/5/3319
In this paper, we define the portfolio return as fuzzy average yield and risk as hybrid-entropy and variance to deal with the portfolio selection problem with both random uncertainty and fuzzy uncertainty, and propose a mean-variance hybrid-entropy model (MVHEM). A multi-objective genetic algorithm named Non-dominated Sorting Genetic Algorithm II (NSGA-II) is introduced to solve the model. We make empirical comparisons by using the data from the Shanghai and Shenzhen stock exchanges in China. The results show that the MVHEM generally performs better than the traditional portfolio selection models.

Entropy 2015, 17(5), 3319–3331; Article; doi: 10.3390/e17053319; published 2015-05-14. Authors: Rongxi Zhou, Yu Zhan, Ru Cai, Guanqun Tong.

Entropy, Vol. 17, Pages 3253-3318: The Homological Nature of Entropy
http://mdpi.com/1099-4300/17/5/3253
We propose that entropy is a universal co-homological class in a theory associated to a family of observable quantities and a family of probability distributions. Three cases are presented: (1) classical probabilities and random variables; (2) quantum probabilities and observable operators; (3) dynamic probabilities and observation trees. This gives rise to a new kind of topology for information processes that accounts for the main information functions: entropy, mutual information at all orders, and the Kullback–Leibler divergence, and generalizes them in several ways. The article is divided into two parts that can be read independently. In the first part, the introduction, we provide an overview of the results, some open questions, future results and lines of research, and discuss briefly the application to complex data. In the second part we give the complete definitions and proofs of Theorems A, C and E of the introduction, which show why entropy is the first homological invariant of a structure of information in four contexts: static classical or quantum probability, and dynamics of classical or quantum strategies of observation of a finite system.

Entropy 2015, 17(5), 3253–3318; Article; doi: 10.3390/e17053253; published 2015-05-13. Authors: Pierre Baudot, Daniel Bennequin.

Entropy, Vol. 17, Pages 3205-3252: Generalized Stochastic Fokker-Planck Equations
http://mdpi.com/1099-4300/17/5/3205
We consider a system of Brownian particles with long-range interactions. We go beyond the mean field approximation and take fluctuations into account. We introduce a new class of stochastic Fokker-Planck equations associated with a generalized thermodynamical formalism. Generalized thermodynamics arises in the case of complex systems experiencing small-scale constraints. In the limit of short-range interactions, we obtain a generalized class of stochastic Cahn-Hilliard equations. Our formalism has applications to several systems of physical interest, including self-gravitating Brownian particles, colloid particles at a fluid interface, superconductors of type II, nucleation, the chemotaxis of bacterial populations, and two-dimensional turbulence. We also introduce a new type of generalized entropy taking into account anomalous diffusion and exclusion or inclusion constraints.

Entropy 2015, 17(5), 3205–3252; Article; doi: 10.3390/e17053205; published 2015-05-13. Author: Pierre-Henri Chavanis.

Entropy, Vol. 17, Pages 3194-3204: Quantum Data Locking for Secure Communication against an Eavesdropper with Time-Limited Storage
http://mdpi.com/1099-4300/17/5/3194
Quantum cryptography allows for unconditionally secure communication against an eavesdropper endowed with unlimited computational power and perfect technologies, who is only constrained by the laws of physics. We review recent results showing that, under the assumption that the eavesdropper can store quantum information only for a limited time, it is possible to enhance the performance of quantum key distribution in both a quantitative and qualitative fashion. We consider quantum data locking as a cryptographic primitive and discuss secure communication and key distribution protocols. For the case of a lossy optical channel, this yields the theoretical possibility of generating secret key at a constant rate of 1 bit per mode at arbitrarily long communication distances.

Entropy 2015, 17(5), 3194–3204; Article; doi: 10.3390/e17053194; published 2015-05-13. Author: Cosmo Lupo.

Entropy, Vol. 17, Pages 3182-3193: Exact Solutions of Non-Linear Lattice Equations by an Improved Exp-Function Method
http://mdpi.com/1099-4300/17/5/3182
In this paper, the exp-function method is improved to construct exact solutions of non-linear lattice equations by modifying its exponential function ansatz. The improved method has two advantages. One is that it can solve non-linear lattice equations with variable coefficients, and the other is that it is not necessary to balance the highest order derivative with the highest order nonlinear term in the procedure of determining the exponential function ansatz. To show the advantages of this improved method, a variable-coefficient mKdV lattice equation is considered. As a result, new exact solutions, which include kink-type solutions and bell-kink-type solutions, are obtained.

Entropy 2015, 17(5), 3182–3193; Article; doi: 10.3390/e17053182; published 2015-05-13. Authors: Sheng Zhang, Jiahong Li, Yingying Zhou.

Entropy, Vol. 17, Pages 3172-3181: Existence of Ulam Stability for Iterative Fractional Differential Equations Based on Fractional Entropy
http://mdpi.com/1099-4300/17/5/3172
In this study, we introduce conditions for the existence of solutions for an iterative functional differential equation of fractional order. We prove that the solutions of the above class of fractional differential equations are bounded by Tsallis entropy. The method depends on the concept of Hyers-Ulam stability. The arbitrary order is considered in the sense of the Riemann-Liouville calculus.

Entropy 2015, 17(5), 3172–3181; Article; doi: 10.3390/e17053172; published 2015-05-13. Authors: Rabha Ibrahim, Hamid Jalab.

Entropy, Vol. 17, Pages 3160-3171: Effect of Heterogeneity in Initial Geographic Distribution on Opinions’ Competitiveness
http://mdpi.com/1099-4300/17/5/3160
Spin dynamics on networks allows us to understand how a global consensus emerges out of individual opinions. Here, we are interested in the effect of heterogeneity in the initial geographic distribution of a competing opinion on the competitiveness of its own opinion. Accordingly, in this work, we studied the effect of spatial heterogeneity on the majority rule dynamics using a three-state spin model, in which one state is neutral. Monte Carlo simulations were performed on square lattices divided into square blocks (cells). One competing opinion was distributed uniformly among the cells, whereas the spatial distribution of the rival opinion was varied from uniform to heterogeneous, with the median-to-mean ratio in the range from 1 to 0. When the size of the discussion group is odd, the uncommitted agents disappear completely after 3.30 ± 0.05 update cycles, and then the system evolves in a two-state regime with complementary spatial distributions of the two competing opinions. Even so, the initial heterogeneity in the spatial distribution of one of the competing opinions causes a decrease of this opinion's competitiveness. That is, the opinion with an initially heterogeneous spatial distribution has a lower probability of winning than the opinion with the initially uniform spatial distribution, even when the initial concentrations of both opinions are equal. We found that, although the time to consensus […], the opinion’s recession rate is determined during the first 3.3 update cycles. On the other hand, we found that the initial heterogeneity of the opinion's spatial distribution assists the formation of quasi-stable regions in which this opinion is dominant. The results of the Monte Carlo simulations are discussed with regard to the electoral competition of political parties.

Entropy 2015, 17(5), 3160–3171; Article; doi: 10.3390/e17053160; published 2015-05-13. Authors: Alexander Balankin, Miguel Martínez Cruz, Felipe Gayosso Martínez, Claudia Martínez-González, Leobardo Morales Ruiz, Julián Patiño Ortiz.

Entropy, Vol. 17, Pages 3152-3159: Continuous-Variable Entanglement Swapping
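A minimal sketch of one majority-rule update in a three-state spin model like the one described in the abstract above; how neutral agents vote, how ties are resolved, and whether the whole group adopts the majority are modeling assumptions made here for illustration, not details taken from the paper:

```python
import random

def majority_update(spins, group):
    """One majority-rule step on a discussion group: +1 and -1 are the
    competing opinions, 0 is the neutral state. The whole group adopts
    the majority non-neutral opinion (an assumed convention); on a tie
    between +1 and -1 the group is left unchanged."""
    votes = [spins[i] for i in group]
    plus, minus = votes.count(1), votes.count(-1)
    if plus > minus:
        winner = 1
    elif minus > plus:
        winner = -1
    else:
        return  # tie: no update
    for i in group:
        spins[i] = winner

# One update on a small random configuration of 9 agents.
random.seed(1)
spins = [random.choice([1, -1, 0]) for _ in range(9)]
majority_update(spins, range(9))
```

Note how the convention that neutral members also adopt the winner reproduces the qualitative effect described above: neutral agents are eliminated quickly, after which the dynamics is effectively two-state.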
http://mdpi.com/1099-4300/17/5/3152
We present a very brief overview of entanglement swapping as it relates to continuous-variable quantum information. The technical background required is discussed and the natural link to quantum teleportation is established before discussing the nature of Gaussian entanglement swapping. The limitations of Gaussian swapping are introduced, along with the general applications of swapping in the context of quantum communication and entanglement distribution. In light of this, we briefly summarize a collection of entanglement swapping schemes which incorporate a non-Gaussian ingredient, and the benefits of such schemes are noted. Finally, we motivate the need to further study and develop such schemes by highlighting the requirements of a continuous-variable repeater.

Entropy 2015, 17(5), 3152–3159; Review; doi: 10.3390/e17053152; published 2015-05-13. Authors: Kevin Marshall, Christian Weedbrook.

Entropy, Vol. 17, Pages 3124-3151: 2D Temperature Analysis of Energy and Exergy Characteristics of Laminar Steady Flow across a Square Cylinder under Strong Blockage
http://mdpi.com/1099-4300/17/5/3124
Energy and exergy characteristics of a square cylinder (SC) in confined flow are investigated computationally by numerically handling the steady-state continuity, Navier-Stokes and energy equations in the Reynolds number range of Re = 10–50, where the blockage ratio (β = B/H) is kept constant at the high level of β = 0.8. Computations indicated for the upstream region that the mean non-dimensional streamwise (u/Uo) and spanwise (v/Uo) velocities attain the values of u/Uo = 0.840→0.879 and v/Uo = 0.236→0.386 (Re = 10→50) on the front-surface of the SC, implying that the Reynolds number and blockage have a stronger impact on the spanwise momentum activity. It is determined that flows with high Reynolds number interact with the front-surface of the SC, developing thinner thermal boundary layers and greater temperature gradients, which promotes the thermal entropy generation values as well. The strict guidance of the throat not only resulted in the fully developed flow character, but also imposed additional cooling, such that the analysis pointed out the drop of duct wall (y = 0.025 m) non-dimensional temperature values (ζ) from ζ = 0.387→0.926 (Re = 10→50) at xth = 0 mm to ζ = 0.002→0.266 at xth = 40 mm. In the downstream region, spanwise thermal disturbances are evaluated to be most evident in the vortex-driven region, where the temperature values show decreasing trends in the spanwise direction. In the corresponding domain, exergy destruction is determined to grow with Reynolds number and decrease in the streamwise direction (xds = 0→10 mm). Besides, asymmetric entropy distributions were recorded as well, due to the comprehensive mixing caused by the vortex system.

Entropy 2015, 17(5), 3124–3151; Article; doi: 10.3390/e17053124; published 2015-05-12. Author: M. Korukcu.

Entropy, Vol. 17, Pages 3110-3123: The Multiscale Entropy Algorithm and Its Variants: A Review
http://mdpi.com/1099-4300/17/5/3110
Multiscale entropy (MSE) analysis was introduced in 2002 to evaluate the complexity of a time series by quantifying its entropy over a range of temporal scales. The algorithm has been successfully applied in different research fields. Since its introduction, a number of modifications and refinements have been proposed, some aimed at increasing the accuracy of the entropy estimates, others at exploring alternative coarse-graining procedures. In this review, we first describe the original MSE algorithm. Then, we review algorithms that have been introduced to improve the estimation of MSE. We also report a recent generalization of the method to higher moments.

Entropy 2015, 17(5), 3110–3123; Review; doi: 10.3390/e17053110; published 2015-05-12. Author: Anne Humeau-Heurtier.

Entropy, Vol. 17, Pages 3097-3109: Exponential Outer Synchronization between Two Uncertain Time-Varying Complex Networks with Nonlinear Coupling
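As background for the review above, the original MSE procedure (coarse-grain the series, then estimate sample entropy at each scale) can be sketched as follows. This is a simplified illustration: the quadratic-time matching loop and the fixed tolerance r are shortcuts (the standard choice sets r as a fraction of the standard deviation of the original series and keeps it fixed across scales):

```python
import math

def coarse_grain(x, scale):
    # Non-overlapping window averaging used by the original MSE algorithm.
    n = len(x) // scale
    return [sum(x[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(x, m=2, r=0.2):
    # Sample entropy: -ln(A/B), where B counts pairs of matching
    # templates of length m and A of length m+1, with Chebyshev
    # distance tolerance r.
    def count(mm):
        templates = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        c = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    c += 1
        return c
    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def mse(x, max_scale=3, m=2, r=0.2):
    # Entropy profile over temporal scales 1..max_scale.
    return [sample_entropy(coarse_grain(x, s), m, r) for s in range(1, max_scale + 1)]
```

Production implementations normalize r to the signal's standard deviation and use vectorized or tree-based template matching; this sketch only shows the structure of the algorithm.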
http://mdpi.com/1099-4300/17/5/3097
This paper studies the problem of exponential outer synchronization between two uncertain nonlinearly coupled complex networks with time delays. In order to synchronize uncertain complex networks, an adaptive control scheme is designed based on the Lyapunov stability theorem. Simultaneously, the unknown system parameters of uncertain complex networks are identified when exponential outer synchronization occurs. Finally, numerical examples are provided to demonstrate the feasibility and effectiveness of the theoretical results.

Entropy 2015, 17(5), 3097–3109; Article; doi: 10.3390/e17053097; published 2015-05-11. Authors: Yongqing Wu, Li Liu.

Entropy, Vol. 17, Pages 3053-3096: Predicting Community Evolution in Social Networks
http://mdpi.com/1099-4300/17/5/3053
Nowadays, sustained development of different social media can be observed worldwide. One of the relevant research domains intensively explored recently is the analysis of social communities existing in social media, as well as the prediction of their future evolution taking into account collected historical evolution chains. These evolution chains, proposed in the paper, contain group states in the previous time frames and their historical transitions, identified using one of two methods: Stable Group Changes Identification (SGCI) and Group Evolution Discovery (GED). Based on the observed evolution chains of various lengths, structural network features are extracted, validated and selected, and then used to learn classification models. The experimental studies were performed on three real datasets with different profiles: DBLP, Facebook and the Polish blogosphere. The process of group prediction was analysed with respect to different classifiers as well as various descriptive feature sets extracted from evolution chains of different lengths. The results revealed that, in general, the longer the evolution chains, the better the predictive abilities of the classification models. However, chains of length 3 to 7 enabled the GED-based method to almost reach its maximum possible prediction quality; for SGCI, this value was reached at the level of the 3–5 last periods.

Entropy 2015, 17(5), 3053–3096; Article; doi: 10.3390/e17053053; published 2015-05-11. Authors: Stanisław Saganowski, Bogdan Gliwa, Piotr Bródka, Anna Zygmunt, Przemysław Kazienko, Jarosław Koźlak.

Entropy, Vol. 17, Pages 3035-3052: Dimensional Upgrade Approach for Spatial-Temporal Fusion of Trend Series in Subsidence Evaluation
http://mdpi.com/1099-4300/17/5/3035
Physical models and grey system models (GSMs) are commonly used to evaluate and predict physical behavior. A physical model avoids the incorrect trend series of a GSM, whereas a GSM avoids the assumptions and uncertainty of a physical model. A technique that combines the results of physical models and GSMs would make prediction more reasonable and reliable. This study proposes a fusion method for combining two trend series, calculated using two one-dimensional models, respectively, that uses a slope criterion and a distance weighting factor in the temporal and spatial domains. The independent one-dimensional evaluations are upgraded to a spatially and temporally connected two-dimensional distribution. The proposed technique was applied to a subsidence problem in Jhuoshuei River Alluvial Fan, Taiwan. The fusion results show dramatic decreases of subsidence quantity and rate compared to those estimated by the GSM. The subsidence behavior estimated using the proposed method is physically reasonable due to a convergent trend of subsidence under the assumption of constant discharge of groundwater. The technique proposed in this study can be used in fields that require a combination of two trend series from physical and nonphysical models.

Entropy 2015, 17(5), 3035–3052; Communication; doi: 10.3390/e17053035; published 2015-05-11. Author: Shih-Jung Wang.

Entropy, Vol. 17, Pages 2988-3034: Log-Determinant Divergences Revisited: Alpha-Beta and Gamma Log-Det Divergences
http://mdpi.com/1099-4300/17/5/2988
This work reviews and extends a family of log-determinant (log-det) divergences for symmetric positive definite (SPD) matrices and discusses their fundamental properties. We show how to use parameterized Alpha-Beta (AB) and Gamma log-det divergences to generate many well-known divergences; in particular, we consider Stein's loss, the S-divergence, also called the Jensen-Bregman LogDet (JBLD) divergence, the Logdet Zero (Bhattacharyya) divergence, the Affine Invariant Riemannian Metric (AIRM), and other divergences. Moreover, we establish links and correspondences between log-det divergences and visualise them on an alpha-beta plane for various sets of parameters. We use this unifying framework to interpret and extend existing similarity measures for semidefinite covariance matrices in finite-dimensional Reproducing Kernel Hilbert Spaces (RKHS). This paper also shows how the Alpha-Beta family of log-det divergences relates to the divergences of multivariate and multilinear normal distributions. Closed form formulas are derived for Gamma divergences of two multivariate Gaussian densities; the special cases of the Kullback-Leibler, Bhattacharyya, Rényi, and Cauchy-Schwarz divergences are discussed. Symmetrized versions of log-det divergences are also considered and briefly reviewed. Finally, a class of divergences is extended to multiway divergences for separable covariance (or precision) matrices.

Entropy 2015, 17(5), 2988–3034; Review; doi: 10.3390/e17052988; published 2015-05-08. Authors: Andrzej Cichocki, Sergio Cruces, Shun-ichi Amari.

Entropy, Vol. 17, Pages 2973-2987: Kolmogorov Complexity Based Information Measures Applied to the Analysis of Different River Flow Regimes
http://mdpi.com/1099-4300/17/5/2973
We have used the Kolmogorov complexities and the Kolmogorov complexity spectrum to quantify the degree of randomness in river flow time series of seven rivers with different regimes in Bosnia and Herzegovina, representing their different types of courses, for the period 1965–1986. In particular, we have examined: (i) the Neretva, Bosnia and the Drina (mountain and lowland parts), (ii) the Miljacka and the Una (mountain part) and the Vrbas and the Ukrina (lowland part) and then calculated the Kolmogorov complexity (KC) based on the Lempel–Ziv Algorithm (LZA) (lower—KCL and upper—KCU), Kolmogorov complexity spectrum highest value (KCM) and overall Kolmogorov complexity (KCO) values for each time series. The results indicate that the KCL, KCU, KCM and KCO values in seven rivers show some similarities regardless of the amplitude differences in their monthly flow rates. The KCL, KCU and KCM complexities as information measures do not “see” a difference between time series which have different amplitude variations but similar random components. However, it seems that the KCO information measure better takes into account both the amplitude and the position of the components in a time series.Entropy2015-05-08175Article10.3390/e17052973297329871099-43002015-05-08doi: 10.3390/e17052973Dragutin MihailovićGordan MimićNusret DreškovićIlija Arsenić<![CDATA[Entropy, Vol. 17, Pages 2958-2972: Maximum Entropy Method for Operational Loads Feedback Using Concrete Dam Displacement]]>
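A minimal sketch of the Lempel–Ziv idea underlying these measures: the classic LZ76 production count (in the Kaspar–Schuster formulation) applied to a series symbolized around its empirical median. The paper's specific KCL/KCU/KCM/KCO normalizations are not reproduced; this only shows the raw complexity count on a binary symbolization, and the `flow` values are made-up numbers, not river data.

```python
from statistics import median

def lz76(s):
    """LZ76 complexity: number of new 'phrases' met while scanning s
    (Kaspar-Schuster counting scheme)."""
    n = len(s)
    i, k, l = 0, 1, 1
    k_max, c = 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            k_max = max(k_max, k)
            i += 1
            if i == l:          # no earlier copy found: a new phrase begins
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

def symbolize(x):
    """Binary symbolization around the empirical median."""
    m = median(x)
    return ''.join('1' if v > m else '0' for v in x)

# a constant series has minimal complexity; an irregular one has more
flow = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3]
```

A constant string yields the minimal count of 2 and a strictly periodic string 3, so anything above that signals genuine irregularity in the symbol sequence.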
http://mdpi.com/1099-4300/17/5/2958
Safety control of concrete dams is required due to the potentially great loss of life and property in case of dam failure. The purpose of this paper is to feed back the operational control loads for concrete dam displacement using the maximum entropy method. The proposed method is not aimed at a judgement about the safety conditions of the dam. When a strong trend-line effect is evident, the method should be carefully applied. In these cases, the hydrostatic and temperature effects are added to the irreversible displacements, and thus the maximum operational loads should be reduced accordingly. The probability density function for the extreme load effect component of dam displacement can be selected by employing the principle of maximum entropy, which is effective for constructing the least subjective probability density distribution given only the moment information from the data. The critical load effect component in the warning criterion can be determined through the corresponding cumulative distribution function obtained by the maximum entropy method. Then the control loads feedback of concrete dam displacement is realized by the proposed warning criterion. The proposed method is applied to a concrete dam. A comparison of the results shows that the maximum entropy method can feed back rational control loads for the dam displacement. The control loads diagram obtained can be a straightforward and visual tool for the operation and management department of the concrete dam. The result from the proposed method is recommended due to its minimal subjectivity.Entropy2015-05-08175Article10.3390/e17052958295829721099-43002015-05-08doi: 10.3390/e17052958Jingmei ZhangChongshi Gu<![CDATA[Entropy, Vol. 17, Pages 2932-2957: Oxygen Saturation and RR Intervals Feature Selection for Sleep Apnea Detection]]>
http://mdpi.com/1099-4300/17/5/2932
A diagnostic system for sleep apnea based on oxygen saturation and RR intervals obtained from the EKG (electrocardiogram) is proposed with the goal of detecting and quantifying minute-long segments of sleep with breathing pauses. We measured the discriminative capacity of combinations of features obtained from RR series and oximetry to evaluate improvements of the performance compared to oximetry-based features alone. Time and frequency domain variables derived from oxygen saturation (SpO2) as well as linear and non-linear variables describing the RR series have been explored in recordings from 70 patients with suspected sleep apnea. We applied forward feature selection in order to select a minimal set of variables that are able to locate patterns indicating respiratory pauses. Linear discriminant analysis (LDA) was used to classify the presence of apnea during specific segments. The system finally provides a global score indicating the presence of clinically significant apnea by integrating the segment-based apnea detection. LDA results in an accuracy of 87%, sensitivity of 76% and specificity of 91% (AUC = 0.90), with a global classification rate of 97% when only oxygen saturation is used. When features from the RR series are additionally included, the system performance improves to an accuracy of 87%, sensitivity of 73% and specificity of 92% (AUC = 0.92), with a global classification rate of 100%.Entropy2015-05-07175Article10.3390/e17052932293229571099-43002015-05-07doi: 10.3390/e17052932Antonio Ravelo-GarcíaJan KraemerJuan Navarro-MesaEduardo Hernández-PérezJavier Navarro-EstevaGabriel Juliá-SerdáThomas PenzelNiels Wessel<![CDATA[Entropy, Vol. 17, Pages 2919-2931: Three-Stage Quantum Cryptography Protocol under Collective-Rotation Noise]]>
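For readers unfamiliar with the classifier, a two-class LDA of the kind used for the per-segment decision can be sketched in a few lines of NumPy. The two feature dimensions below are synthetic stand-ins, not the SpO2/RR features of the study, and the cluster means are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# synthetic stand-ins for per-segment feature vectors of two classes
normal = rng.normal([0.0, 0.0], 1.0, size=(n, 2))
apnea = rng.normal([2.5, 2.5], 1.0, size=(n, 2))
X = np.vstack([normal, apnea])
y = np.array([0] * n + [1] * n)

# two-class LDA: project onto w = Sw^-1 (mu1 - mu0),
# then threshold at the midpoint of the projected class means
mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)   # pooled within-class scatter
w = np.linalg.solve(Sw, mu1 - mu0)
thr = w @ (mu0 + mu1) / 2
pred = (X @ w > thr).astype(int)
accuracy = (pred == y).mean()
```

With well-separated clusters the midpoint rule already classifies most segments correctly; the study's pipeline adds feature selection and a global integration step on top of this basic discriminant.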
http://mdpi.com/1099-4300/17/5/2919
Information security is increasingly important as society migrates to the information age. The classical cryptography widely used nowadays is based on computational complexity: it assumes that solving certain mathematical problems is hard on a classical computer. With the development of supercomputers and, potentially, quantum computers, classical cryptography faces more and more potential risks. Quantum cryptography provides a solution based on the Heisenberg uncertainty principle and the no-cloning theorem. While BB84-based quantum protocols are only secure when a single photon is used in communication, the three-stage quantum protocol is multi-photon tolerant. However, existing analyses assume perfect noiseless channels. In this paper, a multi-photon analysis is performed for the three-stage quantum protocol under the collective-rotation noise model. The analysis provides insights into the impact of the noise level on a three-stage quantum cryptography system.Entropy2015-05-07175Article10.3390/e17052919291929311099-43002015-05-07doi: 10.3390/e17052919Linsen WuYuhua Chen<![CDATA[Entropy, Vol. 17, Pages 2895-2918: AIM for Allostery: Using the Ising Model to Understand Information Processing and Transmission in Allosteric Biomolecular Systems]]>
http://mdpi.com/1099-4300/17/5/2895
In performing their biological functions, molecular machines must process and transmit information with high fidelity. Information transmission requires dynamic coupling between the conformations of discrete structural components within the protein positioned far from one another on the molecular scale. This type of biomolecular “action at a distance” is termed allostery. Although allostery is ubiquitous in biological regulation and signal transduction, its treatment in theoretical models has mostly eschewed quantitative descriptions involving the system’s underlying structural components and their interactions. Here, we show how Ising models can be used to formulate an approach to allostery in a structural context of interactions between the constitutive components by building simple allosteric constructs we termed Allosteric Ising Models (AIMs). We introduce the use of AIMs in analytical and numerical calculations that relate thermodynamic descriptions of allostery to the structural context, and then show that many fundamental properties of allostery, such as the multiplicative property of parallel allosteric channels, are revealed from the analysis of such models. The power of exploring mechanistic structural models of allosteric function in more complex systems by using AIMs is demonstrated by building a model of allosteric signaling for an experimentally well-characterized asymmetric homodimer of the dopamine D2 receptor.Entropy2015-05-07175Article10.3390/e17052895289529181099-43002015-05-07doi: 10.3390/e17052895Michael LeVineHarel Weinstein<![CDATA[Entropy, Vol. 17, Pages 2876-2894: Properties of Nonnegative Hermitian Matrices and New Entropic Inequalities for Noncomposite Quantum Systems]]>
http://mdpi.com/1099-4300/17/5/2876
We consider the probability distributions, spin (qudit)-state tomograms and density matrices of quantum states, and their information characteristics, such as Shannon and von Neumann entropies and q-entropies, from the viewpoints of both well-known purely mathematical features of nonnegative numbers and nonnegative matrices and their physical characteristics, such as entanglement and other quantum correlation phenomena. We review entropic inequalities such as the Araki–Lieb inequality and the subadditivity and strong subadditivity conditions known for bipartite and tripartite systems, and recently obtained for single qudit states. We present explicit matrix forms of the known and some new entropic inequalities associated with quantum states of composite and noncomposite systems. We discuss the tomographic probability distributions of qudit states and demonstrate the inequalities for tomographic entropies of the qudit states. In addition, we mention the possibility of using the discussed information properties of single qudit states in quantum technologies based on multilevel atoms and quantum circuits made of Josephson junctions.Entropy2015-05-06175Review10.3390/e17052876287628941099-43002015-05-06doi: 10.3390/e17052876Margarita Man'koVladimir Man'ko<![CDATA[Entropy, Vol. 17, Pages 2862-2875: Stabilization Effects of Dichotomous Noise on the Lifetime of the Superconducting State in a Long Josephson Junction]]>
http://mdpi.com/1099-4300/17/5/2862
We investigate the superconducting lifetime of a long overdamped current-biased Josephson junction, in the presence of telegraph noise sources. The analysis is performed by randomly choosing the initial condition for the noise source. However, in order to investigate how the initial value of the dichotomous noise affects the phase dynamics, we extend our analysis using two different fixed initial values for the source of random fluctuations. In our study, the phase dynamics of the Josephson junction is analyzed as a function of the noise signal intensity, for different values of the parameters of the system and external driving currents. We find that the mean lifetime of the superconductive metastable state as a function of the noise intensity is characterized by nonmonotonic behavior, strongly related to the soliton dynamics during the switching towards the resistive state. The role of the correlation time of the noise source is also taken into account. Noise-enhanced stability is observed in the investigated system.Entropy2015-05-06175Article10.3390/e17052862286228751099-43002015-05-06doi: 10.3390/e17052862Claudio GuarcelloDavide ValentiAngelo CarolloBernardo Spagnolo<![CDATA[Entropy, Vol. 17, Pages 2853-2861: Conceptual Inadequacy of the Shore and Johnson Axioms for Wide Classes of Complex Systems]]>
http://mdpi.com/1099-4300/17/5/2853
It is by now well known that the Boltzmann-Gibbs-von Neumann-Shannon logarithmic entropic functional (\(S_{BG}\)) is inadequate for wide classes of strongly correlated systems: see for instance Brukner and Zeilinger's 2001 {\it Conceptual inadequacy of the Shannon information in quantum measurements}, among many other systems exhibiting various forms of complexity. On the other hand, the Shannon and Khinchin axioms uniquely mandate the BG form \(S_{BG}=-k\sum_i p_i \ln p_i\); the Shore and Johnson axioms follow the same path. Many natural, artificial and social systems have been satisfactorily approached with nonadditive entropies such as the \(S_q=k \frac{1-\sum_i p_i^q}{q-1}\) one (\(q \in {\cal R}; \,S_1=S_{BG}\)), basis of nonextensive statistical mechanics. Consistently, the Shannon 1948 and Khinchin 1953 uniqueness theorems have already been generalized in the literature, by Santos 1997 and Abe 2000 respectively, in order to uniquely mandate \(S_q\). We argue here that the same remains to be done with the Shore and Johnson 1980 axioms. We arrive at this conclusion by analyzing specific classes of strongly correlated complex systems that await such generalization.Entropy2015-05-05175Article10.3390/e17052853285328611099-43002015-05-05doi: 10.3390/e17052853Constantino Tsallis<![CDATA[Entropy, Vol. 17, Pages 2834-2852: A Novel Risk Metric for Staff Turnover in a Software Project Based on Information Entropy]]>
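The nonadditive functional \(S_q\) and its pseudo-additivity, the property that distinguishes it from the additive \(S_{BG}\), can be checked numerically; a minimal sketch with \(k=1\):

```python
import math

def S_q(p, q, k=1.0):
    """Tsallis entropy S_q = k (1 - sum_i p_i^q)/(q - 1); S_1 = S_BG."""
    if abs(q - 1.0) < 1e-12:
        # BG limit q -> 1
        return -k * sum(pi * math.log(pi) for pi in p if pi > 0)
    return k * (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

pa = [0.5, 0.3, 0.2]
pb = [0.6, 0.4]
pab = [a * b for a in pa for b in pb]   # independent joint distribution
q = 2.0
# pseudo-additivity for independent systems:
# S_q(A,B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B)
lhs = S_q(pab, q)
rhs = S_q(pa, q) + S_q(pb, q) + (1 - q) * S_q(pa, q) * S_q(pb, q)
```

For the uniform distribution over \(W\) states, \(S_q\) reduces to the q-logarithm of \(W\) and recovers \(k\ln W\) as \(q\to1\), which the assertions below verify.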
http://mdpi.com/1099-4300/17/5/2834
Staff turnover in a software project is a significant risk that can result in project failure. Despite the urgency of this issue, however, relevant studies are limited and are mostly qualitative; quantitative studies are extremely rare. This paper proposes a novel risk metric for staff turnover in a software project based on the information entropy theory. To address the gaps in existing studies, five aspects are considered, namely, staff turnover probability, turnover type, staff level, software project complexity, and staff order degree. This paper develops a method of calculating staff turnover risk probability in a software project based on the field, equity, and goal congruence theories. The proposed method avoids subjectivity in estimating the probability, making it more objective and comprehensive than existing approaches. This paper not only presents a detailed, operable model, but also theoretically demonstrates the soundness and rationality of the approach. The case study performed here indicates that the approach is reasonable, effective, and feasible.Entropy2015-05-04175Article10.3390/e17052834283428521099-43002015-05-04doi: 10.3390/e17052834Rong Jiang<![CDATA[Entropy, Vol. 17, Pages 2812-2833: On the κ-Deformed Cyclic Functions and the Generalized Fourier Series in the Framework of the κ-Algebra]]>
http://mdpi.com/1099-4300/17/5/2812
We explore two possible generalizations of the Euler formula for the complex \(\kappa\)-exponential, which give two different sets of \(\kappa\)-deformed cyclic functions endowed with different analytical properties. In one case, the \(\kappa\)-sine and \(\kappa\)-cosine functions take real values on \(\Re\) and are characterized by an asymptotic log-periodic behavior. In the other case, the \(\kappa\)-cyclic functions take real values only in the region \(|x|\leq1/|\kappa|\), while, for \(|x|&gt;1/|\kappa|\), they assume purely imaginary values with an increasing modulus. However, the main mathematical properties of the standard cyclic functions, suitably reformulated in the formalism of the \(\kappa\)-mathematics, are fulfilled by the two sets of the \(\kappa\)-trigonometric functions. In both cases, we study the orthogonality and the completeness relations and introduce their respective generalized Fourier series for square integrable functions.Entropy2015-05-04175Article10.3390/e17052812281228331099-43002015-05-04doi: 10.3390/e17052812Antonio Scarfone<![CDATA[Entropy, Vol. 17, Pages 2781-2811: The Grading Entropy-based Criteria for Structural Stability of Granular Materials and Filters]]>
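A numerical sketch, assuming the Kaniadakis form \(\exp_\kappa(z)=(\sqrt{1+\kappa^2z^2}+\kappa z)^{1/\kappa}\) and the Euler-type definitions \(\cos_\kappa(x)=(\exp_\kappa(ix)+\exp_\kappa(-ix))/2\) and \(\sin_\kappa(x)=(\exp_\kappa(ix)-\exp_\kappa(-ix))/2i\). This construction yields functions that are real for \(|x|\leq1/|\kappa|\), matching the behavior of the second set described above, and the Pythagorean identity survives the deformation there.

```python
import cmath
import math

def exp_k(z, kappa):
    """kappa-exponential (sqrt(1 + kappa^2 z^2) + kappa z)^(1/kappa);
    reduces to exp(z) as kappa -> 0."""
    if kappa == 0:
        return cmath.exp(z)
    return (cmath.sqrt(1 + kappa ** 2 * z * z) + kappa * z) ** (1.0 / kappa)

def cos_k(x, kappa):
    return (exp_k(1j * x, kappa) + exp_k(-1j * x, kappa)) / 2

def sin_k(x, kappa):
    return (exp_k(1j * x, kappa) - exp_k(-1j * x, kappa)) / 2j

kappa, x = 0.3, 0.5            # inside the real region |x| <= 1/kappa
c, s = cos_k(x, kappa), sin_k(x, kappa)
```

As \(\kappa\to0\) the deformed functions collapse onto the ordinary sine and cosine, which gives a quick consistency check of the implementation.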
http://mdpi.com/1099-4300/17/5/2781
This paper deals with three grading entropy-based rules that describe different soil structure stability phenomena: an internal stability rule, a filtering rule and a segregation rule. These rules are elaborated on the basis of a large amount of laboratory testing and from existing knowledge in the field. Use is made of the theory of grading entropy to derive parameters which incorporate all of the information of the grading curve into a pair of entropy-based parameters that allow soils with common behaviours to be grouped into domains on an entropy diagram. Applications of the derived entropy-based rules are presented by examining the cause of a dam failure, by testing against the existing filter rules from the literature, and by giving some examples for the design of non-segregating grading curves (discrete particle size distributions by dry weight). A physical basis for the internal stability rule is established, wherein the higher values of base entropy required for granular stability are shown to reflect the closeness between the mean and maximum grain diameters, which explains how there are sufficient coarser grains to achieve a stable grain skeleton.Entropy2015-05-04175Article10.3390/e17052781278128111099-43002015-05-04doi: 10.3390/e17052781Janos LőrinczEmöke ImreStephen FityusPhong TrangTibor TarnaiIstván TalataVijay Singh<![CDATA[Entropy, Vol. 17, Pages 2764-2780: Estimating the Lower Limit of the Impact of Amines on Nucleation in the Earth’s Atmosphere]]>
http://mdpi.com/1099-4300/17/5/2764
Amines, organic derivatives of NH3, are important common trace atmospheric species that can enhance new particle formation in the Earth’s atmosphere under favorable conditions. While methylamine (MA), dimethylamine (DMA) and trimethylamine (TMA) all efficiently enhance binary nucleation, MA may represent the lower limit of the enhancing effect of amines on atmospheric nucleation. In the present paper, we report new thermochemical data concerning MA-enhanced nucleation, which were obtained using the DFT PW91PW91/6-311++G (3df, 3pd) method, and investigate the enhancement in production of stable pre-nucleation clusters due to the MA. We found that the MA ternary nucleation begins to dominate over ternary nucleation of sulfuric acid, water and ammonia at [MA]/[NH3] &gt; ~10−3. This means that under real atmospheric conditions ([MA] ~ 1 ppt, [NH3] ~ 1 ppb) the lower limit of the enhancement due to methylamines is either close to or higher than the typical effect of NH3. A very strong impact of the MA is observed at low RH; however, it decreases quickly as the RH grows. Low RH and low ambient temperatures were found to be particularly favorable for the enhancement in production of stable sulfuric acid-water clusters due to the MA.Entropy2015-04-30175Article10.3390/e17052764276427801099-43002015-04-30doi: 10.3390/e17052764Alexey NadyktoJason HerbFangqun YuYisheng XuEkaterina Nazarenko<![CDATA[Entropy, Vol. 17, Pages 2749-2763: Detection of Changes in Ground-Level Ozone Concentrations via Entropy]]>
http://mdpi.com/1099-4300/17/5/2749
Ground-level ozone concentration is a key indicator of air quality. There may exist sudden changes in ozone concentration data over a long time horizon, which may be caused by the implementation of government regulations and policies, such as establishing exhaust emission limits for on-road vehicles. To monitor and assess the efficacy of these policies, we propose a methodology for detecting changes in ground-level ozone concentrations, which consists of three major steps: data transformation, simultaneous autoregressive modelling and change-point detection on the estimated entropy. To show the effectiveness of the proposed methodology, the methodology is applied to detect changes in ground-level ozone concentration data collected in the Toronto region of Canada between June and September for the years from 1988 to 2009. The proposed methodology is also applicable to other climate data.Entropy2015-04-30175Article10.3390/e17052749274927631099-43002015-04-30doi: 10.3390/e17052749Yuehua WuBaisuo JinElton Chan<![CDATA[Entropy, Vol. 17, Pages 2741-2748: Synthesis and Surface Thermodynamic Functions of CaMoO4 Nanocakes]]>
http://mdpi.com/1099-4300/17/5/2741
CaMoO4 nanocakes with uniform size and morphology were prepared on a large scale via a room temperature reverse-microemulsion method. The products were characterized in detail by X-ray powder diffraction, field-emission scanning electron microscopy, transmission electron microscopy, and high-resolution transmission electron microscopy. By establishing the relations between the thermodynamic functions of nano-CaMoO4 and bulk-CaMoO4 reaction systems, the equations for calculating the surface thermodynamic functions of nano-CaMoO4 were derived. Then, combined with in-situ microcalorimetry, the molar surface enthalpy, molar surface Gibbs free energy, and molar surface entropy of the prepared CaMoO4 nanocakes at 298.15 K were successfully obtained as (19.674 ± 0.017) kJ·mol−1, (619.704 ± 0.016) J·mol−1, and (63.908 ± 0.057) J·mol−1·K−1, respectively.Entropy2015-04-30175Article10.3390/e17052741274127481099-43002015-04-30doi: 10.3390/e17052741Xingxing LiGaochao FanZaiyin Huang<![CDATA[Entropy, Vol. 17, Pages 2723-2740: Finite Key Size Analysis of Two-Way Quantum Cryptography]]>
http://mdpi.com/1099-4300/17/5/2723
Quantum cryptographic protocols solve the longstanding problem of distributing a shared secret string to two distant users, typically by making use of a one-way quantum channel. However, alternative protocols exploiting a two-way quantum channel have been proposed for the same goal and with potential advantages. Here, we overview a security proof for two-way quantum key distribution protocols, against the most general eavesdropping attack, that utilizes an entropic uncertainty relation. Then, by resorting to the “smooth” version of the involved entropies, we extend such a proof to the case of finite key size. The results are compared to those available for one-way protocols, showing some advantages.Entropy2015-04-30175Article10.3390/e17052723272327401099-43002015-04-30doi: 10.3390/e17052723Jesni ShaariStefano Mancini<![CDATA[Entropy, Vol. 17, Pages 2706-2722: Identifying the Most Relevant Lag with Runs]]>
http://mdpi.com/1099-4300/17/5/2706
In this paper, we propose a nonparametric statistical tool to identify the most relevant lag in the model description of a time series. It is also shown that it can be used for model identification. The statistic is based on the number of runs, when the time series is symbolized depending on the empirical quantiles of the time series. With a Monte Carlo simulation, we show the size and power performance of our new test statistic under linear and nonlinear data generating processes. From the theoretical point of view, it is the first time that symbolic analysis and runs are proposed for identifying characteristic lags and for helping in the identification of univariate time series models. From a more applied point of view, the results show the power and competitiveness of the proposed tool with respect to other techniques without presuming or specifying a model.Entropy2015-04-28175Article10.3390/e17052706270627221099-43002015-04-28doi: 10.3390/e17052706Úrsula FauraMatilde LafuenteMariano Matilla-GarcíaManuel Ruiz<![CDATA[Entropy, Vol. 17, Pages 2688-2705: A Fuzzy Logic-Based Approach for Estimation of Dwelling Times of Panama Metro Stations]]>
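The core ingredient, counting runs in a quantile-symbolized series, is easy to sketch. Below is a binary (median) symbolization with the classical Wald–Wolfowitz normal approximation; this conveys the idea, though the paper's lag-dependent statistic is more elaborate than this sketch.

```python
import math
from statistics import median

def count_runs(s):
    """Number of runs: maximal blocks of equal consecutive symbols."""
    return 1 + sum(a != b for a, b in zip(s, s[1:]))

def runs_z(x):
    """z-score of the run count of the median-symbolized series;
    under independence the run count is approximately normal."""
    m = median(x)
    s = [1 if v > m else 0 for v in x]
    n1 = sum(s)
    n0 = len(s) - n1
    r = count_runs(s)
    mu = 2 * n1 * n0 / (n1 + n0) + 1
    var = (2 * n1 * n0 * (2 * n1 * n0 - n1 - n0)
           / ((n1 + n0) ** 2 * (n1 + n0 - 1)))
    return (r - mu) / math.sqrt(var)

trend = list(range(100))       # strong positive dependence: few runs
alternating = [0, 1] * 50      # strong negative dependence: many runs
```

Too few runs (large negative z) signals positive serial dependence, too many (large positive z) negative dependence; the same count can then be examined lag by lag.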
http://mdpi.com/1099-4300/17/5/2688
Passenger flow modeling and station dwelling time estimation are significant elements for railway mass transit planning, but system operators usually have limited information to model the passenger flow. In this paper, an artificial-intelligence technique known as fuzzy logic is applied for the estimation of the elements of the origin-destination matrix and the dwelling time of stations in a railway transport system. The fuzzy inference engine used in the algorithm is based on the principle of maximum entropy. The approach considers passengers’ preferences to assign a level of congestion to each car of the train as a function of the properties of the station platforms. This approach is implemented to estimate the passenger flow and dwelling times of the recently opened Line 1 of the Panama Metro. The dwelling times obtained from the simulation are compared to real measurements to validate the approach.Entropy2015-04-27175Article10.3390/e17052688268827051099-43002015-04-27doi: 10.3390/e17052688Aranzazu Berbey AlvarezFernando MerchanFrancisco Calvo PoyoRony Caballero George<![CDATA[Entropy, Vol. 17, Pages 2677-2687: Projective Synchronization of Chaotic Discrete Dynamical Systems via Linear State Error Feedback Control]]>
http://mdpi.com/1099-4300/17/5/2677
A projective synchronization scheme for a kind of n-dimensional discrete dynamical system is proposed by means of a linear feedback control technique. The scheme consists of master and slave discrete dynamical systems coupled by linear state error variables. A novel kind of 3-D chaotic discrete system is constructed, to which the test for chaos is applied. By using the stability principles of an upper or lower triangular matrix, two controllers for achieving projective synchronization are designed and illustrated with the novel systems. Lastly, some numerical simulations are employed to validate the effectiveness of the proposed projective synchronization scheme.Entropy2015-04-27175Article10.3390/e17052677267726871099-43002015-04-27doi: 10.3390/e17052677Baogui XinZhiheng Wu<![CDATA[Entropy, Vol. 17, Pages 2655-2676: State Feedback with Memory for Constrained Switched Positive Linear Systems]]>
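A one-dimensional toy version (not the paper's 3-D system) shows the mechanism: the slave copies the master's drive and adds a linear state-error feedback term, so the projective error obeys \(e_{k+1}=c\,e_k\) and vanishes for \(|c|<1\). The map, the scaling factor \(\alpha\) and the gain \(c\) below are illustrative choices.

```python
def master(x):
    """Chaotic 1-D map x_{k+1} = 1 - 2 x_k^2 (conjugate to the logistic map)."""
    return 1.0 - 2.0 * x * x

alpha = 2.0   # projective scaling factor (illustrative choice)
c = 0.5       # feedback gain; |c| < 1 makes the error contract

x, y = 0.3, 1.7
for _ in range(60):
    # slave: copy of the master's drive plus linear state-error feedback,
    # so e_{k+1} = y_{k+1} - alpha*x_{k+1} = c * (y_k - alpha*x_k)
    x, y = master(x), alpha * master(x) + c * (y - alpha * x)
```

After the transient the slave tracks \(\alpha\) times the master's chaotic orbit, which is exactly the projective synchronization property the abstract describes.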
http://mdpi.com/1099-4300/17/5/2655
In this paper, the stabilization problem for switched linear systems with time-varying delay under state and control constraints is investigated. The synthesis of bounded state-feedback controllers with memory ensures that the closed-loop state is positive and stable. Firstly, synthesis with a sign-restricted (nonnegative and negative) control is considered for general switched systems; then, the stabilization issue under bounded controls, including asymmetrically bounded controls and state constraints, is addressed. In addition, the results are extended to systems with interval and polytopic uncertainties. All the proposed conditions are solvable in terms of linear programming. Numerical examples illustrate the applicability of the results.Entropy2015-04-27175Article10.3390/e17052655265526761099-43002015-04-27doi: 10.3390/e17052655Jinjin LiuKanjian Zhang<![CDATA[Entropy, Vol. 17, Pages 2642-2654: Stochastic Processes via the Pathway Model]]>
http://mdpi.com/1099-4300/17/5/2642
After collecting data from observations or experiments, the next step is to analyze the data to build an appropriate mathematical or stochastic model to describe the data so that further studies can be done with the help of the model. In this article, the input-output type mechanism is considered first, where reaction, diffusion, reaction-diffusion, and production-destruction type physical situations can fit in. Then techniques are described to produce thicker or thinner tails (power law behavior) in stochastic models. Then the pathway idea is described where one can switch to different functional forms of the probability density function through a parameter called the pathway parameter. The paper is a continuation of related solar neutrino research published previously in this journal.Entropy2015-04-24175Article10.3390/e17052642264226541099-43002015-04-24doi: 10.3390/e17052642Arak MathaiHans Haubold<![CDATA[Entropy, Vol. 17, Pages 2624-2641: Recurrence Plot Based Damage Detection Method by Integrating Control Chart]]>
http://mdpi.com/1099-4300/17/5/2624
Because of the importance of damage detection in manufacturing systems and other areas, many fault detection methods have been developed that are based on a vibration signal. Little work, however, has been reported in the literature on using a recurrence plot method to analyze the vibration signal for damage detection. In this paper, we develop a recurrence plot based fault detection method by integrating the statistical process control technique. The recurrence plots of the vibration signals are derived by using the recurrence plot (RP) method. Five types of features are extracted from the recurrence plots to quantify the vibration signals’ characteristics. Then, the control chart, a multivariate statistical process control technique, is used to monitor these features. The control chart technique, however, assumes that all the data follow a normal distribution. To relax this assumption, the RP based bootstrap control chart is proposed to estimate the control chart parameters. The performance of the proposed RP based bootstrap control chart is evaluated by a simulation study and compared with other univariate bootstrap control charts based on recurrence plot features. A real case study of rolling element bearing fault detection demonstrates that the proposed fault detection method achieves a very good performance.Entropy2015-04-24175Article10.3390/e17052624262426411099-43002015-04-24doi: 10.3390/e17052624Cheng ZhouWeidong Zhang<![CDATA[Entropy, Vol. 17, Pages 2606-2623: Uncovering Discrete Non-Linear Dependence with Information Theory]]>
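The first two steps can be sketched on a scalar signal without embedding: build the recurrence plot, then extract two common RP features (recurrence rate and determinism). The paper's five features and its bearing data are not reproduced; the threshold and test signals below are arbitrary.

```python
import numpy as np

def recurrence_plot(x, eps):
    """Binary recurrence matrix R[i, j] = 1 iff |x_i - x_j| <= eps."""
    x = np.asarray(x, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent points."""
    return R.mean()

def determinism(R, lmin=2):
    """Share of off-diagonal recurrent points lying on diagonal
    lines of length >= lmin (a classic RP feature)."""
    n = R.shape[0]
    on_lines = 0
    for k in range(1, n):
        for diag in (np.diagonal(R, k), np.diagonal(R, -k)):
            run = 0
            for v in list(diag) + [0]:   # sentinel 0 flushes the last run
                if v:
                    run += 1
                else:
                    if run >= lmin:
                        on_lines += run
                    run = 0
    total = R.sum() - n   # exclude the trivially recurrent main diagonal
    return on_lines / total if total else 0.0

t = np.linspace(0, 8 * np.pi, 100)
R_per = recurrence_plot(np.sin(t), 0.1)                          # periodic signal
R_rnd = recurrence_plot(np.random.default_rng(0).normal(size=100), 0.1)  # noise
```

A periodic signal concentrates its recurrences on long diagonal lines, so its determinism exceeds that of noise; a damage-induced change in the signal shifts such features, which is what the control chart then monitors.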
http://mdpi.com/1099-4300/17/5/2606
In this paper, we model discrete time series as discrete Markov processes of arbitrary order and derive the approximate distribution of the Kullback-Leibler divergence between a known transition probability matrix and its sample estimate. We introduce two new information-theoretic measurements: information memory loss and information codependence structure. The former measures the memory content within a Markov process and determines its optimal order. The latter assesses the codependence among Markov processes. Both measurements are evaluated on toy examples and applied to high-frequency foreign exchange data, focusing on the 2008 financial crisis and the 2010/2011 Euro crisis.Entropy2015-04-23175Article10.3390/e17052606260626231099-43002015-04-23doi: 10.3390/e17052606Anton GolubGregor ChliamovitchAlexandre DupuisBastien Chopard<![CDATA[Entropy, Vol. 17, Pages 2590-2605: Entropy and Recurrence Measures of a Financial Dynamic System by an Interacting Voter System]]>
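The basic computation, the KL divergence rate between a known transition matrix and its sample estimate, can be sketched directly; the paper's contribution is the approximate *distribution* of this quantity, which is not derived here. The two-state chain below is a made-up example.

```python
import numpy as np

def transition_matrix(seq, n_states):
    """Maximum-likelihood estimate of a first-order transition matrix."""
    C = np.zeros((n_states, n_states))
    for a, b in zip(seq[:-1], seq[1:]):
        C[a, b] += 1
    C += 1e-12   # tiny smoothing so the toy sketch never divides by zero
    return C / C.sum(axis=1, keepdims=True)

def kl_transition(P, Q, pi):
    """KL divergence rate sum_i pi_i sum_j P_ij log(P_ij / Q_ij)."""
    return float(np.sum(pi[:, None] * P * np.log(P / Q)))

rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1], [0.2, 0.8]])   # the known transition matrix
pi = np.array([2 / 3, 1 / 3])            # its stationary distribution
seq = [0]
for _ in range(20000):
    seq.append(rng.choice(2, p=P[seq[-1]]))
Q = transition_matrix(seq, 2)
kl = kl_transition(P, Q, pi)   # small: the estimate converges to the truth
```

With enough observations the divergence shrinks toward zero; quantifying how it fluctuates around zero under the null is what makes the measure usable as a test statistic.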
http://mdpi.com/1099-4300/17/5/2590
A financial time series agent-based model is reproduced and investigated with a statistical physics system, the finite-range interacting voter system. The voter system originally describes the collective behavior of voters who constantly update their positions on a particular topic, which is a continuous-time Markov process. In the proposed model, the fluctuations of stock price changes are attributed to the market information interaction amongst the traders and certain similarities of investors’ behaviors. Further, the complexity of the return series of the financial model is studied in comparison with two real stock indexes, the Shanghai Stock Exchange Composite Index and the Hang Seng Index, by composite multiscale entropy analysis and recurrence analysis. The empirical research shows that the simulation data for the proposed model can capture some natural features of actual markets to some extent.Entropy2015-04-23175Article10.3390/e17052590259026051099-43002015-04-23doi: 10.3390/e17052590Hong-Li NiuJun Wang<![CDATA[Entropy, Vol. 17, Pages 2573-2589: Source Localization by Entropic Inference and Backward Renormalization Group Priors]]>
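The entropy side of the comparison can be illustrated with a bare-bones sample entropy plus coarse-graining, i.e. plain multiscale entropy; the composite variant used in the paper additionally averages over coarse-graining offsets, which is omitted here. The parameters m = 2 and r = 0.2·std follow common practice, and the series are synthetic, not market data.

```python
import numpy as np

def _matches(x, m, r):
    """Pairs of length-m templates within Chebyshev distance r."""
    t = np.array([x[i:i + m] for i in range(len(x) - m)])
    d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
    return (np.sum(d <= r) - len(t)) / 2   # drop self-matches, count pairs once

def sample_entropy(x, m=2, r=None):
    """SampEn = -ln(A/B), with A and B the match counts at lengths m+1 and m."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    B = _matches(x, m, r)
    A = _matches(x, m + 1, r)
    return float(-np.log(A / B)) if A > 0 and B > 0 else float('inf')

def coarse_grain(x, scale):
    """Non-overlapping window means: the multiscale-entropy coarse-graining."""
    x = np.asarray(x, dtype=float)
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(1)
noise = rng.normal(size=500)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))
```

A regular signal yields much lower sample entropy than noise; plotting the entropy across scales is what distinguishes the simulated return series from the real indexes in the study.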
http://mdpi.com/1099-4300/17/5/2573
A systematic method of transferring information from coarser to finer resolution based on renormalization group (RG) transformations is introduced. It permits building informative priors in finer scales from posteriors in coarser scales since, under some conditions, RG transformations in the space of hyperparameters can be inverted. These priors are updated using renormalized data into posteriors by Maximum Entropy. The resulting inference method, backward RG (BRG) priors, is tested by doing simulations of a functional magnetic resonance imaging (fMRI) experiment. Its results are compared with a Bayesian approach working in the finest available resolution. Using BRG priors, sources can be partially identified even when signal-to-noise ratio levels are as low as ~ -25 dB, improving vastly on the single-step Bayesian approach. For low levels of noise the BRG prior is not an improvement over the single-scale Bayesian method. Analysis of the histograms of hyperparameters can show how to distinguish whether the method is failing, due to very high levels of noise, or whether the identification of the sources is, at least partially, possible.Entropy2015-04-23175Article10.3390/e17052573257325891099-43002015-04-23doi: 10.3390/e17052573Nestor Caticha<![CDATA[Entropy, Vol. 17, Pages 2556-2572: Optimum Accelerated Degradation Tests for the Gamma Degradation Process Case under the Constraint of Total Cost]]>
http://mdpi.com/1099-4300/17/5/2556
An accelerated degradation test (ADT) is regarded as an effective alternative to an accelerated life test in the sense that an ADT can provide more accurate information on product reliability, even when few or no failures may be expected before the end of a practical test period. In this paper, statistical methods for optimally designing ADT plans are developed assuming that the degradation characteristic follows a gamma process (GP). The GP-based approach has the advantage that it can deal with more frequently encountered situations in which the degradation should always be nonnegative and strictly increasing over time. The optimal ADT plan is developed under the total experimental cost constraint by determining the optimal settings of variables such as the number of measurements, the measurement times, the test stress levels and the number of units allocated to each stress level such that the asymptotic variance of the maximum likelihood estimator of the q-th quantile of the lifetime distribution at the use condition is minimized. In addition, compromise plans are developed to provide means to check the adequacy of the assumed acceleration model. Finally, sensitivity analysis procedures for assessing the effects of the uncertainties in the pre-estimates of unknown parameters are illustrated with an example.Entropy2015-04-23175Article10.3390/e17052556255625721099-43002015-04-23doi: 10.3390/e17052556Heonsang Lim<![CDATA[Entropy, Vol. 17, Pages 2544-2555: Thermodynamic Analysis of Double-Stage Compression Transcritical CO2 Refrigeration Cycles with an Expander]]>
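A gamma degradation process of the kind assumed above is easy to simulate: increments over disjoint time intervals are independent Gamma variates, so every path is nonnegative and strictly increasing. A minimal sketch (parameter names are illustrative, not taken from the paper):

```python
import random

def simulate_gamma_degradation(shape_rate, scale, times, seed=None):
    # Gamma-process path: each increment over (prev_t, t] is drawn as
    # Gamma(shape_rate * (t - prev_t), scale), so the cumulative degradation
    # level is nonnegative and strictly increasing over time.
    rng = random.Random(seed)
    path, level, prev_t = [], 0.0, 0.0
    for t in times:
        level += rng.gammavariate(shape_rate * (t - prev_t), scale)
        path.append(level)
        prev_t = t
    return path
```

Repeating such simulations at different stress levels is one simple way to check a planned measurement schedule before committing test units to it.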
http://mdpi.com/1099-4300/17/4/2544
Four different double-compression CO2 transcritical refrigeration cycles are studied: the double-compression external intercooler cycle (DCEI), the double-compression external intercooler cycle with an expander (DCEIE), the double-compression flash intercooler cycle (DCFI) and the double-compression flash intercooler cycle with an expander (DCFIE). The results showed that the optimum gas cooler pressure and optimum intermediate pressure of the flash intercooler cycles are lower than those of the external intercooler cycle. The use of an expander in the DCEI cycle leads to a decrease of the optimum gas cooler pressure and little variation of the optimum intermediate pressure. However, the replacement of the throttle valve with an expander in the DCFI cycle results in little variation of the optimal gas cooler pressure and an increase of the optimum intermediate pressure. The DCFI cycle outperforms the DCEI cycle under all the chosen operating conditions. The DCEIE cycle outperforms the DCFIE cycle when the evaporating temperature exceeds 0 °C or the gas cooler outlet temperature surpasses 35 °C. When the gas cooler exit temperature varies from 32 °C to 48 °C, the DCEI, DCEIE, DCFI and DCFIE cycles yield average COP improvements of 4.6%, 29.2%, 12.9% and 22.3%, respectively, over the basic cycle.Entropy2015-04-22174Article10.3390/e17042544254425551099-43002015-04-22doi: 10.3390/e17042544Zhenying ZhangLirui TongXingguo Wang<![CDATA[Entropy, Vol. 17, Pages 2459-2543: Justifying Objective Bayesianism on Predicate Languages]]>
http://mdpi.com/1099-4300/17/4/2459
Objective Bayesianism says that the strengths of one’s beliefs ought to be probabilities, calibrated to physical probabilities insofar as one has evidence of them, and otherwise sufficiently equivocal. These norms of belief are often explicated using the maximum entropy principle. In this paper we investigate the extent to which one can provide a unified justification of the objective Bayesian norms in the case in which the background language is a first-order predicate language, with a view to applying the resulting formalism to inductive logic. We show that the maximum entropy principle can be motivated largely in terms of minimising worst-case expected loss.Entropy2015-04-22174Article10.3390/e17042459245925431099-43002015-04-22doi: 10.3390/e17042459Jürgen LandesJon Williamson<![CDATA[Entropy, Vol. 17, Pages 2432-2458: Information Geometry on Complexity and Stochastic Interaction]]>
http://mdpi.com/1099-4300/17/4/2432
Interdependencies of stochastically interacting units are usually quantified by the Kullback-Leibler divergence of a stationary joint probability distribution on the set of all configurations from the corresponding factorized distribution. This is a spatial approach which does not describe the intrinsically temporal aspects of interaction. In the present paper, the setting is extended to a dynamical version where temporal interdependencies are also captured by using information geometry of Markov chain manifolds.Entropy2015-04-21174Article10.3390/e17042432243224581099-43002015-04-21doi: 10.3390/e17042432Nihat Ay<![CDATA[Entropy, Vol. 17, Pages 2409-2431: Collaborative Performance Research on Multi-level Hospital Management Based on Synergy Entropy-HoQ]]>
http://mdpi.com/1099-4300/17/4/2409
Because research on the collaboration performance of multi-level hospital management is generally lacking, this paper proposes a multi-level hospital management Synergy Entropy-House of Quality (HoQ) Measurement Model that innovatively combines the House of Quality (HoQ) measurement model with a Synergy Entropy computing principle. Triangular fuzzy functions are used to determine the importance-degree parameter of each hospital management element; these parameters, combined with the results of the Synergy Entropy evaluation of the hospital management elements, yield a comprehensive collaborative computation result for the various elements, ensuring the objectivity of the results. Finally, the analysis of the collaborative research on multi-level hospital management demonstrates the scientific effectiveness of the proposed Synergy Entropy-House of Quality (HoQ) Measurement Model.Entropy2015-04-20174Article10.3390/e17042409240924311099-43002015-04-20doi: 10.3390/e17042409Lei ChenXuedong LiangTao Li<![CDATA[Entropy, Vol. 17, Pages 2367-2408: An Entropy-Based Network Anomaly Detection Method]]>
http://mdpi.com/1099-4300/17/4/2367
Data mining is an interdisciplinary subfield of computer science involving methods at the intersection of artificial intelligence, machine learning and statistics. One of the data mining tasks is anomaly detection, which is the analysis of large quantities of data to identify items, events or observations which do not conform to an expected pattern. Anomaly detection is applicable in a variety of domains, e.g., fraud detection, fault detection and system health monitoring, but this article focuses on the application of anomaly detection in the field of network intrusion detection. The main goal of the article is to prove that an entropy-based approach is suitable to detect modern botnet-like malware based on anomalous patterns in network traffic. This aim is achieved by realization of the following points: (i) preparation of a concept of an original entropy-based network anomaly detection method, (ii) implementation of the method, (iii) preparation of an original dataset, (iv) evaluation of the method.Entropy2015-04-20174Article10.3390/e17042367236724081099-43002015-04-20doi: 10.3390/e17042367Przemysław BerezińskiBartosz JasiulMarcin Szpyrka<![CDATA[Entropy, Vol. 17, Pages 2355-2366: A Criterion for Topological Close-Packed Phase Formation in High Entropy Alloys]]>
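The general entropy-based idea can be illustrated with a toy detector: compute the Shannon entropy of a traffic feature (e.g., destination ports) per time window and flag windows whose entropy deviates strongly from the baseline. This is only a sketch of the approach, not the paper's method; the threshold rule and feature choice are assumptions.

```python
import math
from collections import Counter

def shannon_entropy(items):
    # Empirical Shannon entropy (in bits) of a categorical feature sample.
    counts = Counter(items)
    n = len(items)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def entropy_anomalies(windows, k=2.0):
    # Flag windows whose entropy lies more than k standard deviations from
    # the mean window entropy (a simple stand-in for a trained baseline).
    ent = [shannon_entropy(w) for w in windows]
    mu = sum(ent) / len(ent)
    sd = math.sqrt(sum((e - mu) ** 2 for e in ent) / len(ent))
    return [i for i, e in enumerate(ent) if sd > 0 and abs(e - mu) > k * sd]
```

A botnet-like scan or flood tends to collapse the port distribution of a window, so its entropy drops sharply relative to ordinary mixed traffic, which is exactly what such a detector picks up.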
http://mdpi.com/1099-4300/17/4/2355
The stability of topological close-packed (TCP) phases was found to be well correlated with the average value of the d-orbital energy level \( \overline{Md} \) for most reported high entropy alloys (HEAs). Excluding some HEAs that contain high levels of the elements aluminum and vanadium, the results of this study indicate that TCP phases form at \( \overline{Md} \) > 1.09. This criterion, as a semi-empirical method, can play a key role in designing and preparing HEAs with high amounts of transition elements.Entropy2015-04-20174Article10.3390/e17042355235523661099-43002015-04-20doi: 10.3390/e17042355Yiping LuYong DongLi JiangTongmin WangTingju LiYong Zhang<![CDATA[Entropy, Vol. 17, Pages 2341-2354: Multi-State Quantum Dissipative Dynamics in Sub-Ohmic Environment: The Strong Coupling Regime]]>
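Since \( \overline{Md} \) is a composition-weighted average, checking the criterion for a candidate alloy is a one-liner. The sketch below assumes atomic fractions and per-element Md values are supplied by the user; the element values used in the example are illustrative placeholders, not tabulated Md data.

```python
def md_bar(composition, md_values):
    # \overline{Md}: composition-weighted average of each element's d-orbital
    # energy level Md (atomic fractions in `composition` should sum to 1).
    return sum(frac * md_values[el] for el, frac in composition.items())

def tcp_prone(composition, md_values, threshold=1.09):
    # Semi-empirical criterion from the abstract: TCP phases tend to form
    # when the average Md exceeds ~1.09.
    return md_bar(composition, md_values) > threshold
```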
http://mdpi.com/1099-4300/17/4/2341
We study the dissipative quantum dynamics and the asymptotic behavior of a particle in a bistable potential interacting with a sub-Ohmic broadband environment. The reduced dynamics, in the intermediate to strong dissipation regime, is obtained beyond the two-level system approximation by using a real-time path integral approach. We find a crossover dynamic regime with damped intra-well oscillations and incoherent tunneling and a completely incoherent regime at strong damping. Moreover, a nonmonotonic behavior of the left/right well population difference is found as a function of the damping strength.Entropy2015-04-17174Article10.3390/e17042341234123541099-43002015-04-17doi: 10.3390/e17042341Luca MagazzùDavide ValentiAngelo CarolloBernardo Spagnolo<![CDATA[Entropy, Vol. 17, Pages 2328-2340: Exergy Analysis of a Ground-Coupled Heat Pump Heating System with Different Terminals]]>
http://mdpi.com/1099-4300/17/4/2328
In order to evaluate and improve the performance of a ground-coupled heat pump (GCHP) heating system with radiant floors as terminals, an exergy analysis based on test results is performed in this study. The system is divided into four subsystems, and the exergy loss and exergy efficiency of each subsystem are calculated using the expressions derived based on exergy balance equations. The average values of the measured parameters are used for the exergy analysis. The analysis results show that the two largest exergy losses occur in the heat pump and terminals, with losses of 55.3% and 22.06%, respectively, and the lowest exergy efficiency occurs in the ground heat exchange system. Therefore, GCHP system designers should pay close attention to the selection of heat pumps and terminals, especially in the design of ground heat exchange systems. Compared with the scenario system in which fan coil units (FCUs) are substituted for the radiant floors, the adoption of radiant floors can result in a decrease of 12% in heating load, an increase of 3.24% in exergy efficiency of terminals and an increase of 1.18% in total exergy efficiency of the system. The results point out directions and ways of optimizing GCHP systems.Entropy2015-04-17174Article10.3390/e17042328232823401099-43002015-04-17doi: 10.3390/e17042328Xiao ChenXiaoli Hao<![CDATA[Entropy, Vol. 17, Pages 2304-2327: Information-Theoretic Inference of Common Ancestors]]>
http://mdpi.com/1099-4300/17/4/2304
A directed acyclic graph (DAG) partially represents the conditional independence structure among observations of a system if the local Markov condition holds, that is if every variable is independent of its non-descendants given its parents. In general, there is a whole class of DAGs that represents a given set of conditional independence relations. We are interested in properties of this class that can be derived from observations of a subsystem only. To this end, we prove an information-theoretic inequality that allows for the inference of common ancestors of observed parts in any DAG representing some unknown larger system. More explicitly, we show that a large amount of dependence in terms of mutual information among the observations implies the existence of a common ancestor that distributes this information. Within the causal interpretation of DAGs, our result can be seen as a quantitative extension of Reichenbach’s principle of common cause to more than two variables. Our conclusions are valid also for non-probabilistic observations, such as binary strings, since we state the proof for an axiomatized notion of “mutual information” that includes the stochastic as well as the algorithmic version.Entropy2015-04-16174Article10.3390/e17042304230423271099-43002015-04-16doi: 10.3390/e17042304Bastian SteudelNihat Ay<![CDATA[Entropy, Vol. 17, Pages 2281-2303: Some Comments on the Entropy-Based Criteria for Piping]]>
http://mdpi.com/1099-4300/17/4/2281
This paper is an extension of previous work which characterises soil behaviours using the grading entropy diagram. The present work looks at the piping process in granular soils, by considering some new data from flood-protection dikes. The piping process is divided into three parts here: particle movement at the micro scale to segregate free water; sand boil development (the initiation of the pipe); and pipe growth. In the first part of the process, which occurs during the rising flood, the increase in shear stress along the dike base may cause segregation of water into micro pipes if the subsoil in the dike base is relatively loose. This occurs in the zone of maximum dike base shear stress level (the ratio of shear stress to strength), which is close to the toe. In the second part of the process, the shear strain increment causes a sudden, asymmetric slide and cracking of the dike leading to the localized excess pore pressure, liquefaction and the formation of a sand boil. In the third part of the process, the soil erosion initiated through the sand boil continues, and the pipe grows. The piping in the Hungarian dikes often occurs in a two-layer system, where the base layer is coarser with higher permeability and the cover layer is finer with lower permeability. The new data presented here show that the soils ejected from the sand boils are generally silty sands and sands, which are prone to both erosion (on the basis of the entropy criterion) and liquefaction. They originate from the cover layer, which is basically identical to the soil used in the Dutch backward erosion experiments.Entropy2015-04-15174Article10.3390/e17042281228123031099-43002015-04-15doi: 10.3390/e17042281Emöke ImreLaszlo NagyJanos LőrinczNegar RahemiTom SchanzVijay SinghStephen Fityus<![CDATA[Entropy, Vol. 17, Pages 2253-2280: Integrating Entropy and Copula Theories for Hydrologic Modeling and Analysis]]>
http://mdpi.com/1099-4300/17/4/2253
Entropy is a measure of uncertainty and has been commonly used for various applications, including probability inferences in hydrology. Copula has been widely used for constructing joint distributions to model the dependence structure of multivariate hydrological random variables. Integration of entropy and copula theories provides new insights into hydrologic modeling and analysis, whose development and application are still in their infancy. Two broad branches of integration of the two concepts, entropy copula and copula entropy, are introduced in this study. On the one hand, the entropy theory can be used to derive new families of copulas based on information content matching. On the other hand, the copula entropy provides attractive alternatives in the nonlinear dependence measurement even in higher dimensions. We introduce in this study the integration of entropy and copula theories in the dependence modeling and analysis to illustrate the potential applications in hydrology and water resources.Entropy2015-04-15174Review10.3390/e17042253225322801099-43002015-04-15doi: 10.3390/e17042253Zengchao HaoVijay Singh<![CDATA[Entropy, Vol. 17, Pages 2228-2252: A Community-Based Approach to Identifying Influential Spreaders]]>
http://mdpi.com/1099-4300/17/4/2228
Identifying influential spreaders in complex networks has a significant impact on the understanding and control of spreading processes in networks. In this paper, we introduce a new centrality index to identify influential spreaders in a network based on the community structure of the network. The community-based centrality (CbC) considers both the number and sizes of communities that are directly linked by a node. We discuss correlations between CbC and other classical centrality indices. Based on simulations of the single source of infection with the Susceptible-Infected-Recovered (SIR) model, we find that CbC can help to identify some critical influential nodes that other indices cannot find. We also investigate the stability of CbC.Entropy2015-04-14174Article10.3390/e17042228222822521099-43002015-04-14doi: 10.3390/e17042228Zhiying ZhaoXiaofan WangWei ZhangZhiliang Zhu<![CDATA[Entropy, Vol. 17, Pages 2218-2227: Cryptographic Aspects of Quantum Reading]]>
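One plausible reading of a centrality built from "the number and sizes of communities that are directly linked by a node" sums the sizes of the distinct communities among a node's neighbors. This is a hedged sketch only; the paper's exact weighting is not reproduced here.

```python
from collections import Counter

def community_based_centrality(adj, community):
    # Sketch of CbC: for each node, collect the distinct communities of its
    # neighbors and sum their sizes. Linking many large communities therefore
    # raises the score, combining "number" and "sizes" in one number.
    sizes = Counter(community.values())
    return {
        node: sum(sizes[c] for c in {community[v] for v in nbrs})
        for node, nbrs in adj.items()
    }
```

A bridge node touching several communities scores higher than a node of the same degree buried inside a single community, which is the behavior the abstract attributes to CbC.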
http://mdpi.com/1099-4300/17/4/2218
Besides achieving secure communication between two spatially-separated parties, another important issue in modern cryptography is related to secure communication in time, i.e., the possibility to confidentially store information on a memory for later retrieval. Here we explore this possibility in the setting of quantum reading, which exploits quantum entanglement to efficiently read data from a memory whereas classical strategies (e.g., based on coherent states or their mixtures) cannot retrieve any information. From this point of view, the technique of quantum reading can provide a new form of technological security for data storage.Entropy2015-04-13174Article10.3390/e17042218221822271099-43002015-04-13doi: 10.3390/e17042218Gaetana Spedalieri<![CDATA[Entropy, Vol. 17, Pages 2198-2217: Entropic-Skins Geometry to Describe Wall Turbulence Intermittency]]>
http://mdpi.com/1099-4300/17/4/2198
In order to describe the phenomenon of intermittency in wall turbulence and, more particularly, the behaviour of moments and intermittency exponents ζp with the order p and the distance to the wall, we developed a new geometrical framework called “entropic-skins geometry”, based on the notion of scale-entropy, which is here applied to an experimental database of boundary layer flows. Each moment has its own spatial multi-scale support Ωp (“skin”). The model assumes the existence of a hierarchy of multi-scale sets Ωp ranging from the “bulk” to the “crest”. The crest characterizes the geometrical support where the most intermittent (the highest) fluctuations in energy dissipation occur; the bulk is the geometrical support for the whole range of fluctuations. The model then assumes the existence of a dynamical flux through the hierarchy of skins. The specific case where skins display a fractal structure is investigated. The bulk fractal dimension and the crest dimension are linked by a scale-entropy flux defining a reversibility efficiency (d is the embedding dimension). The model, initially developed for homogeneous and isotropic turbulent flows, is applied here to wall-bounded turbulence, where intermittency exponents are measured by extended self-similarity. We obtained an analytical expression for the intermittency exponents, with γ ≈ 0.36, in agreement with experimental results.Entropy2015-04-13174Article10.3390/e17042198219822171099-43002015-04-13doi: 10.3390/e17042198Diogo Queiros-CondeJohan CarlierLavinia GrosuMichel Stanislas<![CDATA[Entropy, Vol. 17, Pages 2184-2197: Implications of Non-Differentiable Entropy on a Space-Time Manifold]]>
http://mdpi.com/1099-4300/17/4/2184
Assuming that the motions of a complex system structural units take place on continuous, but non-differentiable curves of a space-time manifold, the scale relativity model with arbitrary constant fractal dimension (the hydrodynamic and wave function versions) is built. For non-differentiability through stochastic processes of the Markov type, the non-differentiable entropy concept on a space-time manifold in the hydrodynamic version and its correspondence with motion variables (energy, momentum, etc.) are established. Moreover, for the same non-differentiability type, through a scale resolution dependence of a fundamental length and wave function independence with respect to the proper time, a non-differentiable Klein–Gordon-type equation in the wave function version is obtained. For a phase-amplitude functional dependence on the wave function, the non-differentiable spontaneous symmetry breaking mechanism implies pattern generation in the form of Cooper non-differentiable-type pairs, while its non-differentiable topology implies some fractal logic elements (fractal bit, fractal gates, etc.).Entropy2015-04-13174Article10.3390/e17042184218421971099-43002015-04-13doi: 10.3390/e17042184Maricel AgopAlina GavriluţGavril ŞtefanBogdan Doroftei<![CDATA[Entropy, Vol. 17, Pages 2170-2183: High-Speed Spindle Fault Diagnosis with the Empirical Mode Decomposition and Multiscale Entropy Method]]>
http://mdpi.com/1099-4300/17/4/2170
The root mean square (RMS) value of a vibration signal is an important indicator used to represent the amplitude of vibrations in evaluating the quality of high-speed spindles. However, RMS is unable to detect a number of common fault characteristics that occur prior to bearing failure. Extending the operational life and quality of spindles requires reliable fault diagnosis techniques for the analysis of vibration signals from three axes. This study used empirical mode decomposition to decompose signals into intrinsic mode functions containing a zero-crossing rate and energy to represent the characteristics of rotating elements. The multiscale entropy (MSE) curve was then used to identify a number of characteristic defects. The purpose of this research was to obtain vibration signals along three axes with the aim of extending the operational life of devices included in the product line of an actual spindle manufacturing company.Entropy2015-04-13174Article10.3390/e17042170217021831099-43002015-04-13doi: 10.3390/e17042170Nan-Kai HsiehWei-Yen LinHong-Tsu Young<![CDATA[Entropy, Vol. 17, Pages 2140-2169: Deep Belief Network-Based Approaches for Link Prediction in Signed Social Networks]]>
http://mdpi.com/1099-4300/17/4/2140
In some online social network services (SNSs), members are allowed to label their relationships with others, and such relationships can be represented as links with signed values (positive or negative). The networks containing such relations are named signed social networks (SSNs), and some real-world complex systems can also be modeled with SSNs. Given the observed structure of an SSN, link prediction aims to estimate the values of the unobserved links. Most previous approaches to link prediction are based on member similarity and supervised learning; however, research investigating the hidden principles that drive the behaviors of social members has rarely been conducted. In this paper, deep belief network (DBN)-based approaches for link prediction are proposed, including an unsupervised link prediction model, a feature representation method and a DBN-based link prediction method. The experiments are done on datasets from three SNSs in different domains, and the results show that our methods can predict the values of the links with high performance and have good generalization ability across these datasets.Entropy2015-04-10174Article10.3390/e17042140214021691099-43002015-04-10doi: 10.3390/e17042140Feng LiuBingquan LiuChengjie SunMing LiuXiaolong Wang<![CDATA[Entropy, Vol. 17, Pages 2117-2139: Image Encryption Using Chebyshev Map and Rotation Equation]]>
http://mdpi.com/1099-4300/17/4/2117
We propose a novel image encryption algorithm based on two pseudorandom bit generators: Chebyshev map based and rotation equation based. The first is used for permutation, and the second one for substitution operations. Detailed security analysis has been provided on the novel image encryption algorithm using visual testing, key space evaluation, histogram analysis, information entropy calculation, correlation coefficient analysis, differential analysis, key sensitivity test, and computational and complexity analysis. Based on the theoretical and empirical results the novel image encryption scheme demonstrates an excellent level of security.Entropy2015-04-09174Article10.3390/e17042117211721391099-43002015-04-09doi: 10.3390/e17042117Borislav StoyanovKrasimir Kordov<![CDATA[Entropy, Vol. 17, Pages 2094-2116: Research and Measurement of Software Complexity Based on Wuli, Shili, Renli (WSR) and Information Entropy]]>
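A Chebyshev-map pseudorandom bit generator of the kind used for the permutation stage can be sketched as follows. Thresholding the iterate at zero is a common, simple bit-extraction rule and may differ from the generator actually used in the paper.

```python
import math

def chebyshev_bits(x0, k, n):
    # Iterate the Chebyshev map x_{n+1} = cos(k * arccos(x_n)) on [-1, 1]
    # (chaotic for k >= 2) and emit one bit per iterate by thresholding at 0.
    x, bits = x0, []
    for _ in range(n):
        x = math.cos(k * math.acos(x))
        bits.append(1 if x > 0 else 0)
    return bits
```

The sensitivity of the orbit to `x0` and `k` is what makes such a keystream usable for permutation: a tiny change in the key parameters yields an entirely different bit sequence.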
http://mdpi.com/1099-4300/17/4/2094
Complexity is an important factor throughout the software life cycle. As complexity increases, it becomes increasingly difficult to guarantee software quality, cost and development progress. Excessive complexity is one of the main reasons for the failure of software projects, so effective recognition, measurement and control of complexity become the key to project management. First, this paper systematically analyzes the current state of research on software complexity and points out problems in existing work. Then, it proposes a WSR framework of software complexity, which divides software complexity into the three levels of Wuli (WL), Shili (SL) and Renli (RL), so that staff in different roles may better understand complexity. People are the main source of complexity, but current research focuses on WL complexity, and research on RL complexity is extremely scarce, so this paper emphasizes the RL complexity of software projects. It not only analyzes the factors that compose RL complexity, but also provides a definition of RL complexity. Moreover, it puts forward a quantitative measurement method, based on information entropy, for the complexity of the personnel organization hierarchy and the complexity of personnel communication information, and it analyzes and validates the scientific soundness and rationality of this method through a large number of cases.Entropy2015-04-08174Article10.3390/e17042094209421161099-43002015-04-08doi: 10.3390/e17042094Rong Jiang<![CDATA[Entropy, Vol. 17, Pages 2082-2093: Target Detection and Ranging through Lossy Media using Chaotic Radar]]>
http://mdpi.com/1099-4300/17/4/2082
A chaotic radar system has been developed for through-wall detection and ranging of targets. The chaotic signal generated by an improved Colpitts oscillator is designed as a probe signal. Ranging to the target is achieved by cross-correlating the time-delayed reflected return signal with a replica of the transmitted chaotic signal. In this paper, we explore the performance of the chaotic radar system for target detection and ranging through lossy media. Experimental results show that the designed chaotic radar has the advantages of high range resolution and an unambiguous correlation profile, and can be used for through-wall target detection and sensing.Entropy2015-04-08174Article10.3390/e17042082208220931099-43002015-04-08doi: 10.3390/e17042082Bingjie WangHang XuPeng YangLi LiuJingxia Li<![CDATA[Entropy, Vol. 17, Pages 2062-2081: Kappa and q Indices: Dependence on the Degrees of Freedom]]>
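The cross-correlation ranging step can be illustrated with a brute-force lag search over a sampled signal (a sketch, not the authors' implementation; the probe in the example below is a logistic-map stand-in for the Colpitts chaos):

```python
def estimate_delay(tx, rx):
    # Slide the transmitted replica over the received signal and return the
    # lag with maximum cross-correlation; a chaotic probe has a sharp,
    # thumbtack-like autocorrelation, so the peak is unambiguous.
    best_lag, best_val = 0, float("-inf")
    for lag in range(len(rx) - len(tx) + 1):
        v = sum(t * r for t, r in zip(tx, rx[lag:lag + len(tx)]))
        if v > best_val:
            best_lag, best_val = lag, v
    return best_lag
```

Multiplying the recovered lag by the sample period and the propagation speed (halved for the round trip) then converts the peak position into range.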
http://mdpi.com/1099-4300/17/4/2062
The kappa distributions, or their equivalent, the q-exponential distributions, are the natural generalization of the classical Boltzmann-Maxwell distributions, applied to the study of the particle populations in collisionless space plasmas. A huge step in the development of the theory of kappa distributions and their applications in space plasma physics has been achieved with the discovery that the observed kappa distributions are connected with the solid statistical background of non-extensive statistical mechanics. Now that the statistical framework has been identified, it is straightforward to improve our understanding of the nature of the kappa index (or the entropic q-index) that governs these distributions. One critical topic is the dependence of the kappa index on the degrees of freedom. In this paper, we first show how this specific dependence is naturally emerged, using the formalism of the N-particle kappa distribution of velocities. Then, the result is extended in the presence of potential energies. It is shown that the kappa index is simply related to the kinetic and potential degrees of freedom. In addition, it is shown that various problems of non-extensive statistical mechanics, such as (i) the correlation dependence on the total number of particles; and (ii) the normalization divergence for finite kappa indices, are resolved considering the kappa index dependence on the degrees of freedom.Entropy2015-04-08174Article10.3390/e17042062206220811099-43002015-04-08doi: 10.3390/e17042062George Livadiotis<![CDATA[Entropy, Vol. 17, Pages 2039-2061: Experimental and Thermoeconomic Analysis of Small-Scale Solar Organic Rankine Cycle (SORC) System]]>
http://mdpi.com/1099-4300/17/4/2039
A small-scale solar organic Rankine cycle (ORC) is a promising renewable energy-driven power generation technology that can be used in the rural areas of developing countries. A prototype was developed and tested for its performance characteristics under a range of solar source temperatures. The solar ORC system power output was calculated based on the thermal and solar collector efficiency. The maximum solar power output was observed in April. The solar ORC unit power output ranged from 0.4 kW to 1.38 kW during the year. The highest power output was obtained when the expander inlet pressure was 13 bar and the solar source temperature was 120 °C. The area of the collector for the investigation was calculated based on the meteorological conditions of Busan City (South Korea). In the second part, economic and thermoeconomic analyses were carried out to determine the cost of energy per kWh from the solar ORC. The selling price of electricity generation was found to be $0.68/kWh and $0.39/kWh for the prototype and low-cost solar ORC, respectively. A sensitivity analysis was carried out in order to find the influential economic parameters for the change in net present value (NPV). Finally, the sustainability index was calculated to assess the sustainable development of the solar ORC system.Entropy2015-04-07174Article10.3390/e17042039203920611099-43002015-04-07doi: 10.3390/e17042039Suresh BaralDokyun KimEunkoo YunKyung Kim<![CDATA[Entropy, Vol. 17, Pages 2025-2038: A Method to Derive the Definition of Generalized Entropy from Generalized Exergy for Any State in Many-Particle Systems]]>
http://mdpi.com/1099-4300/17/4/2025
The literature reports proofs that entropy is an inherent property of any system in any state and governs thermal energy, which depends on temperature and is transferred by heat interactions. A first novelty proposed in the present study is that mechanical energy, determined by pressure and transferred by work interactions, is also characterized by the entropy property. The second novelty is that a generalized definition of entropy relating to temperature, chemical potential and pressure of many-particle systems, is established to calculate the thermal, chemical and mechanical entropy contribution due to heat, mass and work interactions. The expression of generalized entropy is derived from generalized exergy, which in turn depends on temperature, chemical potential and pressure of the system, and from the entropy-exergy relationship constituting the basis of the method adopted to analyze the available energy and its transfer interactions with a reference system which may be external or constitute a subsystem. This method is underpinned by the Second Law statement enunciated in terms of existence and uniqueness of stable equilibrium for each value of energy content of the system. The equality of chemical potential and equality of pressure are assumed, in addition to equality of temperature, to be necessary conditions for stable equilibrium.Entropy2015-04-07174Article10.3390/e17042025202520381099-43002015-04-07doi: 10.3390/e17042025Pierfrancesco Palazzo<![CDATA[Entropy, Vol. 17, Pages 2010-2024: Resource Requirements and Speed versus Geometry of Unconditionally Secure Physical Key Exchanges]]>
http://mdpi.com/1099-4300/17/4/2010
The imperative need for unconditionally secure key exchange is expounded by the increasing connectivity of networks and by the increasing number and sophistication of cyberattacks. Two concepts that are information-theoretically secure are quantum key distribution (QKD) and Kirchhoff-Law-Johnson-Noise (KLJN). However, these concepts require a dedicated connection between hosts in peer-to-peer (P2P) networks, which can be impractical and/or cost-prohibitive. A practical and cost-effective method is to have each host share their respective cable(s) with other hosts such that two remote hosts can realize a secure key exchange without the need for an additional cable or key exchanger. In this article we analyze the cost complexities of cables, key exchangers and the time required in the star network. We discuss the reliability of the star network and compare it with other network geometries. We also conceive a protocol and an equation for the number of secure bit exchange periods needed in a star network. We then outline other network geometries and trade-off possibilities that seem interesting to explore.Entropy2015-04-03174Article10.3390/e17042010201020241099-43002015-04-03doi: 10.3390/e17042010Elias GonzalezRobert BalogLaszlo Kish<![CDATA[Entropy, Vol. 17, Pages 1971-2009: Translation of Ludwig Boltzmann’s Paper “On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium” Sitzungberichte der Kaiserlichen Akademie der Wissenschaften. Mathematisch-Naturwissen Classe. Abt. II, LXXVI 1877, pp 373-435 (Wien. Ber. 1877, 76:373-435). Reprinted in Wiss. Abhandlungen, Vol. II, reprint 42, p. 164-223, Barth, Leipzig, 1909]]>
http://mdpi.com/1099-4300/17/4/1971
Translation of the seminal 1877 paper by Ludwig Boltzmann which for the first time established the probabilistic basis of entropy. Includes a scientific commentary.Entropy2015-04-02174Article10.3390/e17041971197120091099-43002015-04-02doi: 10.3390/e17041971Kim SharpFranz Matschinsky<![CDATA[Entropy, Vol. 17, Pages 1958-1970: Assessing Coupling Dynamics from an Ensemble of Time Series]]>
http://mdpi.com/1099-4300/17/4/1958
Finding interdependency relations between time series provides valuable knowledge about the processes that generated the signals. Information theory sets a natural framework for important classes of statistical dependencies. However, a reliable estimation from information-theoretic functionals is hampered when the dependency to be assessed is brief or evolves in time. Here, we show that these limitations can be partly alleviated when we have access to an ensemble of independent repetitions of the time series. In particular, we gear a data-efficient estimator of probability densities to make use of the full structure of trial-based measures. By doing so, we can obtain time-resolved estimates for a family of entropy combinations (including mutual information, transfer entropy and their conditional counterparts), which are more accurate than the simple average of individual estimates over trials. We show with simulated and real data generated by coupled electronic circuits that the proposed approach allows one to recover the time-resolved dynamics of the coupling between different subsystems.Entropy2015-04-02174Article10.3390/e17041958195819701099-43002015-04-02doi: 10.3390/e17041958Germán Gómez-HerreroWei WuKalle RutanenMiguel SorianoGordon PipaRaul Vicente<![CDATA[Entropy, Vol. 17, Pages 1946-1957: A Simple Decoder for Topological Codes]]>
http://mdpi.com/1099-4300/17/4/1946
Here we study an efficient algorithm for decoding topological codes. It is a simple form of HDRG decoder, which could be straightforwardly generalized to complex decoding problems. Specific results are obtained for the planar code with both i.i.d. and spatially correlated errors. The method is shown to compare well with existing ones, despite its simplicity.Entropy2015-04-01174Article10.3390/e17041946194619571099-43002015-04-01doi: 10.3390/e17041946James Wootton<![CDATA[Entropy, Vol. 17, Pages 1936-1945: On Nonlinear Complexity and Shannon’s Entropy of Finite Length Random Sequences]]>
http://mdpi.com/1099-4300/17/4/1936
Pseudorandom binary sequences have important uses in many fields, such as spread spectrum communications, statistical sampling and cryptography. There are two kinds of methods for evaluating the properties of sequences: one is based on probability measures, and the other on deterministic complexity measures. However, the relationship between these two methods still remains an interesting open problem. In this paper, we focus on the widely used nonlinear complexity of random sequences and study its distribution, expectation and variance for memoryless sources. Furthermore, the relationship between nonlinear complexity and Shannon’s entropy is also established here. The results show that Shannon’s entropy decreases strictly monotonically with nonlinear complexity.Entropy2015-04-01174Article10.3390/e17041936193619451099-43002015-04-01doi: 10.3390/e17041936Lingfeng LiuSuoxia MiaoBocheng Liu<![CDATA[Entropy, Vol. 17, Pages 1916-1935: Pressure Tensor of Nanoscopic Liquid Drops]]>
http://mdpi.com/1099-4300/17/4/1916
This study describes the structure of an inhomogeneous fluid of one or several components that forms a spherical interface. Using the stress tensor of Percus–Romero, which depends on the one-particle density and the intermolecular potential, the study provides an analytical development leading to the microscopic expressions of the pressure differences and the interfacial properties of both systems. The results are compared with a previous study and agree with the mean-field description.Entropy2015-04-01174Article10.3390/e17041916191619351099-43002015-04-01doi: 10.3390/e17041916José G. Segovia-LópezAdrian Carbajal-Domínguez<![CDATA[Entropy, Vol. 17, Pages 1896-1915: Kinetic Theory Modeling and Efficient Numerical Simulation of Gene Regulatory Networks Based on Qualitative Descriptions]]>
http://mdpi.com/1099-4300/17/4/1896
In this work, we begin by considering the qualitative modeling of biological regulatory systems using process hitting, from which we define its probabilistic counterpart by considering the chemical master equation within a kinetic theory framework. The last equation is efficiently solved by considering a separated representation within the proper generalized decomposition framework that allows circumventing the so-called curse of dimensionality. Finally, model parameters can be added as extra-coordinates in order to obtain a parametric solution of the model.Entropy2015-04-01174Article10.3390/e17041896189619151099-43002015-04-01doi: 10.3390/e17041896Francisco ChinestaMorgan MagninOlivier RouxAmine AmmarElias Cueto<![CDATA[Entropy, Vol. 17, Pages 1882-1895: Statistical Correlations of the N-particle Moshinsky Model]]>
http://mdpi.com/1099-4300/17/4/1882
We study the correlation of the ground state of an N-particle Moshinsky model by computing the Shannon entropy in both position and momentum spaces. We derive analytical expressions for the Shannon entropy and mutual information of the N-particle Moshinsky model, which allows us to test the entropic uncertainty principle. The Shannon entropy in position space decreases as interaction strength increases. However, the Shannon entropy in momentum space has the opposite trend. The Shannon entropy of the whole system satisfies the equality of the entropic uncertainty principle. Our results also indicate that, independent of the sizes of the two subsystems, the mutual information increases monotonically as the interaction strength increases.Entropy2015-03-31174Article10.3390/e17041882188218951099-43002015-03-31doi: 10.3390/e17041882Hsuan PengYew Ho<![CDATA[Entropy, Vol. 17, Pages 1850-1881: Computing Bi-Invariant Pseudo-Metrics on Lie Groups for Consistent Statistics]]>
http://mdpi.com/1099-4300/17/4/1850
In computational anatomy, organ shapes are often modeled as deformations of a reference shape, i.e., as elements of a Lie group. To analyze the variability of the human anatomy in this framework, we need to perform statistics on Lie groups. A Lie group is a manifold with a consistent group structure. Statistics on Riemannian manifolds have been well studied, but to use the statistical Riemannian framework on Lie groups, one needs to define a Riemannian metric compatible with the group structure: a bi-invariant metric. However, it is known that Lie groups that are not a direct product of compact and abelian groups have no bi-invariant metric. What about bi-invariant pseudo-metrics? In other words: could we remove the assumption of the positivity of the metric and obtain consistent statistics on Lie groups through the pseudo-Riemannian framework? Our contribution is two-fold. First, we present an algorithm that constructs bi-invariant pseudo-metrics on a given Lie group, in the case of existence. Then, by running the algorithm on commonly-used Lie groups, we show that most of them do not admit any bi-invariant (pseudo-) metric. We thus conclude that the (pseudo-) Riemannian setting is too limited for the definition of consistent statistics on general Lie groups.Entropy2015-03-31174Article10.3390/e17041850185018811099-43002015-03-31doi: 10.3390/e17041850Nina MiolaneXavier Pennec<![CDATA[Entropy, Vol. 17, Pages 1814-1849: Geometry of Fisher Information Metric and the Barycenter Map]]>
http://mdpi.com/1099-4300/17/4/1814
The geometry of the Fisher metric and geodesics on a space of probability measures defined on a compact manifold is discussed and applied to the geometry of a barycenter map associated with the Busemann function on an Hadamard manifold \(X\). We obtain an explicit formula for geodesics and then several theorems on geodesics, one of which asserts that any two probability measures can be joined by a unique geodesic. Using the Fisher metric and the properties of geodesics thus obtained, a fibre space structure of the barycenter map and geodesical properties of each fibre are discussed. Moreover, an isometry problem on an Hadamard manifold \(X\) and its ideal boundary \(\partial X\)—for a given homeomorphism \(\Phi\) of \(\partial X\) find an isometry of \(X\) whose \(\partial X\)-extension coincides with \(\Phi\)—is investigated in terms of the barycenter map.Entropy2015-03-30174Article10.3390/e17041814181418491099-43002015-03-30doi: 10.3390/e17041814Mitsuhiro ItohHiroyasu Satoh<![CDATA[Entropy, Vol. 17, Pages 1795-1813: Preclinical Diagnosis of Magnetic Resonance (MR) Brain Images via Discrete Wavelet Packet Transform with Tsallis Entropy and Generalized Eigenvalue Proximal Support Vector Machine (GEPSVM)]]>
http://mdpi.com/1099-4300/17/4/1795
Background: Developing an accurate computer-aided diagnosis (CAD) system of MR brain images is essential for medical interpretation and analysis. In this study, we propose a novel automatic CAD system to distinguish abnormal brains from normal brains in MRI scanning. Methods: The proposed method simplifies the task to a binary classification problem. We used discrete wavelet packet transform (DWPT) to extract wavelet packet coefficients from MR brain images. Next, Shannon entropy (SE) and Tsallis entropy (TE) were harnessed to obtain entropy features from DWPT coefficients. Finally, generalized eigenvalue proximal support vector machine (GEPSVM), and GEPSVM with radial basis function (RBF) kernel, were employed as classifiers. We tested the four proposed diagnosis methods (DWPT + SE + GEPSVM, DWPT + TE + GEPSVM, DWPT + SE + GEPSVM + RBF, and DWPT + TE + GEPSVM + RBF) on three benchmark datasets: Dataset-66, Dataset-160, and Dataset-255. Results: The results of 10 repetitions of K-fold stratified cross-validation showed that the proposed DWPT + TE + GEPSVM + RBF method outperformed not only the other three proposed classifiers but also existing state-of-the-art methods in terms of classification accuracy. In addition, the DWPT + TE + GEPSVM + RBF method achieved accuracy of 100%, 100%, and 99.53% on Dataset-66, Dataset-160, and Dataset-255, respectively. For Dataset-255, offline learning took 8.4430 s and online prediction merely 0.1059 s. Conclusions: We have demonstrated the effectiveness of the proposed method, which achieved nearly 100% accuracy over three benchmark datasets.Entropy2015-03-30174Article10.3390/e17041795179518131099-43002015-03-30doi: 10.3390/e17041795Yudong ZhangZhengchao DongShuihua WangGenlin JiJiquan Yang<![CDATA[Entropy, Vol. 17, Pages 1775-1794: Multidimensional Scaling Visualization Using Parametric Similarity Indices]]>
http://mdpi.com/1099-4300/17/4/1775
In this paper, we apply multidimensional scaling (MDS) and parametric similarity indices (PSI) in the analysis of complex systems (CS). Each CS is viewed as a dynamical system, exhibiting an output time-series to be interpreted as a manifestation of its behavior. We start by adopting a sliding window to sample the original data into several consecutive time periods. Second, we define a given PSI for tracking pieces of data. We then compare the windows for different values of the parameter, and we generate the corresponding MDS maps of ‘points’. Third, we use Procrustes analysis to linearly transform the MDS charts for maximum superposition and to build a global MDS map of “shapes”. This final plot captures the time evolution of the phenomena and is sensitive to the PSI adopted. The generalized correlation, the Minkowski distance and four entropy-based indices are tested. The proposed approach is applied to the Dow Jones Industrial Average stock market index and the Europe Brent Spot Price FOB time-series.Entropy2015-03-30174Article10.3390/e17041775177517941099-43002015-03-30doi: 10.3390/e17041775J. Tenreiro MachadoAntónio LopesAlexandra Galhano<![CDATA[Entropy, Vol. 17, Pages 1755-1774: Generalized Remote Preparation of Arbitrary m-qubit Entangled States via Genuine Entanglements]]>
http://mdpi.com/1099-4300/17/4/1755
Herein, we present a feasible, general protocol for quantum communication within a network via generalized remote preparation of an arbitrary m-qubit entangled state designed with genuine tripartite Greenberger–Horne–Zeilinger-type entangled resources. During the implementations, we construct novel collective unitary operations; these operations are tasked with performing the necessary phase transfers during remote state preparations. We have distilled our implementation methods into a five-step procedure, which can be used to faithfully recover the desired state during transfer. Compared to previous existing schemes, our methodology features a greatly increased success probability. After the consumption of auxiliary qubits and the performance of collective unitary operations, the probability of successful state transfer is increased four-fold and eight-fold for arbitrary two- and three-qubit entangled states, respectively, when compared to other methods in the literature. We conclude this paper with a discussion of the presented scheme for state preparation, including: success probabilities, reducibility and generalizability.Entropy2015-03-30174Article10.3390/e17041755175517741099-43002015-03-30doi: 10.3390/e17041755Dong WangRoss HoehnLiu YeSabre Kais<![CDATA[Entropy, Vol. 17, Pages 1734-1754: Research on the Stability of Open Financial System]]>
http://mdpi.com/1099-4300/17/4/1734
We propose a new herd mechanism and embed it into an open financial market system, which allows traders to enter and leave the system based on some transition rates. Moreover, the novel mechanism avoids the disappearance of volatility as the population scale increases. There are three kinds of heterogeneous agents in the system: optimistic, pessimistic and fundamental. Interactions occur among three different groups of agents instead of two, which makes the artificial financial market closer to the real one. The simulation results of this complex system explain stylized facts such as volatility clustering and identify the key parameters behind market bubbles and market collapses.Entropy2015-03-27174Article10.3390/e17041734173417541099-43002015-03-27doi: 10.3390/e17041734Haijun YangLin LiDeshen Wang<![CDATA[Entropy, Vol. 17, Pages 1701-1733: Synchronicity from Synchronized Chaos]]>
http://mdpi.com/1099-4300/17/4/1701
The synchronization of loosely-coupled chaotic oscillators, a phenomenon investigated intensively for the last two decades, may realize the philosophical concept of “synchronicity”—the commonplace notion that related events mysteriously occur at the same time. When extended to continuous media and/or large discrete arrays, and when general (non-identical) correspondences are considered between states, intermittent synchronous relationships indeed become ubiquitous. Meaningful synchronicity follows naturally if meaningful events are identified with coherent structures, defined by internal synchronization between remote degrees of freedom; a condition that has been posited as necessary for synchronizability with an external system. The important case of synchronization between mind and matter is realized if mind is analogized to a computer model, synchronizing with a sporadically observed system, as in meteorological data assimilation. Evidence for the ubiquity of synchronization is reviewed along with recent proposals that: (1) synchronization of different models of the same objective process may be an expeditious route to improved computational modeling and may also describe the functioning of conscious brains; and (2) the nonlocality in quantum phenomena implied by Bell’s theorem may be explained in a variety of deterministic (hidden variable) interpretations if the quantum world resides on a generalized synchronization “manifold”.Entropy2015-03-27174Article10.3390/e17041701170117331099-43002015-03-27doi: 10.3390/e17041701Gregory Duane<![CDATA[Entropy, Vol. 17, Pages 1690-1700: Maximum Entropy and Probability Kinematics Constrained by Conditionals]]>
http://mdpi.com/1099-4300/17/4/1690
Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (PME) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (JUP) contradicts PME? Majerník shows that PME provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether PME also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in the special case introduced by Wagner, PME does not contradict JUP, but elegantly generalizes it and offers a more integrated approach to probability updating.Entropy2015-03-27174Article10.3390/e17041690169017001099-43002015-03-27doi: 10.3390/e17041690Stefan Lukits<![CDATA[Entropy, Vol. 17, Pages 1673-1689: Analysis of Data Complexity in Human DNA for Gene-Containing Zone Prediction]]>
http://mdpi.com/1099-4300/17/4/1673
This study delves further into the analysis of genomic data by computing a variety of complexity measures. We analyze the effect of window size and evaluate the precision and recall of the prediction of gene zones, aided with a much larger dataset (full chromosomes). A technique based on the separation of two cases (gene-containing and non-gene-containing) has been developed as a basic gene predictor for automated DNA analysis. This predictor was tested on various sequences of human DNA obtained from public databases, in a set of three experiments. The first one covers window size and other parameters; the second one corresponds to an analysis of a full human chromosome (198 million nucleic acids); and the last one tests subject variability (with five different individual subjects). All three experiments have high-quality results, in terms of recall and precision, thus indicating the effectiveness of the predictor.Entropy2015-03-27174Article10.3390/e17041673167316891099-43002015-03-27doi: 10.3390/e17041673Ricardo MongeJuan Crespo<![CDATA[Entropy, Vol. 17, Pages 1660-1672: Evolutionary Voluntary Prisoner’s Dilemma Game under Deterministic and Stochastic Dynamics]]>
http://mdpi.com/1099-4300/17/4/1660
The voluntary prisoner’s dilemma (VPD) game has sparked interest from various fields since it was proposed as an effective mechanism to incentivize cooperative behavior. Current studies show that the inherent cyclic dominance of the strategies of the VPD game results in periodic oscillations in the population. This paper investigated the influence of the level of individual rationality and the size of a population on the evolutionary dynamics of the VPD game. Different deterministic dynamics, such as the replicator dynamic, the Smith dynamic, the Brown-von Neumann-Nash (BNN) dynamic and the best response (BR) dynamic, for the evolutionary VPD game were modeled and simulated. The stochastic evolutionary dynamics based on the quasi birth and death (QBD) process was proposed for the evolutionary VPD game and compared with the deterministic dynamics. The results indicated that with the increase of the loners’ fixed payoff, the loner is more likely to remain in the stable state of a VPD game under any of the dynamics mentioned above. However, the speeds of motion within the cyclic dominance proved to be diverse under different evolutionary dynamics and also highly sensitive to the rationality of individuals in a population. Furthermore, in the QBD stochastic dynamics, the size of the population has a remarkable effect on the probability distribution. As the population size increases, the limiting distribution of the QBD process comes into accordance with the results of the deterministic dynamics.Entropy2015-03-26174Article10.3390/e17041660166016721099-43002015-03-26doi: 10.3390/e17041660Qian YuRan ChenXiaoyan Wen<![CDATA[Entropy, Vol. 17, Pages 1634-1659: Quantum Discord and Information Deficit in Spin Chains]]>
http://mdpi.com/1099-4300/17/4/1634
We examine the behavior of quantum correlations of spin pairs in a finite anisotropic XY spin chain immersed in a transverse magnetic field, through the analysis of the quantum discord and the conventional and quadratic one-way information deficits. We first provide a brief review of these measures, showing that the last ones can be obtained as particular cases of a generalized information deficit based on general entropic forms. All of these measures coincide with an entanglement entropy in the case of pure states, but can be non-zero in separable mixed states, vanishing just for classically correlated states. It is then shown that their behavior in the exact ground state of the chain exhibits similar features, deviating significantly from that of the pair entanglement below the critical field. In contrast with entanglement, they reach full range in this region, becoming independent of the pair separation and coupling range in the immediate vicinity of the factorizing field. It is also shown, however, that significant differences between the quantum discord and the information deficits arise in the local minimizing measurement that defines them. Both analytical and numerical results are provided.Entropy2015-03-26174Article10.3390/e17041634163416591099-43002015-03-26doi: 10.3390/e17041634Norma CanosaLeonardo CilibertiRaúl Rossignoli<![CDATA[Entropy, Vol. 17, Pages 1606-1633: A Fundamental Scale of Descriptions for Analyzing Information Content of Communication Systems]]>
http://mdpi.com/1099-4300/17/4/1606
The complexity of the description of a system is a function of the entropy of its symbolic description. Prior to computing the entropy of the system’s description, an observation scale has to be assumed. In texts written in artificial and natural languages, typical scales are binary, characters, and words. However, considering languages as structures built around a certain preconceived set of symbols, like words or characters, limits the level of complexity that can be revealed analytically. This study introduces the notion of the fundamental description scale to analyze the essence of the structure of a language. The concept of the Fundamental Scale is tested for English and musical instrument digital interface (MIDI) music texts using an algorithm developed to split a text into a collection of sets of symbols that minimizes the observed entropy of the system. This Fundamental Scale reflects more details of the complexity of the language than using bits, characters or words. Results show that this Fundamental Scale allows comparing completely different languages, such as English and MIDI-coded music, in terms of their structural entropy. This comparative power facilitates the study of the complexity of the structure of different communication systems.Entropy2015-03-25174Article10.3390/e17041606160616331099-43002015-03-25doi: 10.3390/e17041606Gerardo FebresKlaus Jaffe<![CDATA[Entropy, Vol. 17, Pages 1581-1605: Kählerian Information Geometry for Signal Processing]]>
http://mdpi.com/1099-4300/17/4/1581
We prove the correspondence between the information geometry of a signal filter and a Kähler manifold. The information geometry of a minimum-phase linear system with a finite complex cepstrum norm is a Kähler manifold. The square of the complex cepstrum norm of the signal filter corresponds to the Kähler potential. The Hermitian structure of the Kähler manifold is explicitly emergent if and only if the impulse response function of the highest degree in z is constant in model parameters. The Kählerian information geometry takes advantage of more efficient calculation steps for the metric tensor and the Ricci tensor. Moreover, α-generalization on the geometric tensors is linear in α . It is also robust to find Bayesian predictive priors, such as superharmonic priors, because Laplace–Beltrami operators on Kähler manifolds are in much simpler forms than those of the non-Kähler manifolds. Several time series models are studied in the Kählerian information geometry.Entropy2015-03-25174Article10.3390/e17041581158116051099-43002015-03-25doi: 10.3390/e17041581Jaehyung ChoiAndrew Mullhaupt<![CDATA[Entropy, Vol. 17, Pages 1558-1580: High Recharge Areas in the Choushui River Alluvial Fan (Taiwan) Assessed from Recharge Potential Analysis and Average Storage Variation Indexes]]>
http://mdpi.com/1099-4300/17/4/1558
High recharge areas significantly influence the groundwater quality and quantity in regional groundwater systems. Many studies have applied recharge potential analysis (RPA) to estimate groundwater recharge potential (GRP) and have delineated high recharge areas based on the estimated GRP. However, most of these studies define the RPA parameters with supposition, and this represents a major source of uncertainty for applying RPA. To objectively define the RPA parameter values without supposition, this study proposes a systematic method based on the theory of parameter identification. A surrogate variable, namely the average storage variation (ASV) index, is developed to calibrate the RPA parameters, because of the lack of direct GRP observations. The study results show that the correlations between the ASV indexes and computed GRP values improved from 0.67 before calibration to 0.85 after calibration, thus indicating that the calibrated RPA parameters represent the recharge characteristics of the study area well; these data also highlight how defining the RPA parameters with ASV indexes can help to improve the accuracy. The calibrated RPA parameters were used to estimate the GRP distribution of the study area, and the GRP values were graded into five levels. High and excellent level areas are defined as high recharge areas, which composed 7.92% of the study area. Overall, this study demonstrates that the developed approach can objectively define the RPA parameters and high recharge areas of the Choushui River alluvial fan, and the results should serve as valuable references for the Taiwanese government in their efforts to conserve the groundwater quality and quantity of the study area.Entropy2015-03-24174Article10.3390/e17041558155815801099-43002015-03-24doi: 10.3390/e17041558Jui-Pin TsaiYu-Wen ChenLiang-Cheng ChangYi-Ming KuoYu-Hsuan TuChen-Che Pan<![CDATA[Entropy, Vol. 17, Pages 1549-1557: Thermodynamics in Curved Space-Time and Its Application to Holography]]>
http://mdpi.com/1099-4300/17/4/1549
The thermodynamic behaviors of a system living in a curved space-time are different from those of a system in a flat space-time. We have investigated the thermodynamics for a system consisting of relativistic massless bosons. We show that a strongly curved metric will produce a large enhancement of the degrees of freedom in the formulae of energy and entropy of the system, as a comparison to the case in a flat space-time. We are mainly concerned with its implications to holography, including the derivations of holographic entropy and holographic screen.Entropy2015-03-24174Article10.3390/e17041549154915571099-43002015-03-24doi: 10.3390/e17041549Yong XiaoLi-Hua FengLi Guan<![CDATA[Entropy, Vol. 17, Pages 1535-1548: Clustering Heterogeneous Data with k-Means by Mutual Information-Based Unsupervised Feature Transformation]]>
http://mdpi.com/1099-4300/17/3/1535
Traditional centroid-based clustering algorithms for heterogeneous data with numerical and non-numerical features result in different levels of inaccurate clustering. This is because the Hamming distance used for dissimilarity measurement of non-numerical values does not provide optimal distances between different values, and problems arise from attempts to combine the Euclidean distance and Hamming distance. In this study, the mutual information (MI)-based unsupervised feature transformation (UFT), which can transform non-numerical features into numerical features without information loss, was utilized with the conventional k-means algorithm for heterogeneous data clustering. For the original non-numerical features, UFT can provide numerical values which preserve the structure of the original non-numerical features and have the property of continuous values at the same time. Experiments and analysis of real-world datasets showed that the integrated UFT-k-means clustering algorithm outperformed others for heterogeneous data with both numerical and non-numerical features.Entropy2015-03-23173Article10.3390/e17031535153515481099-43002015-03-23doi: 10.3390/e17031535Min WeiTommy ChowRosa Chan<![CDATA[Entropy, Vol. 17, Pages 1508-1534: Space-Time Quantum Imaging]]>
http://mdpi.com/1099-4300/17/3/1508
We report on an experimental and theoretical investigation of quantum imaging where the images are stored in both space and time. Ghost images of remote objects are produced with either one or two beams of chaotic laser light generated by a rotating ground glass and two sensors measuring the reference field and bucket field at different space-time points. We further observe that the ghost images translate depending on the time delay between the sensor measurements. The ghost imaging experiments are performed both with and without turbulence. A discussion of the physics of the space-time imaging is presented in terms of quantum nonlocal two-photon analysis to support the experimental results. The theoretical model includes certain phase factors of the rotating ground glass. These experiments demonstrated a means to investigate the time and space aspects of ghost imaging and showed that ghost imaging contains more information per measured photon than was previously recognized, in that multiple ghost images are stored within the same ghost imaging data sets. This suggests new pathways to explore quantum information stored not only in multi-photon coincidence information but also in time-delayed multi-photon interference. The research is applicable to making enhanced space-time quantum images and videos of moving objects where the images are stored in both space and time.Entropy2015-03-23173Article10.3390/e17031508150815341099-43002015-03-23doi: 10.3390/e17031508Ronald MeyersKeith Deacon<![CDATA[Entropy, Vol. 17, Pages 1477-1507: Application of Divergence Entropy to Characterize the Structure of the Hydrophobic Core in DNA Interacting Proteins]]>
http://mdpi.com/1099-4300/17/3/1477
The fuzzy oil drop model, a tool which can be used to study the structure of the hydrophobic core in proteins, has been applied in the analysis of proteins belonging to the jumonji group—JARID2, JARID1A, JARID1B and JARID1D—proteins that share the property of being able to interact with DNA. Their ARID and PHD domains, when analyzed in the context of the fuzzy oil drop model, are found to exhibit structural variability regarding the status of their secondary folds, including the β-hairpin which determines their biological function. Additionally, the structure of disordered fragments which are present in jumonji proteins (as confirmed by the DisProt database) is explained on the grounds of the hydrophobic core model, suggesting that such fragments contribute to tertiary structural stabilization. This conclusion is supported by divergence entropy measurements, expressing the degree of ordering in each protein’s hydrophobic core.Entropy2015-03-23173Article10.3390/e17031477147715071099-43002015-03-23doi: 10.3390/e17031477Barbara KalinowskaMateusz BanachLeszek KoniecznyIrena Roterman<![CDATA[Entropy, Vol. 17, Pages 1466-1476: The Solute-Exclusion Zone: A Promising Application for Mirofluidics]]>
http://mdpi.com/1099-4300/17/3/1466
While unique phenomena exist at fluid-solid phase intersections, many interfacial phenomena manifest solely on limited scales—i.e., the nm-mm ranges—which stifles their application potential. Here, we constructed microfluidic chips that utilize the unique long-distance interface effects of the Solute-Exclusion Zone (EZ) phenomenon to mix, separate, and guide samples in desired directions within microfluidic channels. On our “EZ Chip”, we utilized the interfacial force generated by EZs to transport specimens across streamlines without the need of an off-chip power source. The advantages of easy-integration, low fabrication cost, and no off-chip energy input make the EZ suitable for independent, portable lab-on-chip system applications.Entropy2015-03-23173Article10.3390/e17031466146614761099-43002015-03-23doi: 10.3390/e17031466Chi-Shuo ChenErik FarrJesse AnayaEric ChenWei-Chun Chin