

Concepts of Entropy and Their Applications

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (31 March 2012) | Viewed by 171633

Special Issue Information

Dear Colleagues,

The original papers in this Special Issue arose from a meeting of the AMSI-MASCOS Theme Program, Concepts of Entropy and their Applications, which took place in Melbourne, Australia, in November and December 2007. In the four years that have elapsed since that meeting, research on this topic has progressed significantly.

Although there are various concepts of entropy, they provide unifying insights across a wide variety of topics and scales. Topics include information theory, forecasting, optimization, irreversible processes, thermodynamics and statistical mechanics, while the scales range from the atomic, through the macroscopic, to the cosmological.

We feel that it is a good time to call for new contributions from all potential authors. We hope that some of the original authors who were associated with the theme program will provide updates on their progress. Other readers of the original issue may be stimulated to contribute complementary or supporting material of their own. Finally, we welcome new authors who believe that their own work relates well to the overall theme.

Prof. Dr. Philip Broadbridge
Prof. Dr. Tony Guttmann
Guest Editors

Keywords

  • Clausius entropy
  • Gibbs entropy
  • Boltzmann entropy
  • von Neumann entropy
  • Tsallis entropy
  • entropy and information theory
  • black hole entropy
  • entropy in environmental modelling

Published Papers (22 papers)


Research

524 KiB  
Article
The Extension of Statistical Entropy Analysis to Chemical Compounds
by Alicja P. Sobańtka, Matthias Zessner and Helmut Rechberger
Entropy 2012, 14(12), 2413-2426; https://doi.org/10.3390/e14122413 - 28 Nov 2012
Cited by 19 | Viewed by 6185
Abstract
Statistical entropy analysis (SEA) quantifies the dilution and concentration of conservative substances (e.g., heavy metals) in a system. In this paper, the SEA concept is extended (eSEA) to make it applicable to systems in which the chemical speciation is of particular importance. The eSEA is applied to a simplified region used for crop farming. The extent to which the region concentrates or dilutes nitrogen compounds is expressed as the change in statistical entropy (ΔH). A detailed derivation for the calculation of ΔH is provided. The results are discussed for four variations of the crop farming system, showing that the efficiency of crop farming can be expressed by eSEA.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
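The core SEA quantity, the Shannon entropy of a substance's distribution over flows, can be sketched in a few lines (an illustrative toy, not the paper's full eSEA, which additionally weights flows and chemical species; the function name is mine):

```python
import math

def statistical_entropy(masses):
    """Shannon entropy of a substance's distribution over flows.

    `masses` holds the substance load in each output flow; the entropy
    is maximal when the substance is evenly diluted across flows and
    zero when it is fully concentrated in a single flow.
    """
    total = sum(masses)
    shares = [m / total for m in masses if m > 0]
    return -sum(p * math.log2(p) for p in shares)

# Nitrogen loads (kg) over four flows:
h_conc = statistical_entropy([100, 0, 0, 0])    # 0 bits: fully concentrated
h_dil = statistical_entropy([25, 25, 25, 25])   # 2 bits: evenly diluted
```

In an eSEA-style assessment, a change ΔH would then compare such entropies between a system's outputs and inputs.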

365 KiB  
Article
Permutation Entropy and Its Main Biomedical and Econophysics Applications: A Review
by Massimiliano Zanin, Luciano Zunino, Osvaldo A. Rosso and David Papo
Entropy 2012, 14(8), 1553-1577; https://doi.org/10.3390/e14081553 - 23 Aug 2012
Cited by 485 | Viewed by 25300
Abstract
Entropy is a powerful tool for the analysis of time series, as it allows describing the probability distributions of the possible states of a system, and therefore the information encoded in them. Nevertheless, important information may also be encoded in the temporal dynamics, an aspect that is not usually taken into account. The idea of calculating entropy based on permutation patterns (that is, permutations defined by the order relations among values of a time series) has received considerable attention in recent years, especially for the understanding of complex and chaotic systems. Permutation entropy directly accounts for the temporal information contained in the time series; furthermore, it has the qualities of simplicity, robustness and very low computational cost. To celebrate the tenth anniversary of the original work, here we analyze the theoretical foundations of permutation entropy, as well as the main recent applications to the analysis of economic markets and to the understanding of biomedical systems.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
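The ordinal-pattern idea described in this abstract can be sketched directly (an illustrative implementation under common default choices, not code from the review; the function name is mine):

```python
import math
from collections import Counter

def permutation_entropy(series, order=3):
    """Normalized permutation entropy of a 1-D series.

    Each sliding window of length `order` is mapped to the permutation
    that sorts it; the Shannon entropy of the resulting pattern
    distribution is normalized by log(order!) to lie in [0, 1].
    """
    patterns = Counter(
        tuple(sorted(range(order), key=lambda k: series[i + k]))
        for i in range(len(series) - order + 1)
    )
    n = sum(patterns.values())
    h = -sum((c / n) * math.log(c / n) for c in patterns.values())
    return h / math.log(math.factorial(order))

# A monotone series realizes a single ordinal pattern, so its
# permutation entropy is zero; irregular series score higher.
assert permutation_entropy(list(range(10))) == 0.0
assert permutation_entropy([4, 7, 9, 10, 6, 11, 3]) > 0.0
```

The low cost noted in the abstract is visible here: only window sorting and pattern counting are required, never the raw values themselves.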

649 KiB  
Article
A New Entropy Optimization Model for Graduation of Data in Survival Analysis
by Dayi He, Qi Huang and Jianwei Gao
Entropy 2012, 14(8), 1306-1316; https://doi.org/10.3390/e14081306 - 25 Jul 2012
Cited by 5 | Viewed by 5724
Abstract
Graduation of data is of great importance in survival analysis. Smoothness and goodness of fit are two fundamental requirements in graduation. Based on the instinctive defining expression for entropy in terms of a probability distribution, two optimization models based on the Maximum Entropy Principle (MaxEnt) and the Minimum Cross Entropy Principle (MinCEnt) to estimate mortality probability distributions are presented. The results demonstrate that the two approaches achieve the two basic requirements of data graduation, smoothness and goodness of fit, respectively. Then, in order to achieve a compromise between these requirements, a new entropy optimization model is proposed by defining a hybrid objective function that combines the MaxEnt and MinCEnt principles, linked by a given adjustment factor reflecting the preference for smoothness or goodness of fit in the data graduation. The proposed approach is feasible and more reasonable in data graduation when both smoothness and goodness of fit are of concern.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
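The compromise described here can be illustrated with a hybrid objective that combines an entropy (smoothness) term with a cross-entropy (fit) term; the exact functional form and weighting used in the paper may differ, and all names below are illustrative:

```python
import numpy as np

def hybrid_objective(q, raw, alpha):
    """Hybrid graduation objective (structural sketch, to be minimized).

    Combines the Shannon entropy of the graduated distribution `q`
    (MaxEnt term, favoring smoothness) with the cross entropy between
    `q` and the raw observed distribution `raw` (MinCEnt term, favoring
    goodness of fit); `alpha` in [0, 1] sets the preference.
    """
    entropy = -np.sum(q * np.log(q))       # smoothness: to be maximized
    cross = np.sum(q * np.log(q / raw))    # fit: KL term, to be minimized
    return -(alpha * entropy) + (1.0 - alpha) * cross

raw = np.array([0.1, 0.4, 0.2, 0.3])   # raw mortality-type distribution
uniform = np.full(4, 0.25)             # smoothest possible candidate
# alpha = 1 reduces to pure MaxEnt: the uniform candidate wins.
assert hybrid_objective(uniform, raw, 1.0) < hybrid_objective(raw, raw, 1.0)
# alpha = 0 reduces to pure MinCEnt: the raw data fits itself best.
assert hybrid_objective(raw, raw, 0.0) < hybrid_objective(uniform, raw, 0.0)
```

The adjustment factor `alpha` plays the role of the paper's preference parameter between the two requirements.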

1402 KiB  
Article
Socio-Thermodynamics—Evolutionary Potentials in a Population of Hawks and Doves
by Ingo Müller
Entropy 2012, 14(7), 1285-1295; https://doi.org/10.3390/e14071285 - 23 Jul 2012
Cited by 2 | Viewed by 5387
Abstract
The socio-thermodynamics of a population of two competing species exhibits strong analogies with the thermodynamics of solutions and alloys of two constituents. In particular, we may construct strategy diagrams akin to the phase diagrams of chemical thermodynamics, complete with regions of homogeneous mixing and miscibility gaps.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)

123 KiB  
Article
Quantum Dynamical Entropies and Gács Algorithmic Entropy
by Fabio Benatti
Entropy 2012, 14(7), 1259-1273; https://doi.org/10.3390/e14071259 - 12 Jul 2012
Cited by 2 | Viewed by 5205
Abstract
Several quantum dynamical entropies have been proposed that extend the classical Kolmogorov–Sinai (dynamical) entropy. The same scenario appears in relation to the extension of algorithmic complexity theory to the quantum realm. A theorem of Brudno establishes that the complexity per unit time step along typical trajectories of a classical ergodic system equals the KS-entropy. In the following, we establish a similar relation between the Connes–Narnhofer–Thirring quantum dynamical entropy for the shift on quantum spin chains and the Gács algorithmic entropy. We further provide, for the same system, a weaker linkage between the latter algorithmic complexity and a different quantum dynamical entropy proposed by Alicki and Fannes.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
289 KiB  
Article
Nonparametric Estimation of Information-Based Measures of Statistical Dispersion
by Lubomir Kostal and Ondrej Pokora
Entropy 2012, 14(7), 1221-1233; https://doi.org/10.3390/e14071221 - 10 Jul 2012
Cited by 9 | Viewed by 7072
Abstract
We address the problem of non-parametric estimation of the recently proposed measures of statistical dispersion of positive continuous random variables. The measures are based on the concepts of differential entropy and Fisher information and describe the “spread” or “variability” of the random variable from a different point of view than the ubiquitously used concept of standard deviation. The maximum penalized likelihood estimation of the probability density function proposed by Good and Gaskins is applied and a complete methodology of how to estimate the dispersion measures with a single algorithm is presented. We illustrate the approach on three standard statistical models describing neuronal activity.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)

279 KiB  
Article
Multivariate Multi-Scale Permutation Entropy for Complexity Analysis of Alzheimer’s Disease EEG
by Francesco Carlo Morabito, Domenico Labate, Fabio La Foresta, Alessia Bramanti, Giuseppe Morabito and Isabella Palamara
Entropy 2012, 14(7), 1186-1202; https://doi.org/10.3390/e14071186 - 04 Jul 2012
Cited by 215 | Viewed by 15718
Abstract
An original multivariate multi-scale methodology for assessing the complexity of physiological signals is proposed. The technique is able to incorporate the simultaneous analysis of multi-channel data as a unique block within a multi-scale framework. The basic complexity measure is computed using Permutation Entropy, a methodology for time series processing based on ordinal analysis. Permutation Entropy is conceptually simple, structurally robust to noise and artifacts, and computationally very fast, which is relevant for designing portable diagnostics. Since time series derived from biological systems show structures on multiple spatial-temporal scales, the proposed technique can be useful for other types of biomedical signal analysis. In this work, the possibility of distinguishing the brain states of Alzheimer’s disease patients and Mild Cognitive Impairment subjects from those of normal healthy elderly subjects is checked on a real, although quite limited, experimental database.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)

130 KiB  
Article
Fourth Order Diffusion Equations with Increasing Entropy
by Naghmana Tehseen and Philip Broadbridge
Entropy 2012, 14(7), 1127-1139; https://doi.org/10.3390/e14071127 - 25 Jun 2012
Cited by 5 | Viewed by 6783
Abstract
The general quasi-linear autonomous fourth order diffusion equation u_t = −[G(u)u_xxx + h(u, u_x, u_xx)]_x with positive variable diffusivity G(u) and lower-order flux component h is considered on the real line. A direct algorithm produces a general class of equations for which the Shannon entropy density obeys a reaction-diffusion equation with a positive irreducible source term. Such equations may have any positive twice-differentiable diffusivity function G(u). The forms of such equations are the indicators of more general conservation equations whose entropy equation may be expressed in an alternative reaction-diffusion form whose source term, although reducible, is positive.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)

663 KiB  
Article
Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties
by Carlos A. L. Pires and Rui A. P. Perdigão
Entropy 2012, 14(6), 1103-1126; https://doi.org/10.3390/e14061103 - 19 Jun 2012
Cited by 14 | Viewed by 7405
Abstract
The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), I(X,Y), between random variables X,Y, which is compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds increasingly approaching the true MI. In particular, using standard bivariate Gaussian marginal distributions, it allows for the MI decomposition into two positive terms: the Gaussian MI (Ig), depending upon the Gaussian correlation or the correlation between ‘Gaussianized variables’, and a non‑Gaussian MI (Ing), coinciding with joint negentropy and depending upon nonlinear correlations. Joint moments of a prescribed total order p are bounded within a compact set defined by Schwarz-like inequalities, where Ing grows from zero at the ‘Gaussian manifold’ where moments are those of Gaussian distributions, towards infinity at the set’s boundary where a deterministic relationship holds. Sources of joint non-Gaussianity have been systematized by estimating Ing between the input and output from a nonlinear synthetic channel contaminated by multiplicative and non-Gaussian additive noises for a full range of signal-to-noise ratio (snr) variances. We have studied the effect of varying snr on Ig and Ing under several signal/noise scenarios.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
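The Gaussian term of the decomposition has a well-known closed form worth noting: for a bivariate Gaussian with correlation ρ, the mutual information is Ig = −(1/2) ln(1 − ρ²). A minimal sketch (the function name is mine):

```python
import math

def gaussian_mi(rho):
    """Gaussian mutual information I_g = -0.5 * ln(1 - rho**2) of a
    bivariate Gaussian with correlation coefficient `rho`.  In the
    paper's decomposition the total MI is I_g plus a non-Gaussian part
    I_ng, so I_g alone is a lower bound on the true MI.
    """
    return -0.5 * math.log(1.0 - rho ** 2)

# I_g vanishes for uncorrelated Gaussians and diverges as |rho| -> 1,
# where a deterministic linear relationship holds.
assert gaussian_mi(0.0) == 0.0
assert gaussian_mi(0.9) > gaussian_mi(0.5) > 0.0
```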

6645 KiB  
Article
Deterministic Thermal Reservoirs
by Gary P. Morriss and Daniel Truant
Entropy 2012, 14(6), 1011-1027; https://doi.org/10.3390/e14061011 - 08 Jun 2012
Cited by 6 | Viewed by 5198
Abstract
We explore the consequences of a deterministic microscopic thermostat-reservoir contact mechanism for hard disks in which the collision rule at the boundary is modified. Numerical evidence and theoretical arguments are given suggesting that an energy balance is achieved for a system of hard disks in contact with two reservoirs at equal temperatures. This system, however, produces entropy near the system-reservoir boundaries, and this entropy flows into the two reservoirs. Thus, rather than reaching an equilibrium state, the system is in a steady state with a steady entropy flow without any associated energy flux. The microscopic mechanisms associated with energy and entropy fluxes for this system are examined in detail.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)

471 KiB  
Article
Adaptive Computation of Multiscale Entropy and Its Application in EEG Signals for Monitoring Depth of Anesthesia During Surgery
by Quan Liu, Qin Wei, Shou-Zen Fan, Cheng-Wei Lu, Tzu-Yu Lin, Maysam F. Abbod and Jiann-Shing Shieh
Entropy 2012, 14(6), 978-992; https://doi.org/10.3390/e14060978 - 25 May 2012
Cited by 44 | Viewed by 9539
Abstract
Entropy, as an estimate of the complexity of the electroencephalogram, is an effective parameter for monitoring the depth of anesthesia (DOA) during surgery. Multiscale entropy (MSE) is useful for evaluating the complexity of signals over different time scales. However, the length of the processed signal limits the observation of the variation of sample entropy (SE) over different scales. In this study, an adaptive resampling procedure is employed to replace the coarse-graining process in MSE. According to the analysis of various signals and practical EEG signals, it is feasible to calculate the SE from the adaptively resampled signals, and the results are highly similar to those of the original MSE at small scales. The distribution of the MSE of EEG during the whole surgery based on the adaptive resampling process is able to show the detailed variation of SE at small scales and the complexity of the EEG, which could help anesthesiologists evaluate the status of patients.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
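The two ingredients of standard MSE, coarse-graining and sample entropy, can be sketched as follows (a plain O(N²) illustration with names of my choosing; the paper's contribution replaces the coarse-graining step with adaptive resampling, which is not reproduced here):

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """Sample entropy SampEn(m, r): negative log of the ratio of
    (m+1)-length template matches to m-length template matches,
    with matching under the Chebyshev distance and tolerance r."""
    def count(mm):
        templates = [series[i:i + mm] for i in range(len(series) - mm + 1)]
        c = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    c += 1
        return c
    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def coarse_grain(series, scale):
    """Classic MSE coarse-graining: averages over non-overlapping
    windows of length `scale`, shortening the series by that factor."""
    return [sum(series[i:i + scale]) / scale
            for i in range(0, len(series) - scale + 1, scale)]

assert coarse_grain([1, 2, 3, 4, 5, 6], 2) == [1.5, 3.5, 5.5]
```

The shortening in `coarse_grain` is exactly the length limitation the abstract refers to: at scale s only N/s samples remain for the SE estimate.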

1638 KiB  
Article
Entropy Concept for Paramacrosystems with Complex States
by Yuri S. Popkov
Entropy 2012, 14(5), 924-944; https://doi.org/10.3390/e14050924 - 10 May 2012
Cited by 1 | Viewed by 4792
Abstract
Consideration is given to macrosystems, called paramacrosystems, with states of finite capacity and distinguishable and indistinguishable elements with stochastic behavior. The paramacrosystems fill a gap between Fermi and Einstein macrosystems. Using the method of generating functions, we have obtained expressions for the probabilistic characteristics (distribution of the macrostate probabilities, physical and information entropies) of the paramacrosystems. The cases with equal and unequal prior probabilities for elements to occupy the states with finite capacities are considered. Unequal prior probabilities influence the morphological properties of the entropy functions and of the macrostate probability functions, transforming them into multimodal functions. Examples of paramacrosystems with bimodal entropy and macrostate probability functions are presented. The variational principle does not work in such cases.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)

260 KiB  
Article
An Entropic Estimator for Linear Inverse Problems
by Amos Golan and Henryk Gzyl
Entropy 2012, 14(5), 892-923; https://doi.org/10.3390/e14050892 - 10 May 2012
Cited by 5 | Viewed by 6767
Abstract
In this paper we examine an Information-Theoretic method for solving noisy linear inverse estimation problems, which encompasses a whole class of estimation methods under a single framework. Under this framework, prior information about the unknown parameters (when such information exists) and constraints on the parameters can be incorporated in the statement of the problem. The method builds on the basics of the maximum entropy principle and consists of transforming the original problem into the estimation of a probability density on an appropriate space naturally associated with the statement of the problem. This estimation method is generic in the sense that it provides a framework for analyzing non-normal models, is easy to implement, and is suitable for all types of inverse problems, such as those with small samples or ill-conditioned, noisy data. First order approximations, large sample properties and convergence in distribution are developed as well. Analytical examples and statistics for model comparisons and evaluations that are inherent to this method are discussed and complemented with explicit examples.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
207 KiB  
Article
Second Law Constraints on the Dynamics of a Mixture of Two Fluids at Different Temperatures
by A. D. Kirwan, Jr.
Entropy 2012, 14(5), 880-891; https://doi.org/10.3390/e14050880 - 09 May 2012
Cited by 4 | Viewed by 4572
Abstract
Constitutive laws for multi-component fluids (MCF) are one of the thorniest problems in science. Two questions explored here are: how to ensure that these relations reduce to accepted forms when all but one of the constituents vanishes; and what constraints does the Second Law impose on the dynamics of viscous fluids at different temperatures? The analysis suggests an alternative to the metaphysical principles for MCF proposed by Truesdell [1].
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
482 KiB  
Article
Entropic Approach to Multiscale Clustering Analysis
by Manlio De Domenico and Antonio Insolia
Entropy 2012, 14(5), 865-879; https://doi.org/10.3390/e14050865 - 09 May 2012
Cited by 4 | Viewed by 5788
Abstract
Recently, a novel method has been introduced to estimate the statistical significance of clustering in the direction distribution of objects. The method involves a multiscale procedure, based on the Kullback–Leibler divergence and the Gumbel statistics of extreme values, providing high discrimination power, even in the presence of strong isotropic background contamination. It is shown that the method is: (i) semi-analytical, drastically reducing computation time; (ii) very sensitive to small, medium and large scale clustering; (iii) not biased against the null hypothesis. Applications to the physics of ultra-high energy cosmic rays, as a cosmological probe, are presented and discussed.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
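The first ingredient of the multiscale procedure, the Kullback–Leibler divergence between an observed direction distribution and an isotropic expectation, can be sketched as below (the binning and names are mine for illustration; the Gumbel extreme-value calibration of significance is not shown):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between two discrete
    distributions, e.g., a binned arrival-direction map `p` against
    an isotropic expectation `q`.  Zero iff p == q; it grows as the
    observed map departs from isotropy."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

iso = [0.25] * 4                      # isotropic background expectation
clustered = [0.70, 0.10, 0.10, 0.10]  # excess of events in one direction bin
assert kl_divergence(iso, iso) == 0.0
assert kl_divergence(clustered, iso) > 0.0
```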

111 KiB  
Article
A Comment on the Relation between Diffraction and Entropy
by Michael Baake and Uwe Grimm
Entropy 2012, 14(5), 856-864; https://doi.org/10.3390/e14050856 - 07 May 2012
Cited by 4 | Viewed by 5527
Abstract
Diffraction methods are used to detect atomic order in solids. While uniquely ergodic systems with pure point diffraction have zero entropy, the relation between diffraction and entropy is not as straightforward in general. In particular, there exist families of homometric systems, which are systems sharing the same diffraction, with varying entropy. We summarise the present state of understanding by several characteristic examples.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
221 KiB  
Article
Self-Energy Closure for Inhomogeneous Turbulent Flows and Subgrid Modeling
by Jorgen S. Frederiksen
Entropy 2012, 14(4), 769-799; https://doi.org/10.3390/e14040769 - 18 Apr 2012
Cited by 18 | Viewed by 5410
Abstract
A new statistical dynamical closure theory for general inhomogeneous turbulent flows and subgrid modeling is presented. This Self-Energy (SE) closure represents all eddy interactions through nonlinear dissipation or forcing ‘self-energy’ terms in the mean-field, covariance and response function equations. This makes the renormalization of the bare dissipation and forcing, and the subgrid modeling problem, transparent. The SE closure generalizes the quasi-diagonal direct interaction closure to allow for more complex interactions. The SE closure is applicable to flows in different geometries, is exact near maximum entropy states corresponding to canonical equilibrium, and provides a framework for deriving simpler realizable closures.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
175 KiB  
Article
Association of Finite-Time Thermodynamics and a Bond-Graph Approach for Modeling an Endoreversible Heat Engine
by Yuxiang Dong, Amin El-Bakkali, Georges Descombes, Michel Feidt and Christelle Périlhon
Entropy 2012, 14(4), 642-653; https://doi.org/10.3390/e14040642 - 28 Mar 2012
Cited by 10 | Viewed by 7708
Abstract
In recent decades, the approach known as Finite-Time Thermodynamics has provided a fruitful theoretical framework for the optimization of heat engines operating between a heat source and a heat sink at different temperatures. The aim of this paper is to propose a more complete approach based on the association of Finite-Time Thermodynamics and the Bond-Graph approach for modeling endoreversible heat engines. This approach makes it possible, for example, to find in a simple way the characteristics of the optimal operating point at which the maximum mechanical power of the endoreversible heat engine is obtained, with the entropy flow rate as the control variable. Furthermore, it provides the analytical expressions of the optimal operating point of an irreversible heat engine in which the energy conversion is accompanied by irreversibilities related to internal heat transfer and heat dissipation phenomena. This original approach, applied to an analysis of the performance of a thermoelectric generator, will be the object of a future publication.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
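For reference, the classic Finite-Time Thermodynamics result that endoreversible models of this kind recover is the Curzon–Ahlborn efficiency at maximum power, η* = 1 − √(Tc/Th), which lies below the Carnot bound 1 − Tc/Th (a standard textbook relation, not a result specific to this paper; names are mine):

```python
import math

def carnot_efficiency(t_hot, t_cold):
    """Reversible (Carnot) upper bound on engine efficiency."""
    return 1.0 - t_cold / t_hot

def curzon_ahlborn_efficiency(t_hot, t_cold):
    """Efficiency of an endoreversible engine at its maximum-power
    operating point: eta* = 1 - sqrt(T_c / T_h)."""
    return 1.0 - math.sqrt(t_cold / t_hot)

t_hot, t_cold = 600.0, 300.0                       # reservoir temperatures, K
eta_c = carnot_efficiency(t_hot, t_cold)           # 0.5, reversible limit
eta_ca = curzon_ahlborn_efficiency(t_hot, t_cold)  # ~0.293, at maximum power
assert eta_ca < eta_c
```

The gap between the two values is exactly what the finite-rate heat transfer of the endoreversible model accounts for.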

119 KiB  
Article
Interval Entropy and Informative Distance
by Fakhroddin Misagh and Gholamhossein Yari
Entropy 2012, 14(3), 480-490; https://doi.org/10.3390/e14030480 - 02 Mar 2012
Cited by 33 | Viewed by 6774
Abstract
The Shannon interval entropy function has been proposed in the reliability literature as a useful dynamic measure of uncertainty for two-sided truncated random variables. In this paper, we show that the interval entropy can uniquely determine the distribution function. Furthermore, we propose a measure of discrepancy between two lifetime distributions over an interval of time, based on Kullback-Leibler discrimination information. We study various properties of this measure, including its connection with residual and past measures of discrepancy and with interval entropy, and we obtain its upper and lower bounds.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
513 KiB  
Article
Scientific Élan Vital: Entropy Deficit or Inhomogeneity as a Unified Concept of Driving Forces of Life in Hierarchical Biosphere Driven by Photosynthesis
by Naoki Sato
Entropy 2012, 14(2), 233-251; https://doi.org/10.3390/e14020233 - 10 Feb 2012
Cited by 11 | Viewed by 10474
Abstract
Life is considered something different from non-living things, but no single driving force can account for all the different aspects of life, which consists of different levels of hierarchy, such as metabolism, cell physiology, multi-cellular development and organization, population dynamics, ecosystems, and evolution. Although free energy is evidently the driving force in biochemical reactions, there is no established relationship between metabolic energy and the spatiotemporal organization of living organisms, or between metabolic energy and genetic information. Since Schrödinger pointed out the importance of exporting entropy in maintaining life, misunderstandings of the entropy notion have been obstacles to constructing a unified view of the driving forces of life. Here I present a simplified conceptual framework for unifying the driving forces of life at various different levels of hierarchy. The key concept is “entropy deficit”, or simply, ‘inhomogeneity’, which is defined as the difference between the maximal possible entropy and the actual entropy. This is equivalent to information content in genetic information and protein structure, and is also defined similarly for non-homogeneous structures in ecosystems and evolution. Entropy deficit, or inhomogeneity, is a unified measure of all driving forces of life, which could be considered a scientific equivalent of Bergson’s ‘élan vital’.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
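The key quantity is simple enough to state in code: the entropy deficit of a distribution is the gap between its maximal possible entropy and its actual entropy (a direct transcription of the abstract's definition; the function name is mine):

```python
import math

def entropy_deficit(probs):
    """Entropy deficit ('inhomogeneity'): H_max - H, where H_max is the
    maximal possible entropy over the same set of states and H is the
    actual Shannon entropy; it measures the information content of the
    bias away from perfect homogeneity."""
    h = -sum(p * math.log2(p) for p in probs if p > 0)
    return math.log2(len(probs)) - h

assert entropy_deficit([0.25] * 4) == 0.0      # homogeneous: no deficit
assert entropy_deficit([1.0, 0, 0, 0]) == 2.0  # fully concentrated: 2 bits
```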

83 KiB  
Article
On the Role of Entropy Generation in Processes Involving Fatigue
by Mehdi Amiri and M. M. Khonsari
Entropy 2012, 14(1), 24-31; https://doi.org/10.3390/e14010024 - 30 Dec 2011
Cited by 71 | Viewed by 7123
Abstract
In this paper we describe the potential of employing the concept of thermodynamic entropy generation to assess degradation in processes involving metal fatigue. It is shown that empirical fatigue models such as Miner’s rule, the Coffin-Manson equation, and the Paris law can be deduced from thermodynamic considerations.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)

153 KiB  
Article
Eigenvalue Estimates Using the Kolmogorov-Sinai Entropy
by Shih-Feng Shieh
Entropy 2011, 13(12), 2036-2048; https://doi.org/10.3390/e13122036 - 20 Dec 2011
Cited by 2 | Viewed by 5442
Abstract
The scope of this paper is twofold. First, we use the Kolmogorov-Sinai Entropy to estimate lower bounds for dominant eigenvalues of nonnegative matrices. The lower bound is better than the Rayleigh quotient. Second, we use this estimate to give a nontrivial lower bound for the gaps of dominant eigenvalues of A and A + V.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
