Special Issue "Concepts of Entropy and Their Applications"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (31 March 2012)

Special Issue Editors

Guest Editor
Prof. Dr. Philip Broadbridge

Department of Mathematics and Statistics, La Trobe University, Melbourne, VIC 3086, Australia
Website | E-Mail
Fax: +61 3 9479 3060
Interests: concepts of entropy and their applications
Guest Editor
Prof. Dr. Tony Guttmann

The University of Melbourne, Parkville, VIC 3010, Australia
Website | E-Mail

Special Issue Information

Dear Colleagues,

The original papers in this Special Issue arose from a meeting of the AMSI-MASCOS Theme Program, Concepts of Entropy and their Applications, held in Melbourne, Australia, in November-December 2007. In the four years since that meeting, research on this topic has progressed significantly.

Although there are various concepts of entropy, they provide unifying insights across a wide variety of topics and scales. Topics include information theory, forecasting, optimization, irreversible processes, thermodynamics and statistical mechanics, while the scales range from the atomic, through the macroscopic, to the cosmological.

We feel that this is a good time to call for new contributions from all potential authors. We hope that some of the original authors who had an association with the theme program will provide updates on their progress. Other readers of the original issue may be stimulated to provide complementary or supporting material of their own. Finally, we welcome new authors who believe that their own work relates well to the overall theme.

Prof. Dr. Philip Broadbridge
Prof. Dr. Tony Guttmann
Guest Editors

Keywords

  • Clausius entropy
  • Gibbs entropy
  • Boltzmann entropy
  • von Neumann entropy
  • Tsallis entropy
  • entropy and information theory
  • black hole entropy
  • entropy in environmental modelling

Published Papers (22 papers)


Research

Open Access Article: The Extension of Statistical Entropy Analysis to Chemical Compounds
Entropy 2012, 14(12), 2413-2426; doi:10.3390/e14122413
Received: 23 August 2012 / Revised: 25 September 2012 / Accepted: 26 November 2012 / Published: 28 November 2012
Cited by 3
Abstract
Statistical entropy analysis (SEA) quantifies the dilution and concentration of conservative substances (e.g., heavy metals) in a system. In this paper, the SEA concept is extended (eSEA) to make it applicable to systems in which the chemical speciation is of particular importance. The eSEA is applied to a simplified region used for crop farming. The extent to which the region concentrates or dilutes nitrogen compounds is expressed as the change in statistical entropy (ΔH). A detailed derivation for the calculation of ΔH is provided. The results are discussed for four variations of the crop farming system, showing that the efficiency of crop farming can be expressed by eSEA.
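The core quantity behind statistical entropy analysis, the Shannon entropy of a substance's distribution over the compartments of a system, can be sketched in a few lines (a minimal illustration, not the authors' eSEA formulation; the function name and the example compartment masses are hypothetical):

```python
import math

def statistical_entropy(masses):
    """Shannon entropy (in bits) of a substance's distribution
    across system compartments, computed from compartment masses."""
    total = sum(masses)
    fractions = [m / total for m in masses if m > 0]
    return sum(-p * math.log2(p) for p in fractions)

# Spreading a substance evenly over four compartments (maximal dilution):
print(statistical_entropy([25, 25, 25, 25]))  # 2.0
# Concentrating it all in one compartment (no dilution):
print(statistical_entropy([100, 0, 0, 0]))    # 0.0
```

A dilution process raises this entropy; a concentration process lowers it, which is the sign convention behind the ΔH discussed above.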
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
Open Access Article: Permutation Entropy and Its Main Biomedical and Econophysics Applications: A Review
Entropy 2012, 14(8), 1553-1577; doi:10.3390/e14081553
Received: 1 July 2012 / Revised: 10 August 2012 / Accepted: 21 August 2012 / Published: 23 August 2012
Cited by 104
Abstract
Entropy is a powerful tool for the analysis of time series, as it allows describing the probability distribution of the possible states of a system, and therefore the information encoded in it. Nevertheless, important information may also be encoded in the temporal dynamics, an aspect which is not usually taken into account. The idea of calculating entropy based on permutation patterns (that is, permutations defined by the order relations among values of a time series) has received a great deal of attention in recent years, especially for the understanding of complex and chaotic systems. Permutation entropy directly accounts for the temporal information contained in the time series; furthermore, it is simple, robust, and has very low computational cost. To celebrate the tenth anniversary of the original work, here we analyze the theoretical foundations of permutation entropy, as well as its main recent applications to the analysis of economic markets and to the understanding of biomedical systems.
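The ordinal-pattern idea behind permutation entropy can be sketched briefly (a minimal illustration of the standard Bandt-Pompe construction; the function name and the normalisation choice are ours, not taken from the paper):

```python
import math

def permutation_entropy(series, order=3):
    """Normalised permutation entropy of a 1-D series: Shannon entropy
    of the ordinal-pattern distribution, divided by log2(order!)."""
    counts = {}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # Ordinal pattern: argsort of the values inside the window
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    n = sum(counts.values())
    h = sum(-(c / n) * math.log2(c / n) for c in counts.values())
    return h / math.log2(math.factorial(order))

# A monotone series shows a single ordinal pattern, hence zero entropy:
print(permutation_entropy(list(range(100))))  # 0.0
```

Because only order relations between values enter, the measure is invariant under monotone transformations of the data, which underlies its robustness to noise.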
Open Access Article: A New Entropy Optimization Model for Graduation of Data in Survival Analysis
Entropy 2012, 14(8), 1306-1316; doi:10.3390/e14081306
Received: 7 March 2012 / Revised: 6 June 2012 / Accepted: 4 July 2012 / Published: 25 July 2012
Abstract
Graduation of data is of great importance in survival analysis. Smoothness and goodness of fit are two fundamental requirements in graduation. Based on the intuitive defining expression for entropy in terms of a probability distribution, two optimization models, based on the Maximum Entropy Principle (MaxEnt) and the Minimum Cross Entropy Principle (MinCEnt), are presented to estimate mortality probability distributions. The results demonstrate that the two approaches achieve the two basic requirements of data graduation, smoothness and goodness of fit, respectively. Then, to achieve a compromise between these requirements, a new entropy optimization model is proposed, defined by a hybrid objective function that combines the MaxEnt and MinCEnt principles via an adjustment factor reflecting the preferred balance between smoothness and goodness of fit. The proposed approach is feasible and more reasonable for data graduation when both smoothness and goodness of fit are of concern.
Open Access Article: Socio-Thermodynamics—Evolutionary Potentials in a Population of Hawks and Doves
Entropy 2012, 14(7), 1285-1295; doi:10.3390/e14071285
Received: 3 May 2012 / Revised: 25 June 2012 / Accepted: 4 July 2012 / Published: 23 July 2012
Abstract
The socio-thermodynamics of a population of two competing species exhibits strong analogies with the thermodynamics of solutions and alloys of two constituents. In particular we may construct strategy diagrams akin to the phase diagrams of chemical thermodynamics, complete with regions of homogeneous mixing and miscibility gaps.
Open Access Article: Quantum Dynamical Entropies and Gács Algorithmic Entropy
Entropy 2012, 14(7), 1259-1273; doi:10.3390/e14071259
Received: 13 April 2012 / Revised: 8 June 2012 / Accepted: 3 July 2012 / Published: 12 July 2012
Cited by 2
Abstract
Several quantum dynamical entropies have been proposed that extend the classical Kolmogorov–Sinai (dynamical) entropy. The same scenario appears in relation to the extension of algorithmic complexity theory to the quantum realm. A theorem of Brudno establishes that the complexity per unit time step along typical trajectories of a classical ergodic system equals the KS-entropy. In the following, we establish a similar relation between the Connes–Narnhofer–Thirring quantum dynamical entropy for the shift on quantum spin chains and the Gács algorithmic entropy. We further provide, for the same system, a weaker linkage between the latter algorithmic complexity and a different quantum dynamical entropy proposed by Alicki and Fannes.
Open Access Article: Nonparametric Estimation of Information-Based Measures of Statistical Dispersion
Entropy 2012, 14(7), 1221-1233; doi:10.3390/e14071221
Received: 29 March 2012 / Revised: 20 June 2012 / Accepted: 4 July 2012 / Published: 10 July 2012
Cited by 5
Abstract
We address the problem of non-parametric estimation of the recently proposed measures of statistical dispersion of positive continuous random variables. The measures are based on the concepts of differential entropy and Fisher information and describe the “spread” or “variability” of the random variable from a different point of view than the ubiquitously used concept of standard deviation. The maximum penalized likelihood estimation of the probability density function proposed by Good and Gaskins is applied and a complete methodology of how to estimate the dispersion measures with a single algorithm is presented. We illustrate the approach on three standard statistical models describing neuronal activity.
Open Access Article: Multivariate Multi-Scale Permutation Entropy for Complexity Analysis of Alzheimer’s Disease EEG
Entropy 2012, 14(7), 1186-1202; doi:10.3390/e14071186
Received: 1 April 2012 / Revised: 21 June 2012 / Accepted: 26 June 2012 / Published: 4 July 2012
Cited by 51
Abstract
An original multivariate multi-scale methodology for assessing the complexity of physiological signals is proposed. The technique is able to incorporate the simultaneous analysis of multi-channel data as a single block within a multi-scale framework. The basic complexity measure uses Permutation Entropy, a methodology for time series processing based on ordinal analysis. Permutation Entropy is conceptually simple, structurally robust to noise and artifacts, and computationally very fast, which is relevant for designing portable diagnostics. Since time series derived from biological systems show structure on multiple spatial-temporal scales, the proposed technique can also be useful for other types of biomedical signal analysis. In this work, the possibility of distinguishing the brain states of Alzheimer’s disease patients and subjects with Mild Cognitive Impairment from those of normal healthy elderly subjects is tested on a real, although quite limited, experimental database.
Open Access Article: Fourth Order Diffusion Equations with Increasing Entropy
Entropy 2012, 14(7), 1127-1139; doi:10.3390/e14071127
Received: 11 April 2012 / Revised: 19 June 2012 / Accepted: 19 June 2012 / Published: 25 June 2012
Cited by 1
Abstract
The general quasi-linear autonomous fourth order diffusion equation u_t = −[G(u)u_{xxx} + h(u, u_x, u_{xx})]_x with positive variable diffusivity G(u) and lower-order flux component h is considered on the real line. A direct algorithm produces a general class of equations for which the Shannon entropy density obeys a reaction-diffusion equation with a positive irreducible source term. Such equations may have any positive twice-differentiable diffusivity function G(u). The forms of such equations are the indicators of more general conservation equations whose entropy equation may be expressed in an alternative reaction-diffusion form whose source term, although reducible, is positive.
Open Access Article: Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties
Entropy 2012, 14(6), 1103-1126; doi:10.3390/e14061103
Received: 20 May 2012 / Revised: 8 June 2012 / Accepted: 15 June 2012 / Published: 19 June 2012
Cited by 7
Abstract
The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), I(X,Y), between random variables X,Y, which is compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds increasingly approaching the true MI. In particular, using standard bivariate Gaussian marginal distributions, it allows for the MI decomposition into two positive terms: the Gaussian MI (Ig), depending upon the Gaussian correlation or the correlation between ‘Gaussianized variables’, and a non-Gaussian MI (Ing), coinciding with joint negentropy and depending upon nonlinear correlations. Joint moments of a prescribed total order p are bounded within a compact set defined by Schwarz-like inequalities, where Ing grows from zero at the ‘Gaussian manifold’ where moments are those of Gaussian distributions, towards infinity at the set’s boundary where a deterministic relationship holds. Sources of joint non-Gaussianity have been systematized by estimating Ing between the input and output from a nonlinear synthetic channel contaminated by multiplicative and non-Gaussian additive noises for a full range of signal-to-noise ratio (snr) variances. We have studied the effect of varying snr on Ig and Ing under several signal/noise scenarios.
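For reference, the Gaussian term Ig has a well-known closed form for a bivariate Gaussian: Ig = (1/2) ln(1/(1 − ρ²)), where ρ is the (Gaussian) correlation. A minimal sketch of this standard formula (the function name is hypothetical):

```python
import math

def gaussian_mi(rho):
    """Mutual information (in nats) of a bivariate Gaussian with
    correlation rho: I_g = 0.5 * ln(1 / (1 - rho**2))."""
    return 0.5 * math.log(1.0 / (1.0 - rho ** 2))

print(gaussian_mi(0.0))  # 0.0: uncorrelated Gaussians share no information
print(gaussian_mi(0.9))  # ~0.83 nats; I_g diverges as |rho| -> 1
```

The divergence as |ρ| → 1 mirrors the behaviour of Ing at the boundary of the moment set, where a deterministic relationship holds.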
Open Access Article: Deterministic Thermal Reservoirs
Entropy 2012, 14(6), 1011-1027; doi:10.3390/e14061011
Received: 21 March 2012 / Revised: 16 May 2012 / Accepted: 5 June 2012 / Published: 8 June 2012
Cited by 4
Abstract
We explore the consequences of a deterministic microscopic thermostat-reservoir contact mechanism for hard disks where the collision rule at the boundary is modified. Numerical evidence and theoretical argument is given that suggests that an energy balance is achieved for a system of hard disks in contact with two reservoirs at equal temperatures. This system however produces entropy near the system-reservoir boundaries and this entropy flows into the two reservoirs. Thus rather than producing an equilibrium state, the system is at a steady state with a steady entropy flow without any associated energy flux. The microscopic mechanisms associated with energy and entropy fluxes for this system are examined in detail.
Open Access Article: Adaptive Computation of Multiscale Entropy and Its Application in EEG Signals for Monitoring Depth of Anesthesia During Surgery
Entropy 2012, 14(6), 978-992; doi:10.3390/e14060978
Received: 29 March 2012 / Revised: 9 May 2012 / Accepted: 21 May 2012 / Published: 25 May 2012
Cited by 22
Abstract
Entropy, as an estimate of the complexity of the electroencephalogram, is an effective parameter for monitoring the depth of anesthesia (DOA) during surgery. Multiscale entropy (MSE) is useful for evaluating the complexity of signals over different time scales. However, the length of the processed signal limits the observation of the variation of sample entropy (SE) across scales. In this study, an adaptive resampling procedure is employed to replace the coarse-graining process in MSE. According to the analysis of various signals and practical EEG signals, it is feasible to calculate the SE from the adaptively resampled signals, and the results are highly similar to the original MSE at small scales. The distribution of the MSE of EEG during the whole surgery, based on the adaptive resampling process, is able to show the detailed variation of SE at small scales and the complexity of the EEG, which could help anesthesiologists evaluate the status of patients.
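The conventional coarse-graining step of MSE, which the adaptive resampling procedure above replaces, simply averages the signal over consecutive non-overlapping windows. A minimal sketch (the function name and example series are hypothetical):

```python
def coarse_grain(series, scale):
    """Coarse-graining used by conventional MSE: average the series
    over consecutive non-overlapping windows of length `scale`."""
    n = len(series) // scale
    return [sum(series[i * scale:(i + 1) * scale]) / scale
            for i in range(n)]

x = [1, 2, 3, 4, 5, 6, 7, 8]
print(coarse_grain(x, 2))  # [1.5, 3.5, 5.5, 7.5]
print(coarse_grain(x, 4))  # [2.5, 6.5]
```

Because each scale divides the record length by `scale`, large scales leave very few samples for the sample-entropy estimate, which is precisely the signal-length limitation motivating the adaptive resampling approach.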

Open Access Article: Entropy Concept for Paramacrosystems with Complex States
Entropy 2012, 14(5), 924-944; doi:10.3390/e14050924
Received: 23 February 2012 / Revised: 18 April 2012 / Accepted: 26 April 2012 / Published: 10 May 2012
Abstract
Consideration is given to macrosystems, called paramacrosystems, with states of finite capacity and with distinguishable and indistinguishable elements exhibiting stochastic behavior. The paramacrosystems fill a gap between Fermi and Einstein macrosystems. Using the method of generating functions, we have obtained expressions for the probabilistic characteristics (distribution of the macrostate probabilities, physical and information entropies) of the paramacrosystems. The cases with equal and unequal prior probabilities for elements to occupy the states with finite capacities are considered. Unequal prior probabilities influence the morphological properties of the entropy functions and the macrostate probability functions, transforming them into multimodal functions. Examples of paramacrosystems with bimodal entropy and macrostate probability functions are presented. The variational principle does not work for such cases.
Open Access Article: An Entropic Estimator for Linear Inverse Problems
Entropy 2012, 14(5), 892-923; doi:10.3390/e14050892
Received: 29 February 2012 / Revised: 2 April 2012 / Accepted: 17 April 2012 / Published: 10 May 2012
Cited by 3
Abstract
In this paper we examine an information-theoretic method for solving noisy linear inverse estimation problems which encompasses a whole class of estimation methods under a single framework. Under this framework, prior information about the unknown parameters (when such information exists) and constraints on the parameters can be incorporated in the statement of the problem. The method builds on the basics of the maximum entropy principle and consists of transforming the original problem into the estimation of a probability density on an appropriate space naturally associated with the statement of the problem. This estimation method is generic in the sense that it provides a framework for analyzing non-normal models, is easy to implement, and is suitable for all types of inverse problems, including small and/or ill-conditioned problems with noisy data. First order approximations, large sample properties, and convergence in distribution are developed as well. Analytical examples and statistics for model comparison and evaluation, which are inherent to this method, are discussed and complemented with explicit examples.
Open Access Article: Entropic Approach to Multiscale Clustering Analysis
Entropy 2012, 14(5), 865-879; doi:10.3390/e14050865
Received: 20 February 2012 / Revised: 3 May 2012 / Accepted: 4 May 2012 / Published: 9 May 2012
Cited by 3
Abstract
Recently, a novel method has been introduced to estimate the statistical significance of clustering in the direction distribution of objects. The method involves a multiscale procedure, based on the Kullback–Leibler divergence and the Gumbel statistics of extreme values, providing high discrimination power even in the presence of strong isotropic background contamination. It is shown that the method is: (i) semi-analytical, drastically reducing computation time; (ii) very sensitive to small, medium and large scale clustering; (iii) not biased against the null hypothesis. Applications to the physics of ultra-high energy cosmic rays, as a cosmological probe, are presented and discussed.
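The Kullback–Leibler divergence at the heart of such methods, for discrete distributions, can be sketched as follows (a generic illustration; the binning into direction bins and the example numbers are hypothetical, not taken from the paper):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), in nats, between two
    discrete distributions given as equal-length probability lists."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25] * 4               # isotropic background expectation
clustered = [0.7, 0.1, 0.1, 0.1]   # excess of counts in one direction bin
print(kl_divergence(uniform, uniform))    # 0.0
print(kl_divergence(clustered, uniform))  # ~0.446
```

D(p || q) vanishes only when the observed distribution matches the isotropic expectation, so larger values flag directional clustering against the background.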
Open Access Article: Second Law Constraints on the Dynamics of a Mixture of Two Fluids at Different Temperatures
Entropy 2012, 14(5), 880-891; doi:10.3390/e14050880
Received: 1 March 2012 / Revised: 24 April 2012 / Accepted: 3 May 2012 / Published: 9 May 2012
Cited by 2
Abstract
The formulation of constitutive laws for multi-component fluids (MCF) is one of the thorniest problems in science. Two questions are explored here: how to ensure that these relations reduce to accepted forms when all but one of the constituents vanishes; and what constraints the Second Law imposes on the dynamics of viscous fluids at different temperatures. The analysis suggests an alternative to the metaphysical principles for MCF proposed by Truesdell [1].
Open Access Article: A Comment on the Relation between Diffraction and Entropy
Entropy 2012, 14(5), 856-864; doi:10.3390/e14050856
Received: 26 March 2012 / Accepted: 29 April 2012 / Published: 7 May 2012
Cited by 3
Abstract
Diffraction methods are used to detect atomic order in solids. While uniquely ergodic systems with pure point diffraction have zero entropy, the relation between diffraction and entropy is not as straightforward in general. In particular, there exist families of homometric systems, which are systems sharing the same diffraction, with varying entropy. We summarise the present state of understanding by several characteristic examples.
Open Access Article: Self-Energy Closure for Inhomogeneous Turbulent Flows and Subgrid Modeling
Entropy 2012, 14(4), 769-799; doi:10.3390/e14040769
Received: 14 March 2012 / Revised: 10 April 2012 / Accepted: 11 April 2012 / Published: 18 April 2012
Cited by 8
Abstract
A new statistical dynamical closure theory for general inhomogeneous turbulent flows and subgrid modeling is presented. This Self-Energy (SE) closure represents all eddy interactions through nonlinear dissipation or forcing ‘self-energy’ terms in the mean-field, covariance and response function equations. This makes the renormalization of the bare dissipation and forcing, and the subgrid modeling problem, transparent. The SE closure generalizes the quasi-diagonal direct interaction closure to allow for more complex interactions. The SE closure is applicable to flows in different geometries, is exact near maximum entropy states corresponding to canonical equilibrium, and provides a framework for deriving simpler realizable closures.
Open Access Article: Association of Finite-Time Thermodynamics and a Bond-Graph Approach for Modeling an Endoreversible Heat Engine
Entropy 2012, 14(4), 642-653; doi:10.3390/e14040642
Received: 16 January 2012 / Revised: 13 March 2012 / Accepted: 23 March 2012 / Published: 28 March 2012
Cited by 3
Abstract
In recent decades, the approach known as Finite-Time Thermodynamics has provided a fruitful theoretical framework for the optimization of heat engines operating between a heat source and a heat sink at different temperatures. The aim of this paper is to propose a more complete approach based on the association of Finite-Time Thermodynamics and the Bond-Graph approach for modeling endoreversible heat engines. This approach makes it possible, for example, to find in a simple way the characteristics of the optimal operating point at which the maximum mechanical power of the endoreversible heat engine is obtained, with the entropy flow rate as the control variable. Furthermore, it provides the analytical expressions of the optimal operating point of an irreversible heat engine in which the energy conversion is accompanied by irreversibilities related to internal heat transfer and heat dissipation phenomena. This original approach, applied to an analysis of the performance of a thermoelectric generator, will be the object of a future publication.
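For context, the best-known result of Finite-Time Thermodynamics for an endoreversible engine at maximum power is the Curzon–Ahlborn efficiency, η = 1 − sqrt(Tc/Th), as opposed to the Carnot limit η = 1 − Tc/Th. A minimal sketch of these standard formulas, not of the paper's Bond-Graph model (the example temperatures are hypothetical):

```python
import math

def carnot_efficiency(t_hot, t_cold):
    """Reversible (Carnot) upper bound on heat-engine efficiency."""
    return 1.0 - t_cold / t_hot

def curzon_ahlborn_efficiency(t_hot, t_cold):
    """Efficiency of an endoreversible engine at maximum power output."""
    return 1.0 - math.sqrt(t_cold / t_hot)

# Hypothetical source/sink temperatures in kelvin:
print(carnot_efficiency(600.0, 300.0))          # 0.5
print(curzon_ahlborn_efficiency(600.0, 300.0))  # ~0.293
```

Maximum-power operation sacrifices efficiency relative to the Carnot bound, which is the trade-off the optimal operating point analysis quantifies.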
Open Access Article: Interval Entropy and Informative Distance
Entropy 2012, 14(3), 480-490; doi:10.3390/e14030480
Received: 20 December 2011 / Revised: 4 February 2012 / Accepted: 7 February 2012 / Published: 2 March 2012
Cited by 8
Abstract
The Shannon interval entropy function has been proposed in the reliability literature as a useful dynamic measure of uncertainty for two-sided truncated random variables. In this paper, we show that the interval entropy can uniquely determine the distribution function. Furthermore, we propose a measure of discrepancy between two lifetime distributions over an interval of time, based on Kullback–Leibler discrimination information. We study various properties of this measure, including its connection with residual and past measures of discrepancy and with interval entropy, and we obtain its upper and lower bounds.
Open Access Article: Scientific Élan Vital: Entropy Deficit or Inhomogeneity as a Unified Concept of Driving Forces of Life in Hierarchical Biosphere Driven by Photosynthesis
Entropy 2012, 14(2), 233-251; doi:10.3390/e14020233
Received: 12 December 2011 / Revised: 22 January 2012 / Accepted: 7 February 2012 / Published: 10 February 2012
Cited by 6
Abstract
Life is considered something different from non-living things, but no single driving force can account for all the different aspects of life, which consists of different levels of hierarchy, such as metabolism, cell physiology, multi-cellular development and organization, population dynamics, ecosystem, and evolution. Although free energy is evidently the driving force in biochemical reactions, there is no established relationship between metabolic energy and spatiotemporal organization of living organisms, or between metabolic energy and genetic information. Since Schrödinger pointed out the importance of exporting entropy in maintaining life, misunderstandings of entropy notion have been obstacles in constructing a unified view on the driving forces of life. Here I present a simplified conceptual framework for unifying driving forces of life at various different levels of hierarchy. The key concept is “entropy deficit”, or simply, ‘inhomogeneity’, which is defined as the difference of maximal possible entropy and actual entropy. This is equivalent to information content in genetic information and protein structure, and is also defined similarly for non-homogeneous structures in ecosystems and evolution. Entropy deficit or inhomogeneoity is a unified measure of all driving forces of life, which could be considered a scientific equivalent to ‘élan vital’ of Bergson. Full article
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
Open Access Article: On the Role of Entropy Generation in Processes Involving Fatigue
Entropy 2012, 14(1), 24-31; doi:10.3390/e14010024
Received: 17 October 2011 / Revised: 24 November 2011 / Accepted: 11 December 2011 / Published: 30 December 2011
Cited by 22 | PDF Full-text (83 KB) | HTML Full-text | XML Full-text
Abstract
In this paper we describe the potential of employing the concept of thermodynamic entropy generation to assess degradation in processes involving metal fatigue. It is shown that empirical fatigue models such as Miner’s rule, the Coffin-Manson equation, and the Paris law can be deduced from thermodynamic considerations.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
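The deduction of Miner’s rule can be sketched numerically under one simple assumption (the material constant and per-cycle entropy increments below are hypothetical values, not taken from the paper): if failure occurs when the accumulated entropy generation reaches a material constant γ_f, and each cycle at stress level i generates a fixed entropy increment γ_i, then the cycles to failure at that level are N_i = γ_f / γ_i, and the entropy damage fraction coincides with Miner’s sum Σ n_i / N_i:

```python
# Sketch: deducing Miner's rule from an entropy-to-failure criterion.
# Assumption (illustrative numbers, not from the paper): entropy accumulated
# at fracture is a material constant gamma_f, and each cycle at stress
# level i generates a fixed entropy increment gamma[i].

gamma_f = 4.0                           # hypothetical fracture entropy
gamma = {"low": 0.002, "high": 0.01}    # entropy generated per cycle

# Cycles to failure under constant-amplitude loading at each level:
N = {lvl: gamma_f / g for lvl, g in gamma.items()}

# A two-block loading history: n[i] cycles at each stress level.
n = {"low": 1000, "high": 200}

entropy_accumulated = sum(n[lvl] * gamma[lvl] for lvl in n)
miner_sum = sum(n[lvl] / N[lvl] for lvl in n)

# The entropy damage fraction equals Miner's damage sum identically,
# since n_i * gamma_i / gamma_f = n_i / N_i term by term:
assert abs(entropy_accumulated / gamma_f - miner_sum) < 1e-12
print(miner_sum)   # 1.0 -> failure predicted by both criteria
```

Under this assumption, reaching the entropy criterion is algebraically the same event as Miner’s sum reaching one, which is the spirit of the deduction described in the abstract.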
Open Access Article: Eigenvalue Estimates Using the Kolmogorov-Sinai Entropy
Entropy 2011, 13(12), 2036-2048; doi:10.3390/e13122036
Received: 31 October 2011 / Revised: 28 November 2011 / Accepted: 12 December 2011 / Published: 20 December 2011
Cited by 1 | PDF Full-text (153 KB) | HTML Full-text | XML Full-text
Abstract
The scope of this paper is twofold. First, we use the Kolmogorov-Sinai entropy to estimate lower bounds for the dominant eigenvalues of nonnegative matrices. The lower bound is better than the Rayleigh quotient. Second, we use this estimate to give a nontrivial lower bound for the gap between the dominant eigenvalues of A and A + V.
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
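A hedged numerical sketch of the kind of estimate involved, using the classical variational characterization of the Perron root: for any stochastic matrix P supported on a nonnegative matrix A, with stationary distribution π, one has log λ(A) ≥ Σ π_i P_ij log(A_ij / P_ij), which decomposes as the Kolmogorov-Sinai entropy of the chain plus the expected log weight. The matrix below and the convenient choice P = row-normalized A are illustrative assumptions, not taken from the paper:

```python
import math

def dominant_eigenvalue(A, iters=200):
    """Power iteration for the Perron (dominant) eigenvalue of a
    nonnegative matrix A (given as a list of rows)."""
    n = len(A)
    x = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        y = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        lam = max(abs(v) for v in y)
        x = [v / lam for v in y]
    return lam

def ks_style_lower_bound(A):
    """Entropy-based lower bound on log(lambda):
    log lambda >= H_KS(P) + E[log A_ij] under the stationary chain,
    taking P = row-normalized A.  With this choice the bound reduces
    to sum_i pi_i * log(row_sum_i)."""
    n = len(A)
    rows = [sum(A[i]) for i in range(n)]
    P = [[A[i][j] / rows[i] for j in range(n)] for i in range(n)]
    # stationary distribution of P by iterating pi <- pi P
    pi = [1.0 / n] * n
    for _ in range(500):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return sum(pi[i] * math.log(rows[i]) for i in range(n))

A = [[1.0, 2.0], [3.0, 4.0]]
lam = dominant_eigenvalue(A)                 # ~5.372
rayleigh = 10.0 / 2.0                        # x^T A x / x^T x with x = (1, 1)
print(ks_style_lower_bound(A), math.log(rayleigh), math.log(lam))
```

For this matrix the entropy-based bound is about 1.614, which sits between the log of the (uniform-vector) Rayleigh quotient, about 1.609, and log λ ≈ 1.681, consistent with the abstract's claim that the entropy estimate improves on the Rayleigh quotient.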

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
entropy@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18