Table of Contents

Entropy, Volume 12, Issue 5 (May 2010), Pages 996-1324

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
Displaying articles 1-12

Research

Jump to: Review

Open Access Article The Maximum Entropy Production Principle and Linear Irreversible Processes
Entropy 2010, 12(5), 996-1005; doi:10.3390/e12050996
Received: 13 March 2010 / Accepted: 23 April 2010 / Published: 27 April 2010
Cited by 10 | PDF Full-text (107 KB) | HTML Full-text | XML Full-text
Abstract
It is shown that Onsager’s principle of the least dissipation of energy is equivalent to the maximum entropy production principle. It is known that solutions of the linearized Boltzmann equation make extrema of entropy production. It is argued, in the case of stationary processes, that this extremum is a maximum rather than a minimum.
(This article belongs to the Special Issue What Is Maximum Entropy Production and How Should We Apply It?)
Open Access Article Multi-Criteria Evaluation of Energy Systems with Sustainability Considerations
Entropy 2010, 12(5), 1006-1020; doi:10.3390/e12051006
Received: 25 February 2010 / Revised: 8 March 2010 / Accepted: 23 April 2010 / Published: 27 April 2010
Cited by 19 | PDF Full-text (6551 KB) | HTML Full-text | XML Full-text
Abstract
A multi-criteria approach is presented for the assessment of alternative means for covering the energy needs (electricity and heat) of an industrial unit, taking into consideration sustainability aspects. The procedure is first described in general terms: proper indicators are defined; next they are grouped in order to form sub-indices, which are then used to determine the composite sustainability index. The procedure is applied for the evaluation of three alternative systems. The three systems are placed in order of preference, which depends on the criteria used. In addition to conclusions reached as a result of the particular case study, recommendations for future work are given.
(This article belongs to the Special Issue Exergy: Analysis and Applications)
Open Access Article Functional Information: Towards Synthesis of Biosemiotics and Cybernetics
Entropy 2010, 12(5), 1050-1070; doi:10.3390/e12051050
Received: 10 March 2010 / Revised: 6 April 2010 / Accepted: 21 April 2010 / Published: 27 April 2010
Cited by 15 | PDF Full-text (183 KB) | HTML Full-text | XML Full-text
Abstract
Biosemiotics and cybernetics are closely related, yet they are separated by the boundary between life and non-life: biosemiotics is focused on living organisms, whereas cybernetics is applied mostly to non-living artificial devices. However, both classes of systems are agents that perform functions necessary for reaching their goals. I propose to shift the focus of biosemiotics from living organisms to agents in general, which all belong to a pragmasphere or functional universe. Agents should be considered in the context of their hierarchy and origin because their semiosis can be inherited or induced by higher-level agents. To preserve and disseminate their functions, agents use functional information: a set of signs that encode and control their functions. It includes stable memory signs, transient messengers, and natural signs. The origin and evolution of functional information is discussed in terms of transitions between vegetative, animal, and social levels of semiosis, defined by Kull. Vegetative semiosis differs substantially from higher levels of semiosis, because signs are recognized and interpreted via direct code-based matching and are not associated with ideal representations of objects. Thus, I consider a separate classification of signs at the vegetative level that includes proto-icons, proto-indexes, and proto-symbols. Animal and social semiosis are based on classification and modeling of objects, which represent the knowledge of agents about their body (Innenwelt) and environment (Umwelt).

Open Access Article On the Interplay between Entropy and Robustness of Gene Regulatory Networks
Entropy 2010, 12(5), 1071-1101; doi:10.3390/e12051071
Received: 2 March 2010 / Revised: 10 April 2010 / Accepted: 28 April 2010 / Published: 4 May 2010
Cited by 11 | PDF Full-text (362 KB) | HTML Full-text | XML Full-text
Abstract
The interplay between entropy and robustness of gene networks is a core mechanism of systems biology. Entropy is a measure of the randomness or disorder of a physical system due to random parameter fluctuations and environmental noise in gene regulatory networks. The robustness of a gene regulatory network, which can be measured as the ability to tolerate random parameter fluctuations and to attenuate the effect of environmental noise, is discussed from the robust H∞ stabilization and filtering perspective. In this review, we also discuss their balancing roles in evolution and potential applications in systems and synthetic biology.
(This article belongs to the Special Issue Entropy in Genetics and Computational Biology)
Open Access Article Learning Genetic Population Structures Using Minimization of Stochastic Complexity
Entropy 2010, 12(5), 1102-1124; doi:10.3390/e12051102
Received: 21 February 2010 / Accepted: 28 April 2010 / Published: 5 May 2010
PDF Full-text (329 KB) | HTML Full-text | XML Full-text
Abstract
Considerable research efforts have been devoted to probabilistic modeling of genetic population structures within the past decade. In particular, a wide spectrum of Bayesian models have been proposed for unlinked molecular marker data from diploid organisms. Here we derive a theoretical framework for learning genetic population structure of a haploid organism from bi-allelic markers for which potential patterns of dependence are a priori unknown and to be explicitly incorporated in the model. Our framework is based on the principle of minimizing stochastic complexity of an unsupervised classification under tree augmented factorization of the predictive data distribution. We discuss a fast implementation of the learning framework using deterministic algorithms.
(This article belongs to the Special Issue Entropy in Genetics and Computational Biology)
Open Access Article Nearest Neighbor Estimates of Entropy for Multivariate Circular Distributions
Entropy 2010, 12(5), 1125-1144; doi:10.3390/e12051125
Received: 26 February 2010 / Accepted: 29 April 2010 / Published: 6 May 2010
Cited by 8 | PDF Full-text (372 KB) | HTML Full-text | XML Full-text
Abstract
In molecular sciences, the estimation of entropies of molecules is important for the understanding of many chemical and biological processes. Motivated by these applications, we consider the problem of estimating the entropies of circular random vectors and introduce non-parametric estimators based on circular distances between n sample points and their k-th nearest neighbors (NN), where k (≤ n − 1) is a fixed positive integer. The proposed NN estimators are based on two different circular distances and are proven to be asymptotically unbiased and consistent. The performance of one of the circular-distance estimators is investigated and compared with that of the already established Euclidean-distance NN estimator using Monte Carlo samples from an analytic distribution of six circular variables with exactly known entropy, and a large sample of seven internal-rotation angles in the tartaric acid molecule, obtained by a realistic molecular-dynamics simulation.
(This article belongs to the Special Issue Configurational Entropy)
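To illustrate the general idea behind such estimators (this is a one-dimensional Kozachenko–Leonenko-style sketch using the wrap-around circular distance, not the article's own two estimators), a minimal version might look like:

```python
import numpy as np
from scipy.special import digamma

def circular_knn_entropy(theta, k=5):
    """Entropy estimate (in nats) for a sample of angles in [0, 2*pi),
    from the circular distance of each point to its k-th nearest neighbor."""
    theta = np.asarray(theta)
    n = len(theta)
    # Pairwise circular distances: d(a, b) = min(|a - b|, 2*pi - |a - b|)
    diff = np.abs(theta[:, None] - theta[None, :])
    dist = np.minimum(diff, 2 * np.pi - diff)
    # Sort each row; index 0 is the self-distance (0), index k is the k-th NN
    eps = np.sort(dist, axis=1)[:, k]
    # In one dimension the "ball" of radius r is an arc of length 2r
    return digamma(n) - digamma(k) + np.mean(np.log(2 * eps))

rng = np.random.default_rng(0)
sample = rng.uniform(0, 2 * np.pi, size=2000)
est = circular_knn_entropy(sample, k=5)
# The uniform circular density has entropy log(2*pi), about 1.84 nats
```

Because the sample lives on a circle, there are no boundary effects, which is one reason circular distances are natural here.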
Open Access Article Entropy: The Markov Ordering Approach
Entropy 2010, 12(5), 1145-1193; doi:10.3390/e12051145
Received: 1 March 2010 / Revised: 30 April 2010 / Accepted: 4 May 2010 / Published: 7 May 2010
Cited by 32 | PDF Full-text (539 KB)
Abstract
The focus of this article is on entropy and Markov processes. We study the properties of functionals which are invariant with respect to monotonic transformations and analyze two invariant “additivity” properties: (i) existence of a monotonic transformation which makes the functional additive with respect to the joining of independent systems and (ii) existence of a monotonic transformation which makes the functional additive with respect to the partitioning of the space of states. All Lyapunov functionals for Markov chains which have properties (i) and (ii) are derived. We describe the most general ordering of the distribution space, with respect to which all continuous-time Markov processes are monotonic (the Markov order). The solution differs significantly from the ordering given by the inequality of entropy growth. For inference, this approach results in a convex compact set of conditionally “most random” distributions.
(This article belongs to the Special Issue Entropy in Model Reduction)
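Property (i) can be illustrated by one standard example (not the article's full classification): the Tsallis entropy $S_q$ is not additive for independent systems, but a monotonic transformation turns it into the Rényi entropy, which is:

```latex
S_q(p) = \frac{1 - \sum_i p_i^{\,q}}{q - 1}, \qquad
R_q = \frac{1}{1-q}\,\ln\!\bigl(1 + (1-q)\,S_q\bigr)
    = \frac{1}{1-q}\,\ln \sum_i p_i^{\,q},
```

and for a product distribution of independent systems, $\sum_{i,j}(p_i r_j)^q = \bigl(\sum_i p_i^q\bigr)\bigl(\sum_j r_j^q\bigr)$, so $R_q(p \otimes r) = R_q(p) + R_q(r)$.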

Open Access Article Entropy and Phase Coexistence in Clusters: Metals vs. Nonmetals
Entropy 2010, 12(5), 1303-1324; doi:10.3390/e12051303
Received: 10 March 2010 / Revised: 18 May 2010 / Accepted: 19 May 2010 / Published: 25 May 2010
Cited by 8 | PDF Full-text (990 KB) | HTML Full-text | XML Full-text
Abstract
Small clusters violate the Gibbs phase rule by exhibiting two or more phases in thermodynamic equilibrium over bands of temperature and pressure. The reason is the small number of particles comprising each system. We review recent results concerning the size ranges for which this behavior is observable. The principal characteristic determining the coexistence range is the transition's entropy change. We review how this happens, using simulations of 13-atom Lennard-Jones and metal clusters to compare dielectric clusters with the more complex clusters of metal atoms. The dominating difference between the narrower coexistence bands of dielectrics and the wider bands of metal clusters is the much higher configurational entropy of the liquid metal clusters.
(This article belongs to the Special Issue Configurational Entropy)

Review

Jump to: Research

Open Access Review On the Thermodynamics of Friction and Wear―A Review
Entropy 2010, 12(5), 1021-1049; doi:10.3390/e12051021
Received: 17 March 2010 / Revised: 18 April 2010 / Accepted: 19 April 2010 / Published: 27 April 2010
Cited by 44 | PDF Full-text (612 KB) | HTML Full-text | XML Full-text
Abstract An extensive survey of the papers pertaining to the thermodynamic approach to tribosystems, particularly those using the concept of entropy as a natural time base, is presented, with a summary of the important contributions of leading researchers.
(This article belongs to the Special Issue Entropy and Friction Volume 2)
Open Access Review Quantum Entropy and Its Applications to Quantum Communication and Statistical Physics
Entropy 2010, 12(5), 1194-1245; doi:10.3390/e12051194
Received: 10 February 2010 / Accepted: 30 April 2010 / Published: 7 May 2010
Cited by 6 | PDF Full-text (525 KB)
Abstract
Quantum entropy is a fundamental concept in quantum information that has recently been developed in various directions. We review the mathematical aspects of quantum entropy (entropies) and discuss some applications to quantum communication and statistical physics. All topics treated here are related to the quantum entropy that the present authors have studied. Many other fields recently developed in quantum information theory, such as quantum algorithms, quantum teleportation, and quantum cryptography, are discussed in full in the book (reference number 60).
(This article belongs to the Special Issue Information and Entropy)
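The central object of such a review is the von Neumann entropy S(ρ) = −Tr ρ log ρ of a density matrix ρ. A minimal numerical sketch (an illustration, not taken from the article):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)        # rho is Hermitian
    evals = evals[evals > 1e-12]           # by convention, 0 * log 0 = 0
    return -np.sum(evals * np.log(evals))

# A pure state has zero entropy ...
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])
# ... while the maximally mixed qubit state attains the maximum, log 2
mixed = np.eye(2) / 2
```

Diagonalizing first reduces the matrix functional to the ordinary Shannon entropy of the eigenvalue spectrum.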
Open Access Review Black Hole Entropy in Scalar-Tensor and ƒ(R) Gravity: An Overview
Entropy 2010, 12(5), 1246-1263; doi:10.3390/e12051246
Received: 9 April 2010 / Accepted: 13 May 2010 / Published: 14 May 2010
Cited by 22 | PDF Full-text (182 KB) | HTML Full-text | XML Full-text
Abstract A short overview of black hole entropy in alternative gravitational theories is presented. Motivated by the recent attempts to explain the cosmic acceleration without dark energy, we focus on metric and Palatini ƒ(R) gravity and on scalar-tensor theories.
(This article belongs to the Special Issue Entropy in Quantum Gravity)
Open Access Review Semantic Networks: Structure and Dynamics
Entropy 2010, 12(5), 1264-1302; doi:10.3390/e12051264
Received: 21 February 2010 / Accepted: 1 May 2010 / Published: 14 May 2010
Cited by 48 | PDF Full-text (1439 KB)
Abstract
During the last ten years several studies have appeared regarding language complexity. Research on this issue began soon after the burst of a new movement of interest and research in the study of complex networks, i.e., networks whose structure is irregular, complex, and dynamically evolving in time. In the first years, the network approach to language mostly focused on a very abstract and general overview of language complexity, and few studies examined how this complexity is actually embodied in humans or how it affects cognition. However, research has slowly shifted from a language-oriented towards a more cognitive-oriented point of view. This review first offers a brief summary of the methodological and formal foundations of complex networks, then attempts a general overview of research activity on language from a complex-networks perspective, especially highlighting efforts with a cognitive-inspired aim.
(This article belongs to the Special Issue Complexity of Human Language and Cognition)

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
entropy@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18