Entropy, Volume 12, Issue 5 (May 2010) – 12 articles, Pages 996-1324

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view the papers in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open them.
Article
Entropy and Phase Coexistence in Clusters: Metals vs. Nonmetals
by Richard Stephen Berry and Boris Michailovich Smirnov
Entropy 2010, 12(5), 1303-1324; https://doi.org/10.3390/e12051303 - 25 May 2010
Cited by 9 | Viewed by 7825
Abstract
Small clusters violate the Gibbs phase rule by exhibiting two or more phases in thermodynamic equilibrium over bands of temperature and pressure. The reason is the small number of particles comprising each system. We review recent results concerning the size ranges for which this behavior is observable. The principal characteristic determining the coexistence range is the transition's entropy change. We review how this happens, using simulations of 13-atom Lennard-Jones and metal clusters to compare dielectric clusters with the more complex clusters of metal atoms. The dominating difference between the narrower coexistence bands of dielectrics and the wider bands of metal clusters is the much higher configurational entropy of the liquid metal clusters. Full article
(This article belongs to the Special Issue Configurational Entropy)
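For orientation, the role of the transition entropy can be seen in a minimal two-state sketch (an illustrative simplification, not the paper's simulation framework): treat the solid-like and liquid-like forms of a cluster as two states in equilibrium.

```latex
% Two-state sketch (illustrative; units with k_B = 1). Ratio of
% liquid-like to solid-like populations for a cluster with melting
% energy change \Delta E and transition entropy \Delta S:
\[
  K(T) \;=\; \frac{[\mathrm{liq}]}{[\mathrm{sol}]}
       \;=\; \exp\!\Big(\Delta S - \frac{\Delta E}{T}\Big),
  \qquad K(T_m) = 1 \;\Longrightarrow\; \Delta E = T_m\,\Delta S .
\]
% Both forms are present in observable proportions only while K stays
% within a few orders of magnitude of unity, which defines a band of
% temperatures rather than a sharp melting point.
```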
Review
Semantic Networks: Structure and Dynamics
by Javier Borge-Holthoefer and Alex Arenas
Entropy 2010, 12(5), 1264-1302; https://doi.org/10.3390/e12051264 - 14 May 2010
Cited by 164 | Viewed by 17734
Abstract
During the last ten years several studies have appeared regarding language complexity. Research on this issue began soon after the burst of a new movement of interest and research in the study of complex networks, i.e., networks whose structure is irregular, complex and dynamically evolving in time. In the first years, the network approach to language mostly focused on a very abstract and general overview of language complexity, and few studies examined how this complexity is actually embodied in humans or how it affects cognition. However, research has slowly shifted from a language-oriented towards a more cognitive-oriented point of view. This review first offers a brief summary of the methodological and formal foundations of complex networks, then it attempts a general overview of research activity on language from a complex-networks perspective, especially highlighting those efforts with a cognitive-inspired aim. Full article
(This article belongs to the Special Issue Complexity of Human Language and Cognition)
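As background for the review, the structural measures most often reported for semantic and word-association networks are the standard complex-network descriptors. A minimal sketch, using networkx on a tiny hypothetical word-association graph (the edge list is invented for illustration, not taken from the review):

```python
# Standard descriptors for a toy semantic (word-association) network.
import networkx as nx

edges = [  # hypothetical association pairs, for illustration only
    ("dog", "cat"), ("dog", "bone"), ("cat", "mouse"),
    ("mouse", "cheese"), ("cheese", "bread"), ("bread", "butter"),
    ("dog", "leash"), ("cat", "whiskers"),
]
G = nx.Graph(edges)

avg_degree = sum(d for _, d in G.degree()) / G.number_of_nodes()
clustering = nx.average_clustering(G)               # local transitivity C
path_length = nx.average_shortest_path_length(G)    # characteristic path length L

print(f"<k> = {avg_degree:.2f}, C = {clustering:.2f}, L = {path_length:.2f}")
```

High clustering together with short path lengths (the small-world signature) and heavy-tailed degree distributions are the properties such studies typically test for.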
Review
Black Hole Entropy in Scalar-Tensor and ƒ(R) Gravity: An Overview
by Valerio Faraoni
Entropy 2010, 12(5), 1246-1263; https://doi.org/10.3390/e12051246 - 14 May 2010
Cited by 67 | Viewed by 7164
Abstract
A short overview of black hole entropy in alternative gravitational theories is presented. Motivated by the recent attempts to explain the cosmic acceleration without dark energy, we focus on metric and Palatini ƒ(R) gravity and on scalar-tensor theories. Full article
(This article belongs to the Special Issue Entropy in Quantum Gravity)
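For reference, the horizon entropies usually quoted in this context are the Wald entropies (standard results, stated in units G = c = ħ = k_B = 1; the precise conventions in the review may differ):

```latex
\[
  S_{\mathrm{GR}} = \frac{A}{4}, \qquad
  S_{f(R)} = \frac{A}{4}\, f'(R)\big|_{\mathrm{horizon}}, \qquad
  S_{\mathrm{ST}} = \frac{\phi\, A}{4},
\]
% A is the horizon area and f'(R) = df/dR; in the scalar-tensor (ST)
% case the Brans-Dicke-like field \phi plays the role of an inverse
% effective gravitational coupling, \phi \sim 1/G_{\mathrm{eff}}.
```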
Review
Quantum Entropy and Its Applications to Quantum Communication and Statistical Physics
by Masanori Ohya and Noboru Watanabe
Entropy 2010, 12(5), 1194-1245; https://doi.org/10.3390/e12051194 - 07 May 2010
Cited by 29 | Viewed by 8483
Abstract
Quantum entropy is a fundamental concept for quantum information that has recently been developed in various directions. We review the mathematical aspects of quantum entropy (entropies) and discuss some applications to quantum communication and statistical physics. All topics treated here are related to the quantum entropies that the present authors have studied. Many other fields recently developed in quantum information theory, such as quantum algorithms, quantum teleportation and quantum cryptography, are discussed in detail in the book (reference 60). Full article
(This article belongs to the Special Issue Information and Entropy)
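The central quantities behind the review are the standard ones; for density operators ρ and σ on a Hilbert space:

```latex
% Von Neumann entropy and (Umegaki) relative entropy:
\[
  S(\rho) = -\operatorname{Tr}\,\rho\log\rho , \qquad
  S(\rho\,\|\,\sigma) = \operatorname{Tr}\,\rho\,(\log\rho - \log\sigma) \;\ge\; 0 .
\]
% The mutual entropy of a channel with a given input state, used in the
% quantum-communication applications, is constructed from the relative
% entropy.
```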
Article
Entropy: The Markov Ordering Approach
by Alexander N. Gorban, Pavel A. Gorban and George Judge
Entropy 2010, 12(5), 1145-1193; https://doi.org/10.3390/e12051145 - 07 May 2010
Cited by 66 | Viewed by 13878
Abstract
The focus of this article is on entropy and Markov processes. We study the properties of functionals which are invariant with respect to monotonic transformations and analyze two invariant “additivity” properties: (i) existence of a monotonic transformation which makes the functional additive with respect to the joining of independent systems and (ii) existence of a monotonic transformation which makes the functional additive with respect to the partitioning of the space of states. All Lyapunov functionals for Markov chains which have properties (i) and (ii) are derived. We describe the most general ordering of the distribution space, with respect to which all continuous-time Markov processes are monotonic (the Markov order). The solution differs significantly from the ordering given by the inequality of entropy growth. For inference, this approach results in a convex compact set of conditionally “most random” distributions. Full article
(This article belongs to the Special Issue Entropy in Model Reduction)
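For background, the divergences that are non-increasing under every Markov chain with a given equilibrium distribution P* are the Csiszár–Morimoto functionals, of which relative entropy is a special case:

```latex
% Csiszar-Morimoto divergences: for any convex h with h(1) = 0,
\[
  H_h(P \,\|\, P^{*}) \;=\; \sum_i p_i^{*}\, h\!\Big(\frac{p_i}{p_i^{*}}\Big)
\]
% is a Lyapunov functional of the chain; h(x) = x\log x recovers the
% Kullback-Leibler divergence (relative entropy). The additivity
% requirements (i) and (ii) single out particular choices of h.
```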
Article
Nearest Neighbor Estimates of Entropy for Multivariate Circular Distributions
by Neeraj Misra, Harshinder Singh and Vladimir Hnizdo
Entropy 2010, 12(5), 1125-1144; https://doi.org/10.3390/e12051125 - 06 May 2010
Cited by 15 | Viewed by 8189
Abstract
In molecular sciences, the estimation of entropies of molecules is important for the understanding of many chemical and biological processes. Motivated by these applications, we consider the problem of estimating the entropies of circular random vectors and introduce non-parametric estimators based on circular distances between n sample points and their k-th nearest neighbors (NN), where k (≤ n − 1) is a fixed positive integer. The proposed NN estimators are based on two different circular distances, and are proven to be asymptotically unbiased and consistent. The performance of one of the circular-distance estimators is investigated and compared with that of the already established Euclidean-distance NN estimator using Monte Carlo samples from an analytic distribution of six circular variables of exactly known entropy and a large sample of seven internal-rotation angles in the molecule of tartaric acid, obtained by a realistic molecular-dynamics simulation. Full article
(This article belongs to the Special Issue Configurational Entropy)
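A minimal sketch of a Kozachenko–Leonenko-type k-th nearest neighbor entropy estimate using a toroidal (circular) metric illustrates the idea; the two circular-distance estimators proposed in the paper may differ in their exact form and bias corrections:

```python
import numpy as np
from scipy.special import digamma, gammaln

def knn_entropy_circular(theta, k=1):
    """Generic Kozachenko-Leonenko-style kNN entropy estimate (in nats)
    for angles in [0, 2*pi), using a per-coordinate circular distance.
    Illustrative sketch only; not the paper's exact estimators."""
    theta = np.asarray(theta)                    # shape (n, d)
    n, d = theta.shape
    # toroidal distance: per coordinate, min(|diff|, 2*pi - |diff|)
    diff = np.abs(theta[:, None, :] - theta[None, :, :])
    diff = np.minimum(diff, 2.0 * np.pi - diff)
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(dist, np.inf)               # exclude self-distances
    r_k = np.sort(dist, axis=1)[:, k - 1]        # distance to k-th neighbor
    # log-volume of the d-dimensional Euclidean unit ball
    log_V_d = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    return digamma(n) - digamma(k) + log_V_d + d * np.mean(np.log(r_k))

# Sanity check: for angles uniform on [0, 2*pi)^d the true entropy is d*log(2*pi).
rng = np.random.default_rng(0)
print(knn_entropy_circular(rng.uniform(0.0, 2.0 * np.pi, size=(1000, 2)), k=4))
```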
Article
Learning Genetic Population Structures Using Minimization of Stochastic Complexity
by Jukka Corander, Mats Gyllenberg and Timo Koski
Entropy 2010, 12(5), 1102-1124; https://doi.org/10.3390/e12051102 - 05 May 2010
Cited by 11 | Viewed by 6776
Abstract
Considerable research efforts have been devoted to probabilistic modeling of genetic population structures within the past decade. In particular, a wide spectrum of Bayesian models have been proposed for unlinked molecular marker data from diploid organisms. Here we derive a theoretical framework for learning the genetic population structure of a haploid organism from bi-allelic markers for which potential patterns of dependence are a priori unknown and must be explicitly incorporated in the model. Our framework is based on the principle of minimizing the stochastic complexity of an unsupervised classification under a tree-augmented factorization of the predictive data distribution. We discuss a fast implementation of the learning framework using deterministic algorithms. Full article
(This article belongs to the Special Issue Entropy in Genetics and Computational Biology)
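For orientation, stochastic complexity in Rissanen's normalized-maximum-likelihood form (the standard definition; the paper minimizes it under a tree-augmented factorization of the predictive distribution, which is not spelled out here):

```latex
\[
  \mathrm{SC}(x \mid \mathcal{M})
  \;=\; -\log \frac{P\big(x \mid \hat{\theta}(x), \mathcal{M}\big)}
                   {\sum_{y} P\big(y \mid \hat{\theta}(y), \mathcal{M}\big)} ,
\]
% \hat{\theta}(x) is the maximum-likelihood estimate for data x within
% model class \mathcal{M}, and the sum runs over all data sets of the
% same size; the classification minimizing SC is the one selected.
```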
Article
On the Interplay between Entropy and Robustness of Gene Regulatory Networks
by Bor-Sen Chen and Cheng-Wei Li
Entropy 2010, 12(5), 1071-1101; https://doi.org/10.3390/e12051071 - 04 May 2010
Cited by 26 | Viewed by 9081
Abstract
The interplay between entropy and robustness of a gene network is a core mechanism of systems biology. Entropy is a measure of the randomness or disorder of a physical system due to random parameter fluctuations and environmental noise in gene regulatory networks. The robustness of a gene regulatory network, which can be measured as the ability to tolerate random parameter fluctuations and to attenuate the effect of environmental noise, will be discussed from the robust H∞ stabilization and filtering perspective. In this review, we will also discuss their balancing roles in evolution and potential applications in systems and synthetic biology. Full article
(This article belongs to the Special Issue Entropy in Genetics and Computational Biology)
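In the H∞ setting referred to here, robustness is quantified by a noise-attenuation level; a compact statement of the standard criterion (generic, not tied to the paper's specific network model):

```latex
% H-infinity attenuation: for disturbance w(t) and regulated output z(t),
% with zero initial conditions, find the smallest \gamma such that
\[
  \int_0^{\infty} z^{\mathsf T}(t)\, z(t)\, dt
  \;\le\; \gamma^{2} \int_0^{\infty} w^{\mathsf T}(t)\, w(t)\, dt
  \qquad \text{for all } w \in L_{2}[0,\infty) .
\]
% A smaller achievable \gamma means stronger attenuation of parameter
% fluctuations and environmental noise, i.e. a more robust network.
```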
Article
Functional Information: Towards Synthesis of Biosemiotics and Cybernetics
by Alexei A. Sharov
Entropy 2010, 12(5), 1050-1070; https://doi.org/10.3390/e12051050 - 27 Apr 2010
Cited by 39 | Viewed by 9065
Abstract
Biosemiotics and cybernetics are closely related, yet they are separated by the boundary between life and non-life: biosemiotics is focused on living organisms, whereas cybernetics is applied mostly to non-living artificial devices. However, both classes of systems are agents that perform functions necessary for reaching their goals. I propose to shift the focus of biosemiotics from living organisms to agents in general, which all belong to a pragmasphere or functional universe. Agents should be considered in the context of their hierarchy and origin because their semiosis can be inherited or induced by higher-level agents. To preserve and disseminate their functions, agents use functional information: a set of signs that encode and control their functions. It includes stable memory signs, transient messengers, and natural signs. The origin and evolution of functional information is discussed in terms of transitions between vegetative, animal, and social levels of semiosis, defined by Kull. Vegetative semiosis differs substantially from higher levels of semiosis, because signs are recognized and interpreted via direct code-based matching and are not associated with ideal representations of objects. Thus, I consider a separate classification of signs at the vegetative level that includes proto-icons, proto-indexes, and proto-symbols. Animal and social semiosis are based on the classification and modeling of objects, which represent the knowledge of agents about their body (Innenwelt) and environment (Umwelt). Full article
Review
On the Thermodynamics of Friction and Wear―A Review
by M. Amiri and Michael M. Khonsari
Entropy 2010, 12(5), 1021-1049; https://doi.org/10.3390/e12051021 - 27 Apr 2010
Cited by 166 | Viewed by 16771
Abstract
An extensive survey of the papers pertaining to the thermodynamic approach to tribosystems, particularly using the concept of entropy as a natural time base, is presented with a summary of the important contributions of leading researchers. Full article
(This article belongs to the Special Issue Entropy and Friction Volume 2)
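A representative, textbook-level relation from this literature (an illustrative sketch, not a result attributed to the review): for dry sliding under normal load N with friction coefficient μ and sliding speed v, with the frictional power dissipated at contact temperature T,

```latex
\[
  \dot{S}_{\mathrm{gen}} \;=\; \frac{\mu N v}{T} ,
  \qquad
  S_{\mathrm{gen}}(t) \;=\; \int_0^{t} \frac{\mu N v}{T}\, dt' .
\]
% The accumulated entropy S_gen(t) is the kind of quantity used as a
% "natural time base" against which wear and degradation are tracked.
```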
Article
Multi-Criteria Evaluation of Energy Systems with Sustainability Considerations
by Christos A. Frangopoulos and Despoina E. Keramioti
Entropy 2010, 12(5), 1006-1020; https://doi.org/10.3390/e12051006 - 27 Apr 2010
Cited by 38 | Viewed by 9724
Abstract
A multi-criteria approach is presented for the assessment of alternative means for covering the energy needs (electricity and heat) of an industrial unit, taking into consideration sustainability aspects. The procedure is first described in general terms: proper indicators are defined; next they are grouped in order to form sub-indices, which are then used to determine the composite sustainability index. The procedure is applied for the evaluation of three alternative systems. The three systems are placed in order of preference, which depends on the criteria used. In addition to conclusions reached as a result of the particular case study, recommendations for future work are given. Full article
(This article belongs to the Special Issue Exergy: Analysis and Applications)
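The aggregation step described here is commonly implemented as normalization followed by weighted summation; a minimal sketch with hypothetical indicator values and weights (the paper's actual indicators, grouping and weighting scheme may differ):

```python
import numpy as np

# Hypothetical indicator values for three alternative energy systems
# (rows) across three illustrative indicators (columns).
indicators = np.array([
    [0.62, 0.41, 0.55],
    [0.48, 0.73, 0.60],
    [0.70, 0.35, 0.50],
])

# Min-max normalization of each indicator across the alternatives.
lo, hi = indicators.min(axis=0), indicators.max(axis=0)
normalized = (indicators - lo) / (hi - lo)

# Weighted sum into a composite sustainability index (weights illustrative).
weights = np.array([0.4, 0.4, 0.2])
composite = normalized @ weights

ranking = np.argsort(-composite)   # indices of alternatives, best first
print(composite, ranking)
```

The resulting order of the alternatives depends directly on the chosen weights, which reflects the paper's observation that the order of preference depends on the criteria used.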
Article
The Maximum Entropy Production Principle and Linear Irreversible Processes
by Paško Županović, Domagoj Kuić, Željana Bonačić Lošić, Dražen Petrov, Davor Juretić and Milan Brumen
Entropy 2010, 12(5), 996-1005; https://doi.org/10.3390/e12050996 - 27 Apr 2010
Cited by 21 | Viewed by 8689
Abstract
It is shown that Onsager’s principle of the least dissipation of energy is equivalent to the maximum entropy production principle. It is known that solutions of the linearized Boltzmann equation make extrema of entropy production. It is argued, in the case of stationary processes, that this extremum is a maximum rather than a minimum. Full article
(This article belongs to the Special Issue What Is Maximum Entropy Production and How Should We Apply It?)
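For reference, the principle in its linear-regime form: with thermodynamic forces X_i and conjugate fluxes J_i, entropy production σ and dissipation function Φ, Onsager's least-dissipation principle reads

```latex
\[
  \delta_{J}\big[\,\sigma(X,J) - \Phi(J,J)\,\big] = 0 ,
  \qquad
  \sigma(X,J) = \sum_i X_i J_i , \qquad
  \Phi(J,J) = \tfrac{1}{2}\sum_{i,k} R_{ik} J_i J_k ,
\]
% The stationarity condition X_i = \sum_k R_{ik} J_k reproduces the
% linear constitutive laws; since \Phi is positive definite, the
% stationary point at fixed forces is a maximum, and there \sigma = 2\Phi.
```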