Table of Contents

Entropy, Volume 12, Issue 2 (February 2010), Pages 161-288

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Papers are published in both HTML and PDF formats; the PDF is the official version. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.

Research


Open Access Article Rehabilitating Information
Entropy 2010, 12(2), 164-196; doi:10.3390/e12020164
Received: 21 December 2009 / Revised: 26 January 2010 / Accepted: 28 January 2010 / Published: 3 February 2010
Cited by 2 | PDF Full-text (134 KB) | HTML Full-text | XML Full-text | Correction | Supplementary Files
Abstract
In an early paper on logic, C.S. Peirce defined a concept of ‘information’ very different from the later conceptions which gave rise to ‘information science’, and indirectly to current problems such as an overload of ‘useless information’. A study of further developments in Peircean semiotics, and in related conceptual frameworks including the cybernetics of Bateson and the cybersemiotics of Brier, reveals deep relations between Peirce's concept of information and the irreducibly triadic nature of signs. Since all sciences, indeed all cognition and communication, are semiotic processes, the core semiotic principle implicit in the Peircean concept may clarify how our uses of language and other symbolic media can actually inform–and thus transform–the way we humans inhabit the biosphere. Full article
Open Access Article Entropy, Function and Evolution: Naturalizing Peircian Semiosis
Entropy 2010, 12(2), 197-242; doi:10.3390/e12020197
Received: 9 December 2009 / Accepted: 18 January 2010 / Published: 4 February 2010
Cited by 8 | PDF Full-text (720 KB) | HTML Full-text | XML Full-text
Abstract
In the biosemiotic literature there is a tension between the naturalistic reference to biological processes and the category of ‘meaning’ which is central in the concept of semiosis. A crucial term bridging the two dimensions is ‘information’. I argue that the tension can be resolved if we reconsider the relation between information and entropy and downgrade the conceptual centrality of Shannon information in the standard approach to entropy and information. Entropy comes into full play if semiosis is seen as a physical process involving causal interactions between physical systems with functions. Functions emerge from evolutionary processes, as conceived in recent philosophical contributions to teleosemantics. In this context, causal interactions can be interpreted in a dual mode, namely as standard causation and as an observation. Thus, a function appears to be the interpretant in the Peircian triadic notion of the sign. Recognizing this duality, the Gibbs/Jaynes notion of entropy is added to the picture, which shares an essential conceptual feature with the notion of function: both concepts are part of a physicalist ontology, but are observer-relative at the same time. Thus, it is possible to give an account of semiosis within the entropy framework without limiting the notion of entropy to the Shannon measure, taking full account of the thermodynamic definition instead. A central feature of this approach is the conceptual linkage between the evolution of functions and maximum entropy production. I show how we can conceive of the semiosphere as a fundamental physical phenomenon. Following an early contribution by Hayek, I conclude by arguing that the category of ‘meaning’ supervenes on nested functions in semiosis, and has a function itself, namely to enable functional self-reference, which otherwise manifests functional breakdown because of standard set-theoretic paradoxes. Full article
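One technical ingredient of this abstract, the Gibbs/Jaynes notion of entropy, can be made concrete: among all distributions consistent with an observer's constraints, it selects the one of maximum entropy. A minimal sketch in Python (the three-state energies and the target mean are invented for illustration and are not taken from the article):

```python
import math

def maxent_distribution(energies, mean_target, tol=1e-10):
    """Maximum-entropy (Gibbs) distribution over discrete states subject to
    a fixed mean energy, found by bisection on the Lagrange multiplier."""
    def mean_for(lam):
        weights = [math.exp(-lam * e) for e in energies]
        z = sum(weights)
        return sum(e * w for e, w in zip(energies, weights)) / z

    lo, hi = -50.0, 50.0  # illustrative bracket for the multiplier
    while hi - lo > tol:
        mid = (lo + hi) / 2
        # the mean energy decreases monotonically as lambda increases
        if mean_for(mid) > mean_target:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    weights = [math.exp(-lam * e) for e in energies]
    z = sum(weights)
    return [w / z for w in weights]

# Three states with energies 0, 1, 2 and a constrained mean energy of 0.5:
probs = maxent_distribution([0.0, 1.0, 2.0], mean_target=0.5)
```

The resulting distribution is the familiar Boltzmann form, which is observer-relative in exactly the sense the abstract emphasizes: it depends on which constraints the observer imposes.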
Open Access Article Improvement of Energy Conversion/Utilization by Exergy Analysis: Selected Cases for Non-Reactive and Reactive Systems
Entropy 2010, 12(2), 243-261; doi:10.3390/e12020243
Received: 29 December 2009 / Revised: 2 February 2010 / Accepted: 4 February 2010 / Published: 5 February 2010
Cited by 4 | PDF Full-text (1411 KB) | HTML Full-text | XML Full-text
Abstract
Exergy analysis is a powerful and systematic tool for the improvement of energy systems, with many possible applications in both the conversion and utilization of energy. Here we present selected applications, with special attention to renewable (solar) energy systems, covering both design and operation/control. After these applications to non-reactive systems, potential ways of reducing the large irreversibilities connected with reactive systems (combustion) are considered, with special reference to chemically-recuperated gas turbine cycles and topping high-temperature fuel cells. Full article
(This article belongs to the Special Issue Exergy: Analysis and Applications)
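The quantity at the heart of this abstract, exergy, can be sketched numerically for heat streams via the standard Carnot factor 1 - T0/T. This is a textbook relation, not code from the article, and the source temperatures are invented for illustration:

```python
def heat_exergy(q_joules, t_source_k, t_ambient_k=298.15):
    """Exergy (maximum extractable work) of heat q delivered at a constant
    source temperature, computed with the Carnot factor 1 - T0/T."""
    if t_source_k <= 0 or t_ambient_k <= 0:
        raise ValueError("temperatures must be positive (kelvin)")
    return q_joules * (1.0 - t_ambient_k / t_source_k)

# 1 kJ of heat from a 600 K solar receiver vs. a 350 K flat-plate collector:
high_grade = heat_exergy(1000.0, 600.0)  # roughly 503 J of work potential
low_grade = heat_exergy(1000.0, 350.0)   # roughly 148 J
```

The same joule of heat carries very different work potential depending on its temperature, which is why exergy analysis exposes improvement opportunities that a pure energy balance hides.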
Open Access Article Entropy and Divergence Associated with Power Function and the Statistical Application
Entropy 2010, 12(2), 262-274; doi:10.3390/e12020262
Received: 29 December 2009 / Revised: 20 February 2010 / Accepted: 23 February 2010 / Published: 25 February 2010
Cited by 12 | PDF Full-text (133 KB) | HTML Full-text | XML Full-text
Abstract
In statistical physics, Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of a number of phenomena. In statistics, the entropy corresponds to the maximum likelihood method, in which Kullback-Leibler divergence connects Boltzmann-Shannon entropy and the expected log-likelihood function. Maximum likelihood estimation is favored for its optimal performance, but this optimality is known to break down easily in the presence of even a small degree of model uncertainty. To deal with this problem, a new statistical method, closely related to Tsallis entropy, is proposed and shown to be robust against outliers, and we discuss a local learning property associated with the method. Full article
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
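The robustness contrast described in the abstract above can be illustrated with a density-power-weighted location estimate, which is in the spirit of (but not identical to) the method the paper proposes; the data and the tuning parameter beta are invented for illustration:

```python
import math

def robust_mean(xs, beta=0.5, iters=100):
    """Location estimate for a unit-variance Gaussian that downweights each
    observation by the model density raised to the power beta, so gross
    outliers receive exponentially small weight (unlike the plain MLE)."""
    mu = sorted(xs)[len(xs) // 2]  # start from the median
    for _ in range(iters):
        w = [math.exp(-beta * (x - mu) ** 2 / 2) for x in xs]
        mu = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
    return mu

data = [0.1, -0.2, 0.05, 0.15, -0.1, 50.0]  # one gross outlier
mle = sum(data) / len(data)  # the Gaussian MLE (sample mean), dragged toward 50
rob = robust_mean(data)      # stays near 0
```

Taking beta to zero recovers the ordinary likelihood, which is one way to see why a small power modification trades a little efficiency for considerable robustness.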
Open Access Article Self-Organization during Friction in Complex Surface Engineered Tribosystems
Entropy 2010, 12(2), 275-288; doi:10.3390/e12020275
Received: 15 October 2009 / Revised: 5 January 2010 / Accepted: 23 February 2010 / Published: 25 February 2010
Cited by 18 | PDF Full-text (523 KB) | HTML Full-text | XML Full-text
Abstract
Self-organization during friction in complex surface engineered tribosystems is investigated. The probability of self-organization in these complex tribosystems is studied on the basis of the theoretical concepts of irreversible thermodynamics. It is shown that a higher number of interrelated processes within the system results in an increased probability of self-organization. The results of this thermodynamic model are confirmed by an investigation of the wear performance of a novel Ti0.2Al0.55Cr0.2Si0.03Y0.02N/Ti0.25Al0.65Cr0.1N (PVD) coating with a complex nano-multilayered structure under the extreme tribological conditions of dry high-speed end milling of hardened H13 tool steel. Full article
(This article belongs to the Special Issue Entropy and Friction Volume 2)

Other


Open Access Letter Quantifying Information Content in Survey Data by Entropy
Entropy 2010, 12(2), 161-163; doi:10.3390/e12020161
Received: 4 November 2009 / Revised: 6 January 2010 / Accepted: 26 January 2010 / Published: 28 January 2010
Cited by 2 | PDF Full-text (34 KB) | HTML Full-text | XML Full-text
Abstract
We apply Shannon entropy as a measure of information content in survey data, and define information efficiency as the empirical entropy divided by the maximum attainable entropy. In a case study of the Norwegian Function Assessment Scale, entropy calculations show that the 5-point response version has higher information efficiency than the 4-point version. Full article
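The information-efficiency measure defined in this abstract is straightforward to compute: the empirical Shannon entropy of the response distribution divided by the maximum attainable entropy log2(k) for k response categories. A minimal sketch (the response tallies are invented and are not the paper's data):

```python
import math

def information_efficiency(counts):
    """Empirical Shannon entropy of the observed response distribution
    divided by the maximum attainable entropy log2(k), where k is the
    number of response categories; the result lies in (0, 1]."""
    n = sum(counts)
    h = -sum((c / n) * math.log2(c / n) for c in counts if c > 0)
    return h / math.log2(len(counts))

# Hypothetical tallies for a 4-point and a 5-point response scale:
eff4 = information_efficiency([40, 30, 20, 10])
eff5 = information_efficiency([25, 25, 20, 20, 10])
```

Efficiency reaches 1 only when all categories are used equally often, so a scale with a rarely used category scores lower even if it has more response options.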

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
entropy@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18