Table of Contents

Entropy, Volume 12, Issue 1 (January 2010), Pages 1-160

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.

Research

Open Access Article: Lorenz Curves, Size Classification, and Dimensions of Bubble Size Distributions
Entropy 2010, 12(1), 1-13; doi:10.3390/e12010001
Received: 13 October 2009 / Accepted: 11 December 2009 / Published: 25 December 2009
Cited by 2 | PDF Full-text (5023 KB) | HTML Full-text | XML Full-text
Abstract
Lorenz curves of bubble size distributions and their Gini coefficients characterize demixing processes. Through a systematic size classification, bubble size histograms are generated and investigated with respect to their statistical entropy. It turns out that the temporal development of the entropy is preserved even though characteristics of the histograms, such as the number of size classes and the modality, are markedly reduced. Examination of the Rényi dimensions shows that the bubble size distributions are multifractal and provides information about underlying structures such as self-similarity. Full article
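As a quick orientation, the following is a minimal sketch of how a Lorenz curve, its Gini coefficient, and the Shannon entropy of a size-class histogram might be computed from a sample of bubble sizes; the synthetic lognormal sample and the function names are illustrative assumptions, not the article's data or code.

```python
import numpy as np

def lorenz_curve(sizes):
    """Cumulative population share vs. cumulative size share, sizes sorted ascending."""
    s = np.sort(np.asarray(sizes, dtype=float))
    cum_share = np.cumsum(s) / s.sum()
    pop_share = np.arange(1, len(s) + 1) / len(s)
    return pop_share, cum_share

def gini(sizes):
    """Gini coefficient: twice the area between the Lorenz curve and the diagonal."""
    pop, cum = lorenz_curve(sizes)
    # trapezoidal area under the Lorenz curve, prepending the origin (0, 0)
    area = np.trapz(np.insert(cum, 0, 0.0), np.insert(pop, 0, 0.0))
    return 1.0 - 2.0 * area

def histogram_entropy(sizes, n_classes=10):
    """Shannon entropy (bits) of a size-class histogram."""
    counts, _ = np.histogram(sizes, bins=n_classes)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

# illustrative synthetic sample (lognormal bubble diameters), not the article's data
rng = np.random.default_rng(0)
bubbles = rng.lognormal(mean=0.0, sigma=0.8, size=500)
print("Gini:", round(gini(bubbles), 3))
print("Histogram entropy [bit]:", round(histogram_entropy(bubbles), 3))
```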
Open Access Article: Estimation of Seismic Wavelets Based on the Multivariate Scale Mixture of Gaussians Model
Entropy 2010, 12(1), 14-33; doi:10.3390/e12010014
Received: 9 October 2009 / Accepted: 11 December 2009 / Published: 28 December 2009
Cited by 3 | PDF Full-text (332 KB) | HTML Full-text | XML Full-text
Abstract
This paper proposes a new method for estimating seismic wavelets. Assuming a seismic wavelet can be modeled by a formula with three free parameters (scale, frequency, and phase), the estimation of the wavelet reduces to determining these three parameters. The phase of the wavelet is estimated by constant-phase rotation of the seismic signal, while the other two parameters are obtained by a higher-order statistics (HOS) matching method based on the fourth-order cumulant. In order to derive the HOS estimator, the multivariate scale mixture of Gaussians (MSMG) model is applied to formulate the multivariate joint probability density function (PDF) of the seismic signal. In this way, the HOS can be represented as a polynomial function of second-order statistics, which improves noise robustness and accuracy. In addition, the proposed method works well for short time series. Full article
(This article belongs to the Special Issue Maximum Entropy)
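The article's estimator is built on the MSMG model, which is not reproduced here; the sketch below only illustrates the generic ingredients the abstract refers to, namely a wavelet parameterized by scale, frequency, and phase, and a sample estimate of the zero-lag fourth-order cumulant. The toy wavelet formula, the synthetic trace, and all names are illustrative assumptions, not the method of the paper.

```python
import numpy as np

def fourth_order_cumulant(x):
    """Sample estimate of the zero-lag fourth-order cumulant of a 1-D signal:
    c4 = E[x^4] - 3 (E[x^2])^2 for zero-mean data."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    m2 = np.mean(x ** 2)
    m4 = np.mean(x ** 4)
    return m4 - 3.0 * m2 ** 2

def toy_wavelet(t, scale=20.0, freq=30.0, phase=0.3):
    """Illustrative decaying-cosine wavelet with the three free parameters."""
    return np.exp(-(t * scale) ** 2) * np.cos(2 * np.pi * freq * t + phase)

# illustrative synthetic trace: toy wavelet convolved with a sparse reflectivity series
t = np.linspace(-0.1, 0.1, 101)
rng = np.random.default_rng(1)
reflectivity = rng.laplace(scale=1.0, size=500)
trace = np.convolve(reflectivity, toy_wavelet(t), mode="same")
print("c4 of trace:", fourth_order_cumulant(trace))
```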
Open Access Article: Imprecise Shannon’s Entropy and Multi Attribute Decision Making
Entropy 2010, 12(1), 53-62; doi:10.3390/e12010053
Received: 25 September 2009 / Accepted: 16 November 2009 / Published: 5 January 2010
Cited by 28 | PDF Full-text (217 KB) | HTML Full-text | XML Full-text
Abstract
Finding the appropriate weight for each criterion is one of the main points in Multi Attribute Decision Making (MADM) problems. Shannon’s entropy method is one of several weighting methods discussed in the literature. However, in many real-life problems the data for the decision-making process cannot be measured precisely, and other types of data arise, for instance interval data and fuzzy data. The goal of this paper is to extend the Shannon entropy method to imprecise data, in particular the interval and fuzzy data cases. Full article
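For reference, the sketch below implements the standard crisp-data Shannon entropy weighting that the article extends to interval and fuzzy data; the decision matrix is a made-up example.

```python
import numpy as np

def entropy_weights(X):
    """Standard Shannon-entropy criterion weights for a crisp decision matrix X
    (rows = alternatives, columns = criteria, all entries positive)."""
    X = np.asarray(X, dtype=float)
    m, n = X.shape
    P = X / X.sum(axis=0)                     # column-wise normalization
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)  # treat 0*log(0) as 0
    e = -(P * logs).sum(axis=0) / np.log(m)   # entropy per criterion, scaled to [0, 1]
    d = 1.0 - e                               # degree of diversification
    return d / d.sum()                        # normalized weights

# illustrative decision matrix: 4 alternatives rated on 3 criteria (made-up numbers)
X = [[7, 320, 0.8],
     [6, 280, 0.9],
     [9, 400, 0.6],
     [5, 350, 0.7]]
print("criterion weights:", np.round(entropy_weights(X), 3))
```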
Open Access Article: Redundancy in Systems Which Entertain a Model of Themselves: Interaction Information and the Self-Organization of Anticipation
Entropy 2010, 12(1), 63-79; doi:10.3390/e12010063
Received: 1 December 2009 / Accepted: 28 December 2009 / Published: 6 January 2010
Cited by 16 | PDF Full-text (637 KB)
Abstract
Mutual information among three or more dimensions (μ* = –Q) has been considered as interaction information. However, Krippendorff [1,2] has shown that this measure cannot be interpreted as a unique property of the interactions and has proposed an alternative measure of interaction information based on iterative approximation of maximum entropies. Q can then be considered as a measure of the difference between interaction information and redundancy generated in a model entertained by an observer. I argue that this provides us with a measure of the imprint of a second-order observing system—a model entertained by the system itself—on the underlying information processing. The second-order system communicates meaning hyper-incursively; an observation instantiates this meaning-processing within the information processing. The net results may add to or reduce the prevailing uncertainty. The model is tested empirically for the case where textual organization can be expected to contain intellectual organization in terms of distributions of title words, author names, and cited references. Full article
(This article belongs to the Special Issue Information and Entropy)
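For three discrete variables, the quantity in question can be written via inclusion-exclusion as μ = H(X) + H(Y) + H(Z) − H(X,Y) − H(X,Z) − H(Y,Z) + H(X,Y,Z); whether its sign is read as redundancy or as interaction information is precisely the interpretive issue raised by Krippendorff. The sketch below computes this quantity from a joint distribution; the array layout and the example numbers are illustrative assumptions, and sign conventions for Q differ between authors.

```python
import numpy as np

def H(p):
    """Shannon entropy (bits) of a probability array of any shape."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def three_way_mutual_information(pxyz):
    """mu = H(X)+H(Y)+H(Z) - H(XY) - H(XZ) - H(YZ) + H(XYZ),
    computed from a joint distribution given as a 3-D array pxyz[x, y, z]."""
    px, py, pz = pxyz.sum((1, 2)), pxyz.sum((0, 2)), pxyz.sum((0, 1))
    pxy, pxz, pyz = pxyz.sum(2), pxyz.sum(1), pxyz.sum(0)
    return (H(px) + H(py) + H(pz)
            - H(pxy) - H(pxz) - H(pyz)
            + H(pxyz))

# illustrative joint distribution over three binary variables (made-up numbers)
pxyz = np.array([[[0.10, 0.05], [0.05, 0.20]],
                 [[0.20, 0.05], [0.05, 0.30]]])
print("mu (bits):", round(three_way_mutual_information(pxyz), 4))
```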
Open Access Article: A Dynamic Model of Information and Entropy
Entropy 2010, 12(1), 80-88; doi:10.3390/e12010080
Received: 29 October 2009 / Accepted: 14 December 2009 / Published: 7 January 2010
Cited by 2 | PDF Full-text (158 KB) | HTML Full-text | XML Full-text
Abstract
We discuss the possibility of a relativistic relationship between information and entropy, closely analogous to the classical Maxwell electro-magnetic wave equations. Inherent to the analysis is the description of information as residing in points of non-analyticity; yet ultimately also exhibiting a distributed characteristic: additionally analogous, therefore, to the wave-particle duality of light. At cosmological scales our vector differential equations predict conservation of information in black holes, whereas regular- and Z-DNA molecules correspond to helical solutions at microscopic levels. We further propose that regular- and Z-DNA are equivalent to the alternative words chosen from an alphabet to maintain the equilibrium of an information transmission system. Full article
(This article belongs to the Special Issue Information and Entropy)
Open Access Article: From Maximum Entropy to Maximum Entropy Production: A New Approach
Entropy 2010, 12(1), 107-126; doi:10.3390/e12010107
Received: 30 November 2009 / Revised: 12 January 2010 / Accepted: 14 January 2010 / Published: 18 January 2010
Cited by 10 | PDF Full-text (380 KB) | HTML Full-text | XML Full-text
Abstract
Evidence from climate science suggests that a principle of maximum thermodynamic entropy production can be used to make predictions about some physical systems. I discuss the general form of this principle and an inherent problem with it, currently unsolved by theoretical approaches: how to determine which system it should be applied to. I suggest a new way to derive the principle from statistical mechanics, and present a tentative solution to the system boundary problem. I discuss the need for experimental validation of the principle, and its impact on the way we see the relationship between thermodynamics and kinetics. Full article
(This article belongs to the Special Issue What Is Maximum Entropy Production and How Should We Apply It?)
Open Access Article: Arguments for the Integration of the Non-Zero-Sum Logic of Complex Animal Communication with Information Theory
Entropy 2010, 12(1), 127-135; doi:10.3390/e12010127
Received: 27 September 2009 / Revised: 28 December 2009 / Accepted: 20 January 2010 / Published: 21 January 2010
Cited by 1 | PDF Full-text (98 KB) | HTML Full-text | XML Full-text
Abstract
The outstanding levels of knowledge attained today in research on animal communication, and the newly available technologies to study visual, vocal, and chemical signalling, allow an ever increasing use of information theory as a sophisticated tool to improve our knowledge of the complexity of animal communication. Some considerations on how information theory and intraspecific communication can be linked are presented here. Specifically, information theory may help us to explore interindividual variation under different environmental constraints and social scenarios, as well as the communicative features of social vs. solitary species. Full article
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
Open Access Article: On the Entropy Based Associative Memory Model with Higher-Order Correlations
Entropy 2010, 12(1), 136-147; doi:10.3390/e12010136
Received: 2 January 2010 / Accepted: 18 January 2010 / Published: 22 January 2010
PDF Full-text (488 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, an entropy-based associative memory model is proposed and applied to memory retrieval with an orthogonal learning model, in order to compare it with the conventional model based on the quadratic Lyapunov functional that is minimized during the retrieval process. In the present approach, the updating dynamics are constructed on the basis of an entropy minimization strategy, which reduces asymptotically to the above-mentioned conventional dynamics as a special case when higher-order correlations are ignored. By introducing the entropy functional, higher-order correlation effects between neurons can be incorporated in a self-contained manner, without the heuristic coupling coefficients of the conventional approach. In fact, we show that such higher-order coupling tensors are uniquely determined in the framework of the entropy-based approach. Numerical results show that the proposed approach realizes a much larger memory capacity than the quadratic Lyapunov functional approach, e.g., the associatron. Full article
(This article belongs to the Special Issue Entropy in Model Reduction)
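The entropy-based dynamics and higher-order coupling tensors proposed in the article are not reproduced here; the sketch below only shows the conventional baseline the paper compares against, namely orthogonal (pseudoinverse) learning with the usual quadratic-energy recall. The pattern dimension, the noise level, and all names are illustrative assumptions.

```python
import numpy as np

def train_pseudoinverse(patterns):
    """Orthogonal (pseudoinverse) learning rule: W = X^T (X X^T)^{-1} X
    for bipolar patterns stacked as rows of X."""
    X = np.asarray(patterns, dtype=float)
    W = X.T @ np.linalg.pinv(X @ X.T) @ X
    np.fill_diagonal(W, 0.0)          # no self-coupling
    return W

def recall(W, state, steps=20):
    """Synchronous recall minimizing the quadratic energy E = -1/2 s^T W s."""
    s = np.asarray(state, dtype=float).copy()
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1.0       # break ties deterministically
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# illustrative example: store two random bipolar patterns and recall from a noisy cue
rng = np.random.default_rng(2)
patterns = rng.choice([-1.0, 1.0], size=(2, 32))
W = train_pseudoinverse(patterns)
cue = patterns[0].copy()
flip = rng.choice(32, size=5, replace=False)
cue[flip] *= -1                       # corrupt 5 of 32 bits
print("overlap after recall:", np.mean(recall(W, cue) == patterns[0]))
```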
Open Access Article: The Quantum-Classical Transition as an Information Flow
Entropy 2010, 12(1), 148-160; doi:10.3390/e12010148
Received: 21 October 2009 / Revised: 9 December 2009 / Accepted: 11 December 2009 / Published: 26 January 2010
Cited by 2 | PDF Full-text (620 KB) | HTML Full-text | XML Full-text
Abstract
We investigate the classical limit of the semiclassical evolution with reference to a well-known model that represents the interaction between matter and a given field. This is done by recourse to a special statistical quantifier called the “symbolic transfer entropy”. We find that the quantum-classical transition is thereby described as a sign-reversal of the dominant direction of the information flow between classical and quantum variables. Full article
(This article belongs to the Special Issue Information and Entropy)
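For readers unfamiliar with the quantifier, the following is a minimal plug-in sketch of symbolic transfer entropy on two generic time series (ordinal-pattern symbolization followed by a transfer entropy estimate); the coupled toy dynamics and all parameter choices are illustrative assumptions, not the semiclassical model studied in the article.

```python
import numpy as np
from collections import Counter
from itertools import permutations

def symbolize(x, m=3):
    """Map a time series to ordinal-pattern symbols of embedding dimension m."""
    ranks = {perm: i for i, perm in enumerate(permutations(range(m)))}
    return np.array([ranks[tuple(np.argsort(x[i:i + m]))]
                     for i in range(len(x) - m + 1)])

def symbolic_transfer_entropy(x, y, m=3):
    """Plug-in estimate of T_{Y->X} (bits) computed on the symbolized series."""
    sx, sy = symbolize(x, m), symbolize(y, m)
    n = len(sx) - 1
    triples = Counter(zip(sx[1:], sx[:-1], sy[:-1]))   # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(sx[:-1], sy[:-1]))          # (x_t, y_t)
    pairs_xx = Counter(zip(sx[1:], sx[:-1]))           # (x_{t+1}, x_t)
    singles = Counter(sx[:-1])                         # x_t
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * np.log2(p_cond_xy / p_cond_x)
    return te

# illustrative coupled pair: y drives x with a one-step lag (made-up dynamics)
rng = np.random.default_rng(3)
y = rng.normal(size=2000)
x = 0.8 * np.roll(y, 1) + 0.2 * rng.normal(size=2000)
print("T(y->x):", round(symbolic_transfer_entropy(x, y), 3))
print("T(x->y):", round(symbolic_transfer_entropy(y, x), 3))
```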

Review

Open Access Review: Data Compression Concepts and Algorithms and Their Applications to Bioinformatics
Entropy 2010, 12(1), 34-52; doi:10.3390/e12010034
Received: 4 December 2009 / Accepted: 17 December 2009 / Published: 29 December 2009
Cited by 15 | PDF Full-text (435 KB) | HTML Full-text | XML Full-text
Abstract
Data compression at its base is concerned with how information is organized in data. Understanding this organization can lead to efficient ways of representing the information and hence data compression. In this paper we review the ways in which ideas and approaches fundamental to the theory and practice of data compression have been used in the area of bioinformatics. We look at how basic theoretical ideas from data compression, such as the notions of entropy, mutual information, and complexity have been used for analyzing biological sequences in order to discover hidden patterns, infer phylogenetic relationships between organisms and study viral populations. Finally, we look at how inferred grammars for biological sequences have been used to uncover structure in biological sequences. Full article
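Two of the basic quantities the review connects to sequence analysis, the Shannon entropy of the symbol distribution and the mutual information between positions at a fixed gap, can be sketched as follows; the toy sequence is made up and is not a real genome.

```python
import numpy as np
from collections import Counter

def shannon_entropy(seq):
    """Shannon entropy (bits per symbol) of the empirical symbol distribution."""
    counts = Counter(seq)
    p = np.array(list(counts.values()), dtype=float) / len(seq)
    return -np.sum(p * np.log2(p))

def mutual_information(seq, gap=1):
    """Mutual information (bits) between symbols separated by `gap` positions."""
    pairs = Counter(zip(seq[:-gap], seq[gap:]))
    n = sum(pairs.values())
    left = Counter(seq[:-gap])
    right = Counter(seq[gap:])
    mi = 0.0
    for (a, b), c in pairs.items():
        p_ab = c / n
        mi += p_ab * np.log2(p_ab / ((left[a] / n) * (right[b] / n)))
    return mi

# illustrative toy sequence (not a real genome)
seq = "ATGCGATACGCTTAGGCTAATCGGATCGATCGTTAGCATCGGCTA"
print("H  [bit/symbol]:", round(shannon_entropy(seq), 3))
print("MI at gap 1 [bit]:", round(mutual_information(seq, 1), 3))
```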
Open Access Review: Maximum Entropy Approaches to Living Neural Networks
Entropy 2010, 12(1), 89-106; doi:10.3390/e12010089
Received: 11 December 2009 / Revised: 6 January 2010 / Accepted: 11 January 2010 / Published: 13 January 2010
Cited by 8 | PDF Full-text (284 KB) | HTML Full-text | XML Full-text
Abstract
Understanding how ensembles of neurons collectively interact will be a key step in developing a mechanistic theory of cognitive processes. Recent progress in multineuron recording and analysis techniques has generated tremendous excitement over the physiology of living neural networks. One of the key developments driving this interest is a new class of models based on the principle of maximum entropy. Maximum entropy models have been reported to account for spatial correlation structure in ensembles of neurons recorded from several different types of data. Importantly, these models require only information about the firing rates of individual neurons and their pairwise correlations. If this approach is generally applicable, it would drastically simplify the problem of understanding how neural networks behave. Given the interest in this method, several groups now have worked to extend maximum entropy models to account for temporal correlations. Here, we review how maximum entropy models have been applied to neuronal ensemble data to account for spatial and temporal correlations. We also discuss criticisms of the maximum entropy approach that argue that it is not generally applicable to larger ensembles of neurons. We conclude that future maximum entropy models will need to address three issues: temporal correlations, higher-order correlations, and larger ensemble sizes. Finally, we provide a brief list of topics for future research. Full article
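The following is a minimal sketch of the class of models discussed in the review, assuming an ensemble small enough for the partition function to be enumerated exactly: a pairwise maximum entropy (Ising-type) model fitted to binary spike patterns by matching firing rates and pairwise moments. The synthetic data, the plain gradient ascent, and all parameter choices are illustrative assumptions, not any particular study's pipeline.

```python
import numpy as np
from itertools import product

def fit_pairwise_maxent(data, n_iter=3000, lr=0.1):
    """Fit P(s) ~ exp(h.s + 1/2 s.J.s) (J symmetric, zero diagonal) to binary 0/1
    spike patterns (rows = time bins, cols = neurons) by matching means and
    pairwise moments; exact enumeration, so only feasible for small ensembles."""
    n_bins, n = data.shape
    states = np.array(list(product([0, 1], repeat=n)), dtype=float)  # all 2^n patterns
    emp_mean = data.mean(axis=0)
    emp_corr = data.T @ data / n_bins
    h, J = np.zeros(n), np.zeros((n, n))
    for _ in range(n_iter):
        energy = states @ h + 0.5 * np.einsum('ki,ij,kj->k', states, J, states)
        p = np.exp(energy - energy.max())
        p /= p.sum()
        model_mean = p @ states
        model_corr = states.T @ (p[:, None] * states)
        h += lr * (emp_mean - model_mean)        # gradient ascent on log-likelihood
        dJ = lr * (emp_corr - model_corr)
        np.fill_diagonal(dJ, 0.0)                # keep J off-diagonal only
        J += dJ
    return h, J, states, p

# illustrative synthetic "spike" data for 5 neurons (made-up correlated binary patterns)
rng = np.random.default_rng(4)
common = rng.random((4000, 1)) < 0.2                 # shared drive induces correlations
data = ((rng.random((4000, 5)) < 0.1) | common).astype(float)
h, J, states, p = fit_pairwise_maxent(data)
print("fitted firing rates:", np.round(p @ states, 3))
print("empirical rates:    ", np.round(data.mean(axis=0), 3))
```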

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
entropy@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18