Advances in Information Theory

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (30 October 2013) | Viewed by 132494

Special Issue Editor


Guest Editor
Department of Pure and Applied Mathematics, University of Padova, Via Belzoni 7, 35131 Padova, Italy
Interests: quantum gravity; quantum cosmology; quantum information

Special Issue Information

Entropy is a key concept in information theory. We have decided to expand the journal Entropy to cover all areas of information theory, including its applications. In the future, a dedicated "Information Theory" section will be set up.

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (17 papers)


Research


Article
Measuring Instantaneous and Spectral Information Entropies by Shannon Entropy of Choi-Williams Distribution in the Context of Electroencephalography
by Umberto Melia, Francesc Claria, Montserrat Vallverdu and Pere Caminal
Entropy 2014, 16(5), 2530-2548; https://doi.org/10.3390/e16052530 - 09 May 2014
Cited by 10 | Viewed by 7742
Abstract
The theory of Shannon entropy was applied to the Choi-Williams time-frequency distribution (CWD) of time series in order to extract entropy information in both the time and frequency domains. In this way, four novel indexes were defined: (1) partial instantaneous entropy, calculated as the entropy of the CWD with respect to time by using the probability mass function at each time instant taken independently; (2) partial spectral information entropy, calculated as the entropy of the CWD with respect to frequency by using the probability mass function of each frequency value taken independently; (3) complete instantaneous entropy, calculated as the entropy of the CWD with respect to time by using the probability mass function of the entire CWD; (4) complete spectral information entropy, calculated as the entropy of the CWD with respect to frequency by using the probability mass function of the entire CWD. These indexes were tested on synthetic time series with different behavior (periodic, chaotic and random) and on a dataset of electroencephalographic (EEG) signals recorded in different states (eyes-open, eyes-closed, ictal and non-ictal activity). The results show that the values of these indexes tend to decrease, in different proportions, as the behavior of the synthetic signals evolves from chaos or randomness to periodicity. Statistical differences (p-value < 0.0005) were found between the values of these measures when comparing eyes-open and eyes-closed states, and between ictal and non-ictal states, in the traditional EEG frequency bands. Finally, the paper demonstrates that the proposed measures are useful tools for quantifying the periodic, chaotic and random components of EEG signals.
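The four indexes are specified operationally, so they lend themselves to a compact illustration. Below is a minimal Python sketch of one plausible reading of the definitions (the function names, the use of the absolute value of the CWD as a non-negative weight, and the normalization details are assumptions of this sketch, not the authors' code); it takes a time-by-frequency CWD matrix and returns the four index profiles.

import numpy as np

def shannon(p, eps=1e-12):
    # Shannon entropy of a (possibly sub-normalized) probability mass function.
    p = p[p > eps]
    return -np.sum(p * np.log(p))

def cwd_entropies(cwd):
    # cwd: 2-D array, rows = time instants, columns = frequency bins.
    cwd = np.abs(cwd)                                  # the CWD may take negative values
    # (1) partial instantaneous entropy: pmf over frequency at each time instant
    p_t = cwd / cwd.sum(axis=1, keepdims=True)
    partial_instantaneous = np.array([shannon(row) for row in p_t])
    # (2) partial spectral information entropy: pmf over time at each frequency
    p_f = cwd / cwd.sum(axis=0, keepdims=True)
    partial_spectral = np.array([shannon(col) for col in p_f.T])
    # (3)-(4) complete entropies: a single pmf over the entire distribution
    p_all = cwd / cwd.sum()
    complete_instantaneous = np.array([shannon(row) for row in p_all])
    complete_spectral = np.array([shannon(col) for col in p_all.T])
    return (partial_instantaneous, partial_spectral,
            complete_instantaneous, complete_spectral)

The "partial" indexes normalize each time slice or frequency bin separately, whereas the "complete" indexes use one probability mass function over the whole distribution; this is the distinction drawn in the abstract.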

Article
Cross Layer Interference Management in Wireless Biomedical Networks
by Emmanouil G. Spanakis, Vangelis Sakkalis, Kostas Marias and Apostolos Traganitis
Entropy 2014, 16(4), 2085-2104; https://doi.org/10.3390/e16042085 - 14 Apr 2014
Cited by 9 | Viewed by 6815
Abstract
Interference is a central phenomenon in wireless networks when multiple uncoordinated links share a common communication medium. The study of the interference channel was initiated by Shannon in 1961, and since then the problem has been thoroughly elaborated at the information-theoretic level, but its full characterization remains an open issue. When multiple uncoordinated links share a common medium, interference is a crucial limiting factor for network performance. In this work, using cross-layer cooperative communication techniques, we study how to compensate for interference in wireless biomedical networks, where many links carrying biomedical or other health-related data may be formed and suffer from all other interfering transmissions, so as to allow successful receptions and improve overall network performance. We define the interference-limited communication range as the critical region around a receiver, with a number of surrounding interfering nodes, within which a successful communication link can be formed. Our results indicate that more successful transmissions can be achieved by adapting the transmission rate and power to the path loss exponent and to the selected mode of the underlying communication technique, allowing interference mitigation and, where possible, lower power consumption and higher achievable transmission rates.
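The interference-limited communication range rests on a per-link feasibility test. The following minimal Python sketch illustrates such a test under standard assumptions (power-law path loss and a Shannon-rate feasibility condition); the powers, distances and rate target in the example are illustrative values, not parameters from the paper.

import numpy as np

def sinr(p_tx, d, interferers, path_loss_exp, noise=1e-9):
    # Signal-to-interference-plus-noise ratio with power-law path loss.
    # interferers: list of (power, distance) pairs of concurrent transmitters.
    signal = p_tx * d ** (-path_loss_exp)
    interference = sum(p * r ** (-path_loss_exp) for p, r in interferers)
    return signal / (noise + interference)

def link_feasible(p_tx, d, interferers, path_loss_exp, rate_target, bandwidth=1.0):
    # A link can be formed if the Shannon rate at the receiver meets the target.
    return bandwidth * np.log2(1.0 + sinr(p_tx, d, interferers, path_loss_exp)) >= rate_target

# One biomedical link at 5 m with two interfering nodes, path loss exponent 3.
print(link_feasible(p_tx=1e-3, d=5.0, interferers=[(1e-3, 20.0), (1e-3, 30.0)],
                    path_loss_exp=3.0, rate_target=0.5))

Sweeping the number or placement of interferers until the test fails traces out an interference-limited region of the kind defined above.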

Article
Localization of Discrete Time Quantum Walks on the Glued Trees
by Yusuke Ide, Norio Konno, Etsuo Segawa and Xin-Ping Xu
Entropy 2014, 16(3), 1501-1514; https://doi.org/10.3390/e16031501 - 18 Mar 2014
Cited by 9 | Viewed by 6360
Abstract
In this paper, we consider the time-averaged distribution of discrete-time quantum walks on glued trees. In order to analyze the walks on the glued trees, we consider a reduction to walks on path graphs. Using a spectral analysis of the Jacobi matrices defined by the corresponding random walks on the path graphs, we obtain a spectral decomposition of the time evolution operator of the quantum walks. We find significant contributions of the eigenvalues ±1 of the Jacobi matrices to the time-averaged limit distribution of the quantum walks. As a consequence, we obtain lower bounds on the time-averaged distribution.
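The computational core, passing from a spectral decomposition of the evolution operator to a time-averaged distribution, can be illustrated generically. A minimal Python sketch for any finite-dimensional unitary evolution follows; it does not perform the paper's reduction to Jacobi matrices, and the tolerance used to group degenerate eigenvalues is an assumption of the sketch.

import numpy as np

def time_averaged_distribution(U, psi0, decimals=8):
    # For U = sum_l exp(i*theta_l) P_l, the Cesaro time average of
    # |<x|U^t|psi0>|^2 equals sum_l |<x|P_l|psi0>|^2, because cross terms
    # between distinct eigenvalues average to zero.
    eigvals, eigvecs = np.linalg.eig(U)
    phases = np.round(np.angle(eigvals), decimals)     # group (near-)degenerate eigenvalues
    pbar = np.zeros(len(psi0))
    for phase in np.unique(phases):
        V = eigvecs[:, phases == phase]                # basis of one eigenspace
        proj = V @ np.linalg.pinv(V) @ psi0            # orthogonal projection of psi0 onto it
        pbar += np.abs(proj) ** 2
    return pbar

In the paper's setting, the dominant terms of this sum come from the eigenvalues ±1 of the reduced Jacobi matrices, which is what produces the localization lower bounds.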

Article
The Maximum Error Probability Criterion, Random Encoder, and Feedback, in Multiple Input Channels
by Ning Cai
Entropy 2014, 16(3), 1211-1242; https://doi.org/10.3390/e16031211 - 25 Feb 2014
Cited by 5 | Viewed by 6304
Abstract
For a multiple-input channel, one may define different capacity regions according to the error criterion, the type of code, and the presence of feedback. In this paper, we aim to draw a complete picture of the relations among these different capacity regions. To this end, we first prove that the average-error-probability capacity region of a multiple-input channel can be achieved by a random code under the maximum-error-probability criterion. Moreover, we show that for a non-deterministic multiple-input channel with feedback, the capacity regions are the same under the two error criteria. In addition, we discuss two special classes of channels to shed light on the relation between the different capacity regions. In particular, to illustrate the role of feedback, we provide a class of MACs for which feedback may enlarge the maximum-error-probability capacity regions, but not the average-error-probability capacity regions. We also present a class of MACs for which the maximum-error-probability capacity regions are strictly smaller than the average-error-probability capacity regions (the first example showing this was due to G. Dueck). Unlike G. Dueck’s enlightening example, in which a deterministic MAC was considered, our example includes and further generalizes his by taking both deterministic and non-deterministic MACs into account. Finally, we extend our results for a discrete memoryless two-input channel to compound MACs, arbitrarily varying MACs, and MACs with more than two inputs.
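For reference, the average-error-probability capacity region discussed here is, for a two-user discrete memoryless MAC, the classical Ahlswede–Liao region: the closure of the convex hull of all rate pairs (R_1, R_2) satisfying, for some input distribution p(q)\,p(x_1 \mid q)\,p(x_2 \mid q),

R_1 \le I(X_1; Y \mid X_2, Q), \qquad R_2 \le I(X_2; Y \mid X_1, Q), \qquad R_1 + R_2 \le I(X_1, X_2; Y \mid Q).

The paper's results concern how this region compares with its maximum-error-probability and feedback counterparts.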

Article
Prediction Method for Image Coding Quality Based on Differential Information Entropy
by Xin Tian, Tao Li, Jin-Wen Tian and Song Li
Entropy 2014, 16(2), 990-1001; https://doi.org/10.3390/e16020990 - 17 Feb 2014
Cited by 3 | Viewed by 5666
Abstract
To meet the requirements of quality-based image coding, an approach to predicting image coding quality based on differential information entropy is proposed. First, some typical prediction approaches are introduced, and then the differential information entropy is reviewed. Taking JPEG2000 as an example, the relationship between differential information entropy and the objective assessment indicator PSNR at a fixed compression ratio is established via data fitting, with the constraint of minimizing the average error. Next, the relationship among differential information entropy, compression ratio and PSNR at various compression ratios is constructed, and this relationship is used as an indicator to predict image coding quality. Finally, the proposed approach is compared with some traditional approaches. The experiments show that differential information entropy has a better linear relationship with image coding quality than image activity does. It can therefore be concluded that the proposed approach is capable of predicting image coding quality at low compression ratios with small errors and, owing to its simplicity, can be widely applied in a variety of real-time space image coding systems.
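A minimal Python sketch of this kind of predictor is given below, under the assumption that the "differential information entropy" of an image is the Shannon entropy of the histogram of first-order pixel differences; the fitted coefficients in predicted_psnr are placeholders, not values from the paper.

import numpy as np

def differential_entropy_index(image):
    # Shannon entropy (bits) of the histogram of horizontal pixel differences.
    diff = np.diff(image.astype(np.int32), axis=1).ravel()
    counts = np.bincount(diff - diff.min())
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def predicted_psnr(image, a=-2.0, b=55.0):
    # Hypothetical affine predictor of PSNR at a fixed compression ratio;
    # a and b would be obtained by data fitting as described in the abstract.
    return a * differential_entropy_index(image) + b

Extending the fit so that a and b also depend on the compression ratio gives a predictor usable across ratios, which is the relationship the paper constructs.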

Article
New Methods of Entropy-Robust Estimation for Randomized Models under Limited Data
by Yuri Popkov and Alexey Popkov
Entropy 2014, 16(2), 675-698; https://doi.org/10.3390/e16020675 - 23 Jan 2014
Cited by 19 | Viewed by 5546
Abstract
The paper presents a new approach to restoring the characteristics of randomized models under small amounts of input and output data. The approach proceeds by introducing randomized static and dynamic models and estimating the probabilistic characteristics of their parameters. We consider static and dynamic models described by Volterra polynomials. Procedures for robust parametric and non-parametric estimation are constructed by exploiting the entropy concept, based on the generalized informational Boltzmann and Fermi entropies.

Article
Some Convex Functions Based Measures of Independence and Their Application to Strange Attractor Reconstruction
by Yang Chen and Kazuyuki Aihara
Entropy 2011, 13(4), 820-840; https://doi.org/10.3390/e13040820 - 08 Apr 2011
Viewed by 7790
Abstract
The classical information-theoretic measures such as the entropy and the mutual information (MI) are widely applicable to many areas in science and engineering. Csiszar generalized the entropy and the MI by using convex functions. Recently, we proposed the grid occupancy (GO) and the quasientropy (QE) as measures of independence. The QE explicitly includes a convex function in its definition, while the expectation of GO is a subclass of QE. In this paper, we study the effect of different convex functions on GO, QE, and Csiszar’s generalized mutual information (GMI). A quality factor (QF) is proposed to quantify the sharpness of their minima. Using the QF, it is shown that these measures can have sharper minima than the classical MI. In addition, a recursive algorithm for computing GMI, which is a generalization of Fraser and Swinney’s algorithm for computing MI, is proposed. Moreover, we apply GO, QE, and GMI to chaotic time series analysis. It is shown that these measures are good criteria for determining the optimum delay in strange attractor reconstruction.
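For context, the delay-selection task that GO, QE and GMI are applied to is classically handled with the mutual information criterion. A minimal Python sketch of that baseline follows; it uses a fixed-bin histogram estimator rather than Fraser and Swinney's adaptive partition, and it is not an implementation of the paper's measures.

import numpy as np

def mutual_information(x, y, bins=32):
    # Histogram (plug-in) estimate of I(X; Y) in bits.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

def optimum_delay(x, max_lag=100, bins=32):
    # First local minimum of I(x(t); x(t + tau)) over tau: the usual criterion
    # for choosing the delay in delay-coordinate (strange attractor) reconstruction.
    mi = [mutual_information(x[:-tau], x[tau:], bins) for tau in range(1, max_lag + 1)]
    for tau in range(1, len(mi)):
        if mi[tau] > mi[tau - 1]:
            return tau
    return len(mi)

The paper's point is that replacing the MI in this procedure by measures with sharper minima (as quantified by the QF) makes the location of the optimum delay easier to identify.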

Article
Analysis of Resource and Emission Impacts: An Emergy-Based Multiple Spatial Scale Framework for Urban Ecological and Economic Evaluation
by Gengyuan Liu, Zhifeng Yang, Bin Chen and Lixiao Zhang
Entropy 2011, 13(3), 720-743; https://doi.org/10.3390/e13030720 - 23 Mar 2011
Cited by 23 | Viewed by 9180
Abstract
The development of the complex and multi-dimensional urban socio-economic system creates impacts on natural capital and human capital that range from the local to the global scale. An emergy-based multiple spatial scale analysis framework and a rigorous accounting method that can quantify the value of human-made and natural capital losses are proposed in this study. With the intent of comparing Beijing’s trajectory over time, the characteristics of the interfaces between different scales are considered in order to explain resource trade and the impacts of emissions. In addition, our improved emergy analysis and the management options that are consistent with Beijing’s overall sustainability strategy are examined. The results showed that Beijing’s economy was closely correlated with the consumption of nonrenewable resources and exerted rising pressure on the environment. Of the total emergy used by the economic system, nonrenewable resources imported from other provinces contributed the most, and the multi-scale environmental impacts of waterborne and airborne pollution continued to increase from 1999 to 2006. Given the input structure, Beijing was chiefly profiting by drawing resources from other provinces in China and transferring the emissions outside the city. The results of our study should enable urban policy planners to better understand multi-scale policy planning and the development design of an urban ecological-economic system.

Article
Information Theory and Dynamical System Predictability
by Richard Kleeman
Entropy 2011, 13(3), 612-649; https://doi.org/10.3390/e13030612 - 07 Mar 2011
Cited by 58 | Viewed by 10527
Abstract
Predicting the future state of a turbulent dynamical system such as the atmosphere has been recognized for several decades to be an essentially statistical undertaking. Uncertainties from a variety of sources are magnified by dynamical mechanisms and, given sufficient time, compromise any prediction. In the last decade or so this process of uncertainty evolution has been studied using a variety of tools from information theory. These provide both a conceptually general view of the problem and a way of probing its non-linearity. Here we review these advances from both a theoretical and a practical perspective. Connections with other theoretical areas, such as statistical mechanics, are emphasized. The importance of obtaining practical results for prediction also guides the development presented.
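A central quantity in this information-theoretic view of predictability is the relative entropy between the forecast distribution p and the climatological distribution q,

D(p \,\|\, q) = \int p(x) \ln \frac{p(x)}{q(x)} \, dx,

which measures the information a prediction adds over climatology and decays toward zero as the forecast distribution relaxes to the climatological one.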

Article
Estimating Neuronal Information: Logarithmic Binning of Neuronal Inter-Spike Intervals
by Alan D. Dorval
Entropy 2011, 13(2), 485-501; https://doi.org/10.3390/e13020485 - 10 Feb 2011
Cited by 15 | Viewed by 9618
Abstract
Neurons communicate via the relative timing of all-or-none biophysical signals called spikes. For statistical analysis, the time between spikes can be accumulated into inter-spike interval histograms. Information theoretic measures have been estimated from these histograms to assess how information varies across organisms, neural systems, and disease conditions. Because neurons are computational units that, to the extent they process time, work not by discrete clock ticks but by the exponential decays of numerous intrinsic variables, we propose that neuronal information measures scale more naturally with the logarithm of time. For the types of inter-spike interval distributions that best describe neuronal activity, the logarithm of time enables fewer bins to capture the salient features of the distributions. Thus, discretizing the logarithm of inter-spike intervals, as compared to the inter-spike intervals themselves, yields histograms that enable more accurate entropy and information estimates for fewer bins and less data. Additionally, as distribution parameters vary, the entropy and information calculated from the logarithm of the inter-spike intervals are substantially better behaved, e.g., entropy is independent of mean rate, and information is equally affected by rate gains and divisions. Thus, when compiling neuronal data for subsequent information analysis, the logarithm of the inter-spike intervals is preferred, over the untransformed inter-spike intervals, because it yields better information estimates and is likely more similar to the construction used by nature herself.
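A minimal Python sketch of the proposed discretization is given below, comparing plug-in entropy estimates from logarithmic and linear binning of inter-spike intervals; the bin count and the synthetic log-normal ISIs are illustrative choices, not the paper's data.

import numpy as np

def isi_entropy(isis, n_bins=30, log_binning=True):
    # Plug-in entropy (bits) of an inter-spike interval histogram.
    # With log_binning=True the bin edges are uniform in log-time.
    isis = np.asarray(isis, dtype=float)
    if log_binning:
        edges = np.logspace(np.log10(isis.min()), np.log10(isis.max()), n_bins + 1)
    else:
        edges = np.linspace(isis.min(), isis.max(), n_bins + 1)
    counts, _ = np.histogram(isis, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

# Synthetic, roughly log-normal ISIs for illustration.
rng = np.random.default_rng(0)
isis = rng.lognormal(mean=-2.0, sigma=1.0, size=5000)
print(isi_entropy(isis, log_binning=True), isi_entropy(isis, log_binning=False))

These are entropies of the discretized (binned) interval distribution, which is the setting in which the paper compares the two binnings.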

Article
Information Theoretic Hierarchical Clustering
by Mehdi Aghagolzadeh, Hamid Soltanian-Zadeh and Babak Nadjar Araabi
Entropy 2011, 13(2), 450-465; https://doi.org/10.3390/e13020450 - 10 Feb 2011
Cited by 12 | Viewed by 8057
Abstract
Hierarchical clustering has been used extensively in practice, since clusters can be assigned and analyzed simultaneously, which is especially useful when estimating the number of clusters is challenging. However, because of the conventional proximity measures used in these algorithms, they are only capable of detecting mass-shaped clusters and have difficulty identifying complex data structures. Here, we introduce two bottom-up hierarchical approaches that exploit an information-theoretic proximity measure to explore the nonlinear boundaries between clusters and to capture data structure beyond second-order statistics. Experimental results on both artificial and real datasets demonstrate the superiority of the proposed algorithms over conventional and information-theoretic clustering algorithms reported in the literature, especially in detecting the true number of clusters.

Article
Information Theory in Scientific Visualization
by Chaoli Wang and Han-Wei Shen
Entropy 2011, 13(1), 254-273; https://doi.org/10.3390/e13010254 - 21 Jan 2011
Cited by 93 | Viewed by 12114
Abstract
In recent years, an emerging direction has leveraged information theory to solve many challenging problems in scientific data analysis and visualization. In this article, we review the key concepts in information theory, discuss how its principles can be useful for visualization, and provide specific examples that draw connections between data communication and data visualization in terms of how information can be measured quantitatively. As the amount of digital data available to us increases at an astounding speed, the goal of this article is to introduce interested readers to this new direction of data analysis research and to inspire them to identify new applications and seek solutions using information theory.

Article
Information Storage in Liquids with Ordered Molecular Assemblies
by Meir Shinitzky
Entropy 2011, 13(1), 1-10; https://doi.org/10.3390/e13010001 - 23 Dec 2010
Viewed by 6189
Abstract
In some unique cases, liquids can deviate from pure isotropy through the formation of ordered molecular assemblies with acquired “negative entropy” and information storage. The energy stored in such ordered domains can be combined with an independent quantitative parameter related to the degree of order, which translates the dormant information into the quantitative energetic term “information capacity”. Information storage in liquids can thus be expressed in absolute energy units. Three liquid systems are analyzed in some detail. The first is a solution of a chiral substance, e.g., an amino acid in water, where the degree of optical rotation provides the measure of order while the heat liberated upon racemization is the energy corresponding to the negative entropy. The second is a neat chiral fluid, e.g., 2-butanol, which complies with the same parameters as chiral solutions. The third is an electronically excited fluorescent solute, where the shift in the emission spectrum corresponds to the energy acquired by the transiently oriented solvent envelopes. Other, as yet unexplored, possibilities are also suggested.

Article
Incorporating Spatial Structures in Ecological Inference: An Information Theory Approach
by Rosa Bernardini Papalia
Entropy 2010, 12(10), 2171-2185; https://doi.org/10.3390/e12102171 - 14 Oct 2010
Cited by 6 | Viewed by 5891
Abstract
This paper introduces an Information Theory-based method for modeling economic aggregates and estimating their sub-group (sub-area) decomposition when no individual or sub-group data are available. This method offers a flexible framework for modeling the underlying variation in sub-group indicators, by addressing the spatial dependency problem. A basic ecological inference problem, which allows for spatial heterogeneity and dependence, is presented with the aim of first estimating the model at the aggregate level, and then of employing the estimated coefficients to obtain the sub-group level indicators.
Article
Increasing and Decreasing Returns and Losses in Mutual Information Feature Subset Selection
by Gert Van Dijck and Marc M. Van Hulle
Entropy 2010, 12(10), 2144-2170; https://doi.org/10.3390/e12102144 - 11 Oct 2010
Cited by 6 | Viewed by 8579
Abstract
Mutual information between a target variable and a feature subset is extensively used as a feature subset selection criterion. This work contributes to a more thorough understanding of the evolution of the mutual information as a function of the number of features selected. We describe decreasing returns and increasing returns behavior in sequential forward search and increasing losses and decreasing losses behavior in sequential backward search. We derive conditions under which the decreasing returns and the increasing losses behavior hold and prove the occurrence of this behavior in some Bayesian networks. The decreasing returns behavior implies that the mutual information is concave as a function of the number of features selected, whereas the increasing returns behavior implies this function is convex. The increasing returns and decreasing losses behavior are proven to occur in an XOR hypercube.
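A minimal Python sketch of the setting, assuming discrete-valued features: a greedy sequential forward search that maximizes the mutual information between the selected subset and the target, returning the per-step values so that increasing or decreasing returns can be read off directly. The plug-in estimator and helper names are assumptions of this sketch, not the paper's code.

import numpy as np

def mi_discrete(xs, y):
    # Plug-in mutual information (bits) between a tuple of discrete features and a target.
    joint = {}
    for row, t in zip(zip(*xs), y):
        joint[(row, t)] = joint.get((row, t), 0) + 1
    n = len(y)
    px, py = {}, {}
    for (row, t), c in joint.items():
        px[row] = px.get(row, 0) + c
        py[t] = py.get(t, 0) + c
    return sum(c / n * np.log2(c * n / (px[row] * py[t]))
               for (row, t), c in joint.items())

def forward_selection(features, y, k):
    # features: list of discrete feature arrays; y: target array; k: subset size.
    selected, curve = [], []
    for _ in range(k):
        best = max((f for f in range(len(features)) if f not in selected),
                   key=lambda f: mi_discrete([features[i] for i in selected + [f]], y))
        selected.append(best)
        curve.append(mi_discrete([features[i] for i in selected], y))
    return selected, curve

Concavity of the returned curve corresponds to the decreasing-returns behavior analyzed in the paper, and convexity to the increasing-returns behavior.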

Review


Review
On a Connection between Information and Group Lattices
by Hua Li and Edwin K. P. Chong
Entropy 2011, 13(3), 683-708; https://doi.org/10.3390/e13030683 - 18 Mar 2011
Cited by 14 | Viewed by 7852
Abstract
In this paper we review a particular connection between information theory and group theory. We formalize the notions of information elements and information lattices, first proposed by Shannon. Exploiting this formalization, we expose a comprehensive parallelism between information lattices and subgroup lattices. Qualitatively, isomorphisms between information lattices and subgroup lattices are demonstrated. Quantitatively, a decisive approximation relation between the entropy structures of information lattices and the log-index structures of the corresponding subgroup lattices, first discovered by Chan and Yeung, is highlighted. This approximation, addressing both joint and common entropies, extends the work of Chan and Yeung on joint entropy. A consequence of this approximation result is that any continuous law holds in general for the entropies of information elements if and only if the same law holds in general for the log-indices of subgroups. As an application, by constructing subgroup counterexamples, we find surprisingly that common information, unlike joint information, obeys neither the submodularity nor the supermodularity law. We emphasize that the notion of information elements is conceptually significant—formalizing it helps to reveal the deep connection between information theory and group theory. The parallelism established in this paper admits an appealing group-action explanation and provides useful insights into the intrinsic structure among information elements from a group-theoretic perspective.
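For concreteness, the submodularity law that joint entropy obeys, and that common information (per the result above) does not obey in general, reads, for any random variables X, Y and Z,

H(X, Y) + H(Y, Z) \ge H(X, Y, Z) + H(Y), \qquad \text{equivalently} \qquad I(X; Z \mid Y) \ge 0.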

Review
Extreme Fisher Information, Non-Equilibrium Thermodynamics and Reciprocity Relations
by Silvana Flego, Felipe Olivares, Angelo Plastino and Montserrat Casas
Entropy 2011, 13(1), 184-194; https://doi.org/10.3390/e13010184 - 14 Jan 2011
Cited by 8 | Viewed by 7372
Abstract
In employing MaxEnt, a crucial role is assigned to the reciprocity relations that relate the quantifier to be extremized (Shannon’s entropy S), the Lagrange multipliers that arise during the variational process, and the expectation values that constitute the a priori input information. We review here how these ingredients relate to each other when the information quantifier S is replaced by Fisher’s information measure I. The connection of these considerations with thermodynamics constitutes our physical background.
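For orientation, the standard Shannon-entropy reciprocity relations referred to here arise from maximizing S subject to constraints on the expectation values \langle A_k \rangle. With p(x) = Z^{-1} \exp(-\sum_k \lambda_k A_k(x)), one has

S = \ln Z + \sum_k \lambda_k \langle A_k \rangle, \qquad \frac{\partial \ln Z}{\partial \lambda_k} = -\langle A_k \rangle, \qquad \frac{\partial S}{\partial \langle A_k \rangle} = \lambda_k.

The review examines the analogous structure, and its thermodynamic reading, when S is replaced by Fisher's information measure I.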