
Transfer Entropy II

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (31 October 2017) | Viewed by 30506

Special Issue Editor


Guest Editor
Civil and Environmental Engineering, Georgia Institute of Technology, Atlanta, GA, USA
Interests: water resources engineering; resilient infrastructure systems; smart cities

Special Issue Information

Dear Colleagues,

Transfer entropy is an extended entropy measure built on the Shannon information entropy of the probability distributions of two variables. It has been applied to analyzing the causal and informational interactions among the variables of a complex system, and it is therefore potentially useful not only for understanding a system but also for developing models that simulate and predict its behavior. While classical techniques such as correlation analysis are widely used to identify and characterize these relationships, information-based techniques such as transfer entropy have proven superior, since they can capture directed and nonlinear dependencies that correlation cannot.
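
For reference, transfer entropy from a source process X to a target process Y is commonly defined (following Schreiber's original formulation) as the reduction in uncertainty about the target's next value gained from the source's history, beyond what the target's own history already provides; with history lengths k and l,

T_{X \to Y} = \sum p\left(y_{t+1}, y_t^{(k)}, x_t^{(l)}\right) \log \frac{p\left(y_{t+1} \mid y_t^{(k)}, x_t^{(l)}\right)}{p\left(y_{t+1} \mid y_t^{(k)}\right)}.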

In this Special Issue, we would like to collect papers focusing on both the theory and applications of entropy and transfer entropy. Topical areas include, but are not limited to: physics, Earth and space sciences, bio-ecosystems, climatology, engineering, social sciences, economics, and astronomy. Of special interest are theoretical foundations of entropy and transfer entropy in the context of Bayesian probability, physical and informational implications of entropy and its transfer, and new entropy-based methods and techniques for modelling physical, chemical and biological processes on all space and time scales.

Prof. Dr. Jinfeng Wang
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

 

Keywords

  • Transfer Entropy
  • Bayesian Probability
  • Causal and Informational Relationships
  • Physical Sciences
  • Social Sciences
  • Modelling Tools
  • Simulation and Prediction

Published Papers (6 papers)


Research

24 pages, 4193 KiB  
Article
Transfer Entropy as a Tool for Hydrodynamic Model Validation
by Alicia Sendrowski, Kazi Sadid, Ehab Meselhe, Wayne Wagner, David Mohrig and Paola Passalacqua
Entropy 2018, 20(1), 58; https://doi.org/10.3390/e20010058 - 12 Jan 2018
Cited by 17 | Viewed by 5446
Abstract
The validation of numerical models is an important component of modeling to ensure reliability of model outputs under prescribed conditions. In river deltas, robust validation of models is paramount given that models are used to forecast land change and to track water, solid, and solute transport through the deltaic network. We propose using transfer entropy (TE) to validate model results. TE quantifies the information transferred between variables in terms of strength, timescale, and direction. Using water level data collected in the distributary channels and inter-channel islands of Wax Lake Delta, Louisiana, USA, along with modeled water level data generated for the same locations using Delft3D, we assess how well couplings between external drivers (river discharge, tides, wind) and modeled water levels reproduce the observed data couplings. We perform this operation through time using ten-day windows. Modeled and observed couplings compare well; their differences reflect the spatial parameterization of wind and roughness in the model, which prevents the model from capturing high frequency fluctuations of water level. The model captures couplings better in channels than on islands, suggesting that mechanisms of channel-island connectivity are not fully represented in the model. Overall, TE serves as an additional validation tool to quantify the couplings of the system of interest at multiple spatial and temporal scales. Full article
(This article belongs to the Special Issue Transfer Entropy II)
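
The coupling analysis described above rests on estimating TE between a forcing series (discharge, tide, or wind) and a water-level series inside sliding windows. The following is a minimal sketch of that idea in Python, using a generic histogram (binned) TE estimator on synthetic data; the estimator, bin count, lag and window length are illustrative assumptions, not the authors' implementation or the Wax Lake data.

import numpy as np

def transfer_entropy(source, target, lag=1, bins=8):
    """Binned estimate of transfer entropy source -> target, in bits (history length 1)."""
    yf, yp, xp = target[lag:], target[:-lag], source[:-lag]
    joint, _ = np.histogramdd(np.column_stack([yf, yp, xp]), bins=bins)
    p = joint / joint.sum()           # p(y_future, y_past, x_past)
    p_fp = p.sum(axis=2)              # p(y_future, y_past)
    p_px = p.sum(axis=0)              # p(y_past, x_past)
    p_p = p.sum(axis=(0, 2))          # p(y_past)
    i, j, k = np.nonzero(p)
    return float(np.sum(p[i, j, k] * np.log2(
        p[i, j, k] * p_p[j] / (p_fp[i, j] * p_px[j, k]))))

# Toy driver/response pair and a sliding-window TE series (window length is illustrative).
rng = np.random.default_rng(0)
driver = rng.standard_normal(2400)
level = np.roll(driver, 3) + 0.5 * rng.standard_normal(2400)   # response lagged by 3 samples
window = 240
te_by_window = [transfer_entropy(driver[s:s + window], level[s:s + window], lag=3)
                for s in range(0, len(driver) - window + 1, window)]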

315 KiB  
Article
Minimax Estimation of Quantum States Based on the Latent Information Priors
by Takayuki Koyama, Takeru Matsuda and Fumiyasu Komaki
Entropy 2017, 19(11), 618; https://doi.org/10.3390/e19110618 - 16 Nov 2017
Cited by 5 | Viewed by 3363
Abstract
We develop priors for the Bayes estimation of quantum states that provide minimax state estimation. The relative entropy from the true density operator to a predictive density operator is adopted as the loss function. The proposed prior maximizes the conditional Holevo mutual information, and it is a quantum version of the latent information prior in classical statistics. For the one-qubit system, we provide a class of measurements that is optimal from the viewpoint of minimax state estimation. Full article
(This article belongs to the Special Issue Transfer Entropy II)

2347 KiB  
Article
Use of Mutual Information and Transfer Entropy to Assess Interaction between Parasympathetic and Sympathetic Activities of Nervous System from HRV
by Lianrong Zheng, Weifeng Pan, Yifan Li, Daiyi Luo, Qian Wang and Guanzheng Liu
Entropy 2017, 19(9), 489; https://doi.org/10.3390/e19090489 - 13 Sep 2017
Cited by 29 | Viewed by 4775
Abstract
Obstructive sleep apnea (OSA) is a common sleep disorder that is often associated with reduced heart rate variability (HRV), indicating autonomic dysfunction. HRV is mainly composed of high-frequency components attributed to parasympathetic activity and low-frequency components attributed to sympathetic activity. Although time-domain and frequency-domain features of HRV have been used in sleep studies, the complex interaction between nonlinear independent frequency components and OSA is less well known. This study included 30 electrocardiogram recordings (20 from OSA patients and 10 from healthy subjects) labeled as apnea or normal in 1-min segments. All segments were divided into three groups: the N-N group (normal segments of normal subjects), the P-N group (normal segments of OSA subjects) and the P-OSA group (apnea segments of OSA subjects). Frequency-domain indices and interaction indices were extracted from the segmented RR intervals. The frequency-domain indices included nuLF, nuHF, and the LF/HF ratio; the interaction indices included mutual information (MI) and transfer entropy (TE (H→L) and TE (L→H)). Our results demonstrated that the LF/HF ratio was significantly higher in the P-OSA group than in the N-N and P-N groups. MI was significantly larger in the P-OSA group than in the P-N group. TE (H→L) and TE (L→H) showed a significant decrease in the P-OSA group compared to the P-N and N-N groups. TE (H→L) was significantly negatively correlated with the LF/HF ratio in the P-N group (r = −0.789, p = 0.000) and the P-OSA group (r = −0.661, p = 0.002). Our results indicate that MI and TE are powerful tools for evaluating sympathovagal modulation in OSA. Moreover, sympathovagal modulation is more imbalanced in OSA patients during apnea events than during event-free periods. Full article
(This article belongs to the Special Issue Transfer Entropy II)
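
As a companion to the directional TE pair above, mutual information between low- and high-frequency components can be estimated in the same binned fashion. The sketch below is generic Python, not the authors' HRV pipeline; the lf_power and hf_power arrays are hypothetical stand-ins for band powers extracted from RR intervals elsewhere.

import numpy as np

def mutual_information(x, y, bins=16):
    """Binned estimate of mutual information I(X;Y), in bits."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)    # marginal of x
    p_y = p_xy.sum(axis=0, keepdims=True)    # marginal of y
    nz = p_xy > 0
    return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x * p_y)[nz])))

# Hypothetical per-segment LF and HF power series (placeholders for values derived from RR intervals).
rng = np.random.default_rng(1)
lf_power = rng.gamma(2.0, 1.0, 600)
hf_power = 0.6 * lf_power + rng.gamma(2.0, 1.0, 600)
print(mutual_information(lf_power, hf_power))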

418 KiB  
Article
Use of Information Measures and Their Approximations to Detect Predictive Gene-Gene Interaction
by Jan Mielniczuk and Marcin Rdzanowski
Entropy 2017, 19(1), 23; https://doi.org/10.3390/e19010023 - 07 Jan 2017
Cited by 5 | Viewed by 5417
Abstract
We reconsider the properties and relationships of the interaction information and its modified versions in the context of detecting the interaction of two SNPs for the prediction of a binary outcome when the interaction information is positive. This property is called predictive interaction, and we state some new sufficient conditions for it to hold true. We also study chi-square approximations to these measures. It is argued that interaction information is a different and sometimes more natural measure of interaction than the logistic interaction parameter, especially when the SNPs are dependent. We introduce a novel measure of predictive interaction based on interaction information and its modified version. In numerical experiments, which use copulas to model dependence, we study examples in which the logistic interaction parameter is zero or close to zero and predictive interaction is detected by the new measure while remaining undetected by the likelihood ratio test. Full article
(This article belongs to the Special Issue Transfer Entropy II)
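
Under one common convention, the interaction information referred to here is I(X1, X2; Y) - I(X1; Y) - I(X2; Y), with positive values indicating predictive interaction. The Python sketch below computes that quantity from discrete SNP codings (0/1/2) and a binary outcome on synthetic data; it illustrates the measure itself, not the paper's estimators or its copula-based experiments.

import numpy as np

def joint_entropy(*columns):
    """Joint Shannon entropy, in bits, of discrete variables given as integer arrays."""
    _, counts = np.unique(np.column_stack(columns), axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def interaction_information(x1, x2, y):
    # II = I(X1,X2;Y) - I(X1;Y) - I(X2;Y), expanded in joint entropies
    i_pair = joint_entropy(x1, x2) + joint_entropy(y) - joint_entropy(x1, x2, y)
    i_1 = joint_entropy(x1) + joint_entropy(y) - joint_entropy(x1, y)
    i_2 = joint_entropy(x2) + joint_entropy(y) - joint_entropy(x2, y)
    return i_pair - i_1 - i_2

# Synthetic SNP pair whose combination carries more predictive information than either SNP alone.
rng = np.random.default_rng(2)
snp1, snp2 = rng.integers(0, 3, 5000), rng.integers(0, 3, 5000)
outcome = ((snp1 + snp2) % 2 == 0).astype(int)
print(interaction_information(snp1, snp2, outcome))   # positive -> predictive interaction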

2100 KiB  
Article
Methodology for Simulation and Analysis of Complex Adaptive Supply Network Structure and Dynamics Using Information Theory
by Joshua Rodewald, John Colombi, Kyle Oyama and Alan Johnson
Entropy 2016, 18(10), 367; https://doi.org/10.3390/e18100367 - 18 Oct 2016
Cited by 6 | Viewed by 5109
Abstract
Supply networks in many industries today can behave as complex adaptive systems, which makes them more difficult to analyze and assess. Fully understanding both the static and dynamic structures of a complex adaptive supply network (CASN) is key to making more informed management decisions and to prioritizing resources and production throughout the network. Previous efforts to model and analyze CASNs have been impeded by the complex, dynamic nature of these systems. However, drawing from other complex adaptive systems sciences, information theory provides a model-free methodology that removes many of those barriers, especially concerning complex network structure and dynamics. With minimal information about the network nodes, transfer entropy can be used to reverse engineer the network structure, while local transfer entropy can be used to analyze the dynamics of that structure. Both simulated and real-world networks were analyzed using this methodology. Applying the methodology to CASNs allows the practitioner to capitalize on observations from the highly multidisciplinary field of information theory, which provides insights into a CASN's self-organization, emergence, stability/instability, and distributed computation. This not only gives managers a more thorough understanding of a system's structure and dynamics, but also opens up research opportunities into eventual strategies to monitor and manage emergence and adaptation within the environment. Full article
(This article belongs to the Special Issue Transfer Entropy II)
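
The reverse-engineering step described above amounts to computing TE for every ordered pair of node observables and keeping the links that stand out. A minimal sketch follows; it assumes the binned transfer_entropy() function sketched for the hydrodynamic-validation paper earlier in this listing, and it uses a fixed illustrative threshold in place of the surrogate or significance testing a real analysis would require.

import numpy as np
# assumes transfer_entropy(source, target, lag, bins) as defined in the earlier sketch

def infer_structure(series, lag=1, threshold=0.05):
    """Pairwise TE over the columns of `series` (time x nodes); edges where TE exceeds the threshold."""
    n_nodes = series.shape[1]
    te = np.zeros((n_nodes, n_nodes))
    for i in range(n_nodes):
        for j in range(n_nodes):
            if i != j:
                te[i, j] = transfer_entropy(series[:, i], series[:, j], lag=lag)
    return te, te > threshold

# Toy three-node chain: node 0 drives node 1, which in turn drives node 2.
rng = np.random.default_rng(3)
x = rng.standard_normal((2000, 3))
x[1:, 1] += 0.8 * x[:-1, 0]
x[1:, 2] += 0.8 * x[:-1, 1]
te_matrix, adjacency = infer_structure(x, lag=1)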

2252 KiB  
Article
Inferring Weighted Directed Association Networks from Multivariate Time Series with the Small-Shuffle Symbolic Transfer Entropy Spectrum Method
by Yanzhu Hu, Huiyang Zhao and Xinbo Ai
Entropy 2016, 18(9), 328; https://doi.org/10.3390/e18090328 - 07 Sep 2016
Cited by 5 | Viewed by 5692
Abstract
Complex network methodology is very useful for exploring complex systems. However, the relationships among the variables in complex systems are usually not clear, so inferring association networks among variables from their observed data has become a popular research topic. We propose a method, named the small-shuffle symbolic transfer entropy spectrum (SSSTES), for inferring association networks from multivariate time series. The method addresses four problems in inferring association networks: strong correlation identification, correlation quantification, direction identification and temporal relation identification. The method can be divided into four layers. The first layer is the data layer, where data are input and preprocessed. In the second layer, we symbolize the model data, original data and shuffled data from the previous layer and calculate transfer entropy at different time lags for each pair of time series variables. In the third layer, we compose transfer entropy spectra for pairwise time series from the previous layer's output, a list of transfer entropy matrices, and identify the correlation level between variables. In the last layer, we build a weighted adjacency matrix, the value of each entry representing the correlation level between pairwise variables, and then obtain the weighted directed association network. Three sets of numerically simulated data, from a linear system, a nonlinear system and a coupled Rössler system, are used to show how the proposed approach works. Finally, we apply SSSTES to a real industrial system and obtain a better result than with two other methods. Full article
(This article belongs to the Special Issue Transfer Entropy II)
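
The symbolization and lag-scanning steps can be illustrated with ordinal patterns, a common way of symbolizing a series, and a loop over lags. This is a sketch of the general idea only; it reuses the binned transfer_entropy() function from the earlier sketch and omits the small-shuffle surrogate comparison that gives the SSSTES method its name.

import numpy as np
from itertools import permutations
from math import factorial
# assumes transfer_entropy(source, target, lag, bins) as defined in the earlier sketch

def symbolize(x, order=3):
    """Map a series to ordinal-pattern symbols of embedding order `order`."""
    codes = {p: i for i, p in enumerate(permutations(range(order)))}
    windows = np.lib.stride_tricks.sliding_window_view(x, order)
    return np.array([codes[tuple(np.argsort(w))] for w in windows])

def symbolic_te_spectrum(source, target, max_lag=10, order=3):
    """Symbolic transfer entropy from source to target for lags 1..max_lag."""
    s, t = symbolize(np.asarray(source), order), symbolize(np.asarray(target), order)
    return [transfer_entropy(s, t, lag=lag, bins=factorial(order))
            for lag in range(1, max_lag + 1)]

In the paper, each lag's value would additionally be compared against the spectrum obtained from small-shuffle surrogate data to decide whether a coupling at that lag is genuine.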
