Synergy and Redundancy Measures: Theory and Applications to Characterize Complex Systems and Shape Neural Network Representations

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 10 October 2024 | Viewed by 5526

Special Issue Editor


Dr. Daniel Chicharro
Guest Editor
Department of Computer Science, School of Science & Technology, City, University of London, London EC1V 0HB, UK
Interests: data analysis; causal inference; dimensionality reduction; neuroscience; sensitivity analysis; structure learning; information decomposition; information bottleneck

Special Issue Information

Dear Colleagues,

An important aspect of how sources of information are distributed across a set of variables concerns whether different variables provide redundant, unique, or synergistic information when combined with other variables. Intuitively, variables share redundant information if each of them individually carries the same information as the others. Information carried by a variable is unique if it is not carried by any other variable or combination of variables, and a group of variables carries synergistic information if some information arises only when they are combined.
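To make these notions concrete, here is a minimal, hedged Python sketch (the variable names and the plug-in estimator are illustrative, not part of the call): two sources that simply copy a binary target carry fully redundant information, whereas a target defined as the XOR of two independent sources can only be resolved synergistically, since each source alone is uninformative.

    import numpy as np

    def mi_bits(joint):
        """Mutual information (in bits) from a 2-D joint probability table."""
        joint = joint / joint.sum()
        px = joint.sum(axis=1, keepdims=True)
        py = joint.sum(axis=0, keepdims=True)
        nz = joint > 0
        return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

    def joint_table(a, b):
        """2x2 empirical joint distribution of two binary samples."""
        table = np.zeros((2, 2))
        np.add.at(table, (a, b), 1)
        return table

    rng = np.random.default_rng(0)
    n = 100_000

    # Redundant case: both sources are copies of the target T.
    t = rng.integers(0, 2, n)
    x1, x2 = t, t

    # Synergistic case: the target is the XOR of two independent sources.
    y1 = rng.integers(0, 2, n)
    y2 = rng.integers(0, 2, n)
    t_xor = y1 ^ y2

    print("Redundant:   I(T; X1) =", round(mi_bits(joint_table(t, x1)), 3))      # ~1 bit, same for X2
    print("Synergistic: I(T; Y1) =", round(mi_bits(joint_table(t_xor, y1)), 3))  # ~0 bits, same for Y2
    # Jointly, (Y1, Y2) determine the XOR target exactly, so I(T; Y1, Y2) = 1 bit:
    # information that arises only when the two sources are combined.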

Recent advances have contributed toward building an information-theoretic framework to determine the distribution and nature of the information extractable from multivariate data sets. Measures of redundant, unique, or synergistic information characterize dependencies between the parts of a multivariate system and can help to understand its function and mechanisms. Furthermore, these measures are useful for analyzing how information is distributed across the layers of neural networks, and can serve as cost functions that shape the structure of the data representations the networks learn.

This Special Issue welcomes contributions on advances in both the theoretical formulation and the applications of information-theoretic measures of synergy and redundancy. Topics of interest include:

  • Advances in the multivariate formulation of redundancy measures, or in the comparison of alternative proposals, addressing their respective power to capture relevant structure in both synthetic and experimental data sets;
  • Applications to understand interactions in real complex systems;
  • Advances in the estimation of information-theoretic quantities from high-dimensional data sets;
  • Applications for feature selection and sensitivity analysis;
  • Analysis of the distribution and nature of information across layers in neural networks;
  • Design of deep learning models to obtain robust or disentangled data representations.

Dr. Daniel Chicharro
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • mutual information
  • synergy
  • redundancy
  • unique information
  • neural networks
  • disentanglement
  • feature extraction
  • representation learning
  • partial information decomposition

Published Papers (6 papers)

Research

34 pages, 728 KiB  
Article
Causal Structure Learning with Conditional and Unique Information Groups-Decomposition Inequalities
by Daniel Chicharro and Julia K. Nguyen
Entropy 2024, 26(6), 440; https://doi.org/10.3390/e26060440 - 23 May 2024
Viewed by 253
Abstract
The causal structure of a system imposes constraints on the joint probability distribution of variables that can be generated by the system. Archetypal constraints consist of conditional independencies between variables. However, particularly in the presence of hidden variables, many causal structures are compatible with the same set of independencies inferred from the marginal distributions of observed variables. Additional constraints allow further testing for the compatibility of data with specific causal structures. An existing family of causally informative inequalities compares the information about a set of target variables contained in a collection of variables, with a sum of the information contained in different groups defined as subsets of that collection. While procedures to identify the form of these groups-decomposition inequalities have been previously derived, we substantially enlarge the applicability of the framework. We derive groups-decomposition inequalities subject to weaker independence conditions, with weaker requirements in the configuration of the groups, and additionally allowing for conditioning sets. Furthermore, we show how constraints with higher inferential power may be derived with collections that include hidden variables, and then converted into testable constraints using data processing inequalities. For this purpose, we apply the standard data processing inequality of conditional mutual information and derive an analogous property for a measure of conditional unique information recently introduced to separate redundant, synergistic, and unique contributions to the information that a set of variables has about a target.
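As background for the last point of the abstract, the standard data processing inequality for conditional mutual information (a textbook property stated here only for orientation; the paper's analogous property for conditional unique information is not reproduced) says that processing Y into Z without access to X cannot increase the information about X, even under conditioning on a set C:

    I(X; Z \mid C) \le I(X; Y \mid C) \quad \text{whenever } X \perp Z \mid (Y, C).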

50 pages, 652 KiB  
Article
Non-Negative Decomposition of Multivariate Information: From Minimum to Blackwell-Specific Information
by Tobias Mages, Elli Anastasiadi and Christian Rohner
Entropy 2024, 26(5), 424; https://doi.org/10.3390/e26050424 - 15 May 2024
Viewed by 426
Abstract
Partial information decompositions (PIDs) aim to categorize how a set of source variables provides information about a target variable redundantly, uniquely, or synergetically. The original proposal for such an analysis used a lattice-based approach and gained significant attention. However, finding a suitable underlying decomposition measure is still an open research question at an arbitrary number of discrete random variables. This work proposes a solution with a non-negative PID that satisfies an inclusion–exclusion relation for any f-information measure. The decomposition is constructed from a pointwise perspective of the target variable to take advantage of the equivalence between the Blackwell and zonogon order in this setting. Zonogons are the Neyman–Pearson region for an indicator variable of each target state, and f-information is the expected value of quantifying its boundary. We prove that the proposed decomposition satisfies the desired axioms and guarantees non-negative partial information results. Moreover, we demonstrate how the obtained decomposition can be transformed between different decomposition lattices and that it directly provides a non-negative decomposition of Rényi-information at a transformed inclusion–exclusion relation. Finally, we highlight that the decomposition behaves differently depending on the information measure used and how it can be used for tracing partial information flows through Markov chains.
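For orientation, the f-information referred to above is, in its standard form (a hedged background definition, not the paper's zonogon construction), the f-divergence between the joint distribution and the product of the marginals:

    I_f(X; Y) = D_f\!\left(P_{XY} \,\Vert\, P_X P_Y\right)
              = \sum_{x,y} P_X(x)\, P_Y(y)\, f\!\left(\frac{P_{XY}(x,y)}{P_X(x)\, P_Y(y)}\right),

with f convex and f(1) = 0; the choice f(t) = t log t recovers Shannon mutual information.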

24 pages, 790 KiB  
Article
A Measure of Synergy Based on Union Information
by André F. C. Gomes and Mário A. T. Figueiredo
Entropy 2024, 26(3), 271; https://doi.org/10.3390/e26030271 - 19 Mar 2024
Viewed by 889
Abstract
The partial information decomposition (PID) framework is concerned with decomposing the information that a set of (two or more) random variables (the sources) has about another variable (the target) into three types of information: unique, redundant, and synergistic. Classical information theory alone does not provide a unique way to decompose information in this manner and additional assumptions have to be made. One often overlooked way to achieve this decomposition is using a so-called measure of union information—which quantifies the information that is present in at least one of the sources—from which a synergy measure stems. In this paper, we introduce a new measure of union information based on adopting a communication channel perspective, compare it with existing measures, and study some of its properties. We also include a comprehensive critical review of characterizations of union information and synergy measures that have been proposed in the literature.
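To make the link between union information and synergy explicit, the standard two-source bookkeeping (a hedged summary of the usual PID identities, not the paper's specific channel-based measure) writes R, U_1, U_2, S for the redundant, unique, and synergistic components:

    I(T; X_1, X_2) = R + U_1 + U_2 + S, \qquad I(T; X_i) = R + U_i,
    I_{\cup}(T; X_1, X_2) = R + U_1 + U_2, \qquad S = I(T; X_1, X_2) - I_{\cup}(T; X_1, X_2),

so any definition of union information induces a synergy measure by subtraction from the joint mutual information.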

11 pages, 353 KiB  
Article
Conditioning in Tropical Probability Theory
by Rostislav Matveev and Jacobus W. Portegies
Entropy 2023, 25(12), 1641; https://doi.org/10.3390/e25121641 - 9 Dec 2023
Cited by 1 | Viewed by 791
Abstract
We define a natural operation of conditioning of tropical diagrams of probability spaces and show that it is Lipschitz continuous with respect to the asymptotic entropy distance.
17 pages, 470 KiB  
Article
Arrow Contraction and Expansion in Tropical Diagrams
by Rostislav Matveev and Jacobus W. Portegies
Entropy 2023, 25(12), 1637; https://doi.org/10.3390/e25121637 - 8 Dec 2023
Viewed by 621
Abstract
Arrow contraction applied to a tropical diagram of probability spaces is a modification of the diagram, replacing one of the morphisms with an isomorphism while preserving other parts of the diagram. It is related to the rate regions introduced by Ahlswede and Körner. In a companion article, we use arrow contraction to derive information about the shape of the entropic cone. Arrow expansion is the inverse operation to the arrow contraction.

27 pages, 491 KiB  
Article
Decomposing and Tracing Mutual Information by Quantifying Reachable Decision Regions
by Tobias Mages and Christian Rohner
Entropy 2023, 25(7), 1014; https://doi.org/10.3390/e25071014 - 30 Jun 2023
Cited by 1 | Viewed by 1260
Abstract
The idea of a partial information decomposition (PID) gained significant attention for attributing the components of mutual information from multiple variables about a target to being unique, redundant/shared or synergetic. Since the original measure for this analysis was criticized, several alternatives have been proposed but have failed to satisfy the desired axioms, an inclusion–exclusion principle or have resulted in negative partial information components. For constructing a measure, we interpret the achievable type I/II error pairs for predicting each state of a target variable (reachable decision regions) as notions of pointwise uncertainty. For this representation of uncertainty, we construct a distributive lattice with mutual information as consistent valuation and obtain an algebra for the constructed measure. The resulting definition satisfies the original axioms, an inclusion–exclusion principle and provides a non-negative decomposition for an arbitrary number of variables. We demonstrate practical applications of this approach by tracing the flow of information through Markov chains. This can be used to model and analyze the flow of information in communication networks or data processing systems.
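As a small, hedged illustration of the kind of information flow through a Markov chain mentioned in the last two abstracts (plain mutual information only; the papers' decompositions are not reimplemented, and all names are illustrative), the following Python sketch composes two binary symmetric channels and checks numerically that information can only decrease along the chain:

    import numpy as np

    def mi_bits(joint):
        """Mutual information (in bits) of a 2-D joint probability table."""
        joint = joint / joint.sum()
        px = joint.sum(axis=1, keepdims=True)
        py = joint.sum(axis=0, keepdims=True)
        nz = joint > 0
        return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

    def bsc(p_flip):
        """Transition matrix of a binary symmetric channel."""
        return np.array([[1.0 - p_flip, p_flip],
                         [p_flip, 1.0 - p_flip]])

    p_x = np.array([0.5, 0.5])          # uniform binary source X
    A = bsc(0.1)                        # channel X -> Y
    B = bsc(0.2)                        # channel Y -> Z

    joint_xy = np.diag(p_x) @ A         # P(x, y)
    joint_xz = np.diag(p_x) @ (A @ B)   # P(x, z) along the chain X -> Y -> Z

    print("I(X; Y) =", round(mi_bits(joint_xy), 4))
    print("I(X; Z) =", round(mi_bits(joint_xz), 4))  # data processing: I(X; Z) <= I(X; Y)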
