Synergy as the Failure of Distributivity
Abstract
1. Introduction
2. Set-Theoretic Approach to Information
2.1. Basic Random Variable Operations
2.2. The Simplest Synergic System: XOR Gate
Probability | $X_1$ | $X_2$ | $Y$
---|---|---|---
1/4 | 0 | 0 | 0
0 | 0 | 0 | 1
0 | 0 | 1 | 0
1/4 | 0 | 1 | 1
0 | 1 | 0 | 0
1/4 | 1 | 0 | 1
1/4 | 1 | 1 | 0
0 | 1 | 1 | 1
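The fully synergic character of the XOR gate of Section 2.2 can be checked numerically. The sketch below (variable names and helper functions are ours, not the paper's) builds the joint distribution of two independent fair input bits and their XOR, then confirms that each input alone carries zero mutual information about the output while both inputs together determine it completely (1 bit):

```python
from itertools import product
from math import log2

# Joint distribution of the XOR gate with uniform, independent inputs:
# p(x1, x2, y) = 1/4 when y = x1 XOR x2, and 0 otherwise.
p = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}

def marginal(dist, idx):
    """Marginal distribution over the given tuple positions."""
    out = {}
    for outcome, prob in dist.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + prob
    return out

def mutual_information(dist, a, b):
    """I(A;B) in bits, with A and B given as tuples of coordinate indices."""
    pa, pb, pab = marginal(dist, a), marginal(dist, b), marginal(dist, a + b)
    return sum(
        prob * log2(prob / (pa[k[:len(a)]] * pb[k[len(a):]]))
        for k, prob in pab.items() if prob > 0
    )

print(mutual_information(p, (0,), (2,)))    # I(X1;Y)    -> 0.0
print(mutual_information(p, (1,), (2,)))    # I(X2;Y)    -> 0.0
print(mutual_information(p, (0, 1), (2,)))  # I(X1,X2;Y) -> 1.0
```

This is exactly the situation the paper's set-theoretic treatment must account for: the pairwise terms vanish, yet the joint term is maximal.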
2.3. Subdistributivity
2.4. Inclusion–Exclusion Formulas
2.5. Construction of Venn-Type Diagram for XOR Gate
2.6. Synergy as an Information Atom
3. General Trivariate Decomposition
3.1. Extended Random Variable Space
3.2. Set-Theoretic Solution
3.3. Main Result: Arbitrary Trivariate System
4. Towards a Multivariate Information Decomposition
4.1. Information Atoms Based on Part–Whole Relations
- Set-Theoretic Solution for N Variables
- XOR Gate
- N-Parity
4.2. Resolving the Partial Information Decomposition Self-Contradiction
5. Discussion
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
PID | Partial Information Decomposition |
XOR | Exclusive OR |
RV | Random variable |
Appendix A. Properties of the Extended Random Variable (RV) Space
Appendix B. Information Atoms
- Set-Theoretic Solution
- XOR Gate
[Table: binary values of f over the extended RV space for the XOR gate; column headers were lost in extraction and could not be reconstructed.]
[Table: a second two-column XOR-gate table of f values; column headers likewise lost in extraction.]
- N-Parity
- Arbitrary Trivariate System
[Table: binary values of f for the arbitrary trivariate system; column headers were lost in extraction and could not be reconstructed.]
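The N-parity system listed above generalizes the XOR gate to N inputs. A quick numerical check (our own sketch, not the paper's code) confirms that it is purely synergic at every order below N: any proper subset of the inputs is statistically independent of the parity output, while all N inputs together determine it:

```python
from itertools import combinations, product
from math import log2

def parity_joint(n):
    """Joint distribution of n fair independent bits and their parity bit."""
    return {bits + (sum(bits) % 2,): 2.0 ** -n
            for bits in product((0, 1), repeat=n)}

def mutual_information(dist, a, b):
    """I(A;B) in bits, with A and B given as tuples of coordinate indices."""
    def marg(idx):
        out = {}
        for outcome, prob in dist.items():
            key = tuple(outcome[i] for i in idx)
            out[key] = out.get(key, 0.0) + prob
        return out
    pa, pb, pab = marg(a), marg(b), marg(a + b)
    return sum(p * log2(p / (pa[k[:len(a)]] * pb[k[len(a):]]))
               for k, p in pab.items() if p > 0)

n = 4
joint = parity_joint(n)
# Any proper subset of the inputs carries zero information about the parity...
for r in range(1, n):
    for subset in combinations(range(n), r):
        assert abs(mutual_information(joint, subset, (n,))) < 1e-12
# ...while all n inputs together determine it completely (1 bit).
assert abs(mutual_information(joint, tuple(range(n)), (n,)) - 1.0) < 1e-12
print("N-parity check passed for n =", n)
```

Because every strict subset is uninformative, the entire bit of output information must be assigned to the single top-level synergy atom, which is what makes N-parity the canonical stress test for any multivariate decomposition.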
Appendix C. Partial Information Decomposition (PID)
- Monotonicity:
- Covering numbers:
- Information conservation law:
Sevostianov, I.; Feinerman, O. Synergy as the Failure of Distributivity. Entropy 2024, 26, 916. https://doi.org/10.3390/e26110916