Multiscale Information Theory and the Marginal Utility of Information
Abstract

Complex systems display behavior at a range of scales. Large-scale behaviors can emerge from the correlated or dependent behavior of individual small-scale components. To capture this observation in a rigorous and general way, we introduce a formalism for multiscale information theory. Dependent behavior among system components results in overlapping or shared information. A system's structure is revealed in the sharing of information across the system's dependencies, each of which has an associated scale. Counting information according to its scale yields the quantity of scale-weighted information, which is conserved when a system is reorganized. In the interest of flexibility, we allow information to be quantified using any function that satisfies two basic axioms. Shannon information and vector space dimension are examples. We discuss two quantitative indices that summarize system structure: an existing index, the complexity profile, and a new index, the marginal utility of information. Using simple examples, we show how these indices capture the multiscale structure of complex systems in a quantitative way.
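As a minimal illustration of the idea that information can reside at different scales (a sketch of our own, not an example from the paper), consider three binary components. When the components are independent, the system carries three bits of information, each localized at scale 1; when they are perfectly correlated, it carries a single bit shared at scale 3. Shannon entropy of the joint distribution distinguishes the two cases:

```python
import itertools
import math

def shannon_entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Case A: three independent fair coins -- 3 bits total, all at scale 1.
independent = {bits: 1 / 8 for bits in itertools.product([0, 1], repeat=3)}

# Case B: three perfectly correlated coins -- 1 bit, shared at scale 3.
correlated = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}

print(shannon_entropy(independent))  # 3.0
print(shannon_entropy(correlated))   # 1.0
```

Both systems have three components, but scale-weighted counting assigns them the same total (3 bit-components): three bits each counted once versus one bit counted three times.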
Share & Cite This Article
Allen, B.; Stacey, B.C.; Bar-Yam, Y. Multiscale Information Theory and the Marginal Utility of Information. Entropy 2017, 19, 273.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.