Information Processing in Complex Systems

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (15 March 2015) | Viewed by 72408

Special Issue Editor

Dr. Rick Quax
Computational Science, Faculty of Science, University of Amsterdam, The Netherlands
Interests: information theory; statistical mechanics; complex systems; complex networks; dynamics on networks; theory of computation; theoretical computer science; formal languages; information geometry

Special Issue Information

Dear Colleagues,

All systems in nature have one thing in common: they process information. Information is registered in the state of a system and its elements, implicitly and invisibly. As elements interact, information is transferred and modified. Indeed, bits of information about the state of one element will travel—imperfectly—to the state of the other element, forming its new state. This storage, transfer, and modification of information, possibly between levels of a multi-level system, is imperfect due to randomness or noise. From this viewpoint, a system can be formalized as a collection of bits that is organized according to its rules of dynamics and its topology of interactions. Mapping out exactly how these bits of information percolate through the system could reveal new fundamental insights into how the parts orchestrate to produce the properties of the system. A theory of information processing would be capable of defining a set of universal properties of dynamical multi-level complex systems, which describe and compare the dynamics of diverse complex systems ranging from social interaction to brain networks, from financial markets to biomedicine. Each possible combination of rules of dynamics and topology of interactions, with disparate semantics, would reduce to a single language of information processing.

Dr. Rick Quax
Guest Editor
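
As a minimal illustration of this viewpoint (a sketch of our own, not part of the call; all names and parameters are invented), the snippet below treats one element's one-bit state as being copied to a neighbour through a noisy interaction, and uses mutual information to count how many of the bits actually survive the transfer:

```python
import math

def h2(p: float) -> float:
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# One element copies its one-bit state to a neighbour, but the
# interaction flips the bit with probability `noise` (a binary
# symmetric channel).  The mutual information 1 - h2(noise) is the
# number of bits per interaction that actually arrive.
for noise in (0.0, 0.1, 0.25, 0.5):
    print(f"flip probability {noise:4.2f} -> {1 - h2(noise):.3f} bits transferred")
```

At zero noise a full bit arrives; at a fifty-percent flip rate the neighbour learns nothing, which is exactly the imperfect travel of bits described above.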

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, you can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information theory
  • statistical mechanics
  • complex systems
  • complex networks
  • dynamics on networks
  • theory of computation
  • theoretical computer science
  • formal languages
  • information geometry

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (10 papers)


Research

Article
Conspiratorial Beliefs Observed through Entropy Principles
by Nataša Golo and Serge Galam
Entropy 2015, 17(8), 5611-5634; https://doi.org/10.3390/e17085611 - 4 Aug 2015
Cited by 2 | Viewed by 6381
Abstract
We propose a novel approach framed in terms of information theory and entropy to tackle the issue of the propagation of conspiracy theories. We represent the initial report of an event (such as the 9/11 terrorist attack) as a series of strings of information, each string classified by a two-state variable Ei = ±1, i = 1, …, N. If the values of the Ei are set to −1 for all strings, a state of minimum entropy is achieved. Comments on the report, focusing repeatedly on several strings Ek, may flip their meaning (from −1 to +1). The representation of the event thus turns fuzzy, with an increased entropy value. Beyond some threshold value of entropy, chosen for simplicity to be its maximum value (reached when N/2 variables have Ei = +1), a conspiracy theory may be initiated and propagated. Therefore, the evolution of the associated entropy is a way to measure the degree of penetration of a conspiracy theory. Our general framework relies on online content made voluntarily available by crowds of people, in response to news or blog articles published by official news agencies. We apply different aggregation levels (comment, person, discussion thread) and discuss the associated patterns of entropy change.
(This article belongs to the Special Issue Information Processing in Complex Systems)
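
As a toy rendering of the entropy curve the abstract describes (an i.i.d. simplification of our own, with a hypothetical N; the paper's comment/person/thread aggregations are richer), one can tabulate how the report's entropy grows as strings flip from −1 to +1:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

N = 20  # hypothetical number of information strings in the report
print(" k  entropy (bits)")
for k in range(N + 1):       # k strings have flipped from -1 to +1
    H = N * h2(k / N)        # i.i.d. model: each string is +1 with prob k/N
    print(f"{k:2d}  {H:6.2f}")
# The entropy peaks at k = N/2, the threshold the abstract associates
# with the onset of a conspiracy theory.
```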

Article
Prebiotic Competition between Information Variants, With Low Error Catastrophe Risks
by Radu Popa and Vily Marius Cimpoiasu
Entropy 2015, 17(8), 5274-5287; https://doi.org/10.3390/e17085274 - 27 Jul 2015
Cited by 1 | Viewed by 4255
Abstract
During competition for resources in primitive networks, increased fitness of an information variant does not necessarily equate with successful elimination of its competitors. If variability is added to a system quickly, speedy replacement of pre-existing and less-efficient forms of order is required as novel information variants arrive. Otherwise, the information capacity of the system fills up with information variants (an effect referred to as “error catastrophe”). As the cost of managing the system’s excess complexity increases, the correlation between the performance capabilities of information variants and their competitive success decreases, and the evolution of such systems toward increased efficiency slows down. This impasse impedes the understanding of evolution in prebiotic networks. We used the simulation platform Biotic Abstract Dual Automata (BiADA) to analyze how information variants compete in a resource-limited space. We analyzed the effect of energy-related features (differences in autocatalytic efficiency, energy cost of order, energy availability, transformation rates and stability of order) on this competition. We discuss circumstances and controllers allowing primitive networks to acquire novel information with minimal “error catastrophe” risks. We present a primitive mechanism for the maximization of energy flux in dynamic networks. This work helps evaluate controllers of evolution in prebiotic networks and other systems where information variants compete.
(This article belongs to the Special Issue Information Processing in Complex Systems)
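
The following loose sketch (ours, not the BiADA platform; every parameter is hypothetical) mimics the competition described above: information variants share a fixed resource pool, newcomers arrive at a tunable rate, and at high arrival rates the link between fitness and competitive success can weaken, in the spirit of the error catastrophe:

```python
import random

random.seed(1)

def run(arrivals, generations=500):
    """Replicator-style competition with a stream of new variants."""
    pop = {0: 1.0}              # variant id -> share of the resource pool
    fit = {0: 1.0}              # variant id -> autocatalytic efficiency
    nxt = 1
    for _ in range(generations):
        for _ in range(arrivals):               # novel variants arrive
            fit[nxt] = random.uniform(0.5, 1.5)
            pop[nxt] = 0.01                     # newcomers start small
            nxt += 1
        total = sum(pop[v] * fit[v] for v in pop)
        pop = {v: pop[v] * fit[v] / total for v in pop}   # fitness-weighted growth
        pop = {v: s for v, s in pop.items() if s > 1e-6}  # prune extinct variants
    best = max(pop, key=pop.get)
    return len(pop), fit[best] == max(fit[v] for v in pop)

for arrivals in (1, 5, 20):
    n, fittest_dominates = run(arrivals)
    print(f"arrival rate {arrivals:2d}: {n:4d} coexisting variants, "
          f"fittest currently dominates: {fittest_dominates}")
```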

Article
Quantifying Redundant Information in Predicting a Target Random Variable
by Virgil Griffith and Tracey Ho
Entropy 2015, 17(7), 4644-4653; https://doi.org/10.3390/e17074644 - 2 Jul 2015
Cited by 24 | Viewed by 5795
Abstract
We consider the problem of defining a measure of redundant information that quantifies how much common information two or more random variables specify about a target random variable. We discuss desired properties of such a measure and propose new measures with some desirable properties.
(This article belongs to the Special Issue Information Processing in Complex Systems)
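
For orientation, here is a sketch of the older Williams–Beer redundancy measure I_min, the usual baseline against which proposals in this area are compared, evaluated on the standard AND-gate example (the paper's own new measures are not reproduced here):

```python
from collections import defaultdict
from math import log2

# Joint distribution p(x1, x2, y) for the AND gate: Y = X1 AND X2,
# inputs uniform and independent.
joint = {(x1, x2, x1 & x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

def marginal(dist, idx):
    """Marginal distribution over the coordinates listed in idx."""
    m = defaultdict(float)
    for k, p in dist.items():
        m[tuple(k[i] for i in idx)] += p
    return m

def specific_info(y, src):
    """Specific information I(Y=y ; X_src) in bits."""
    py = marginal(joint, (2,))[(y,)]
    pxy = marginal(joint, (src, 2))
    px = marginal(joint, (src,))
    total = 0.0
    for (x, yy), p in pxy.items():
        if yy == y and p > 0:
            total += (p / py) * log2((p / px[(x,)]) / py)
    return total

imin = sum(marginal(joint, (2,))[(y,)]
           * min(specific_info(y, 0), specific_info(y, 1))
           for y in (0, 1))
print(f"I_min(Y; X1, X2) = {imin:.3f} bits")   # about 0.311 for the AND gate
```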

Article
Maximum Entropy Rate Reconstruction of Markov Dynamics
by Gregor Chliamovitch, Alexandre Dupuis and Bastien Chopard
Entropy 2015, 17(6), 3738-3751; https://doi.org/10.3390/e17063738 - 8 Jun 2015
Cited by 8 | Viewed by 4930
Abstract
We develop ideas proposed by Van der Straeten to extend maximum entropy principles to Markov chains. We focus in particular on the convergence of such estimates in order to explain how our approach makes possible the estimation of transition probabilities when only short samples are available, which opens the way to applications to non-stationary processes. The current work complements an earlier communication by providing numerical details, as well as a full derivation of the multi-constraint two-state and three-state maximum entropy transition matrices.
(This article belongs to the Special Issue Information Processing in Complex Systems)
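
A sketch of the central quantity for a hypothetical two-state chain (the paper's multi-constraint reconstruction is more involved): the entropy rate h = −Σᵢ πᵢ Σⱼ Pᵢⱼ log₂ Pᵢⱼ, computed from a transition matrix and its stationary distribution:

```python
import numpy as np

# Hypothetical two-state transition matrix.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

# Entropy rate in bits per step.
h = -sum(pi[i] * P[i, j] * np.log2(P[i, j])
         for i in range(2) for j in range(2) if P[i, j] > 0)
print(f"stationary distribution: {pi}, entropy rate: {h:.3f} bits/step")
```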

Article
Tail Risk Constraints and Maximum Entropy
by Donald Geman, Hélyette Geman and Nassim Nicholas Taleb
Entropy 2015, 17(6), 3724-3737; https://doi.org/10.3390/e17063724 - 5 Jun 2015
Cited by 20 | Viewed by 12602
Abstract
Portfolio selection in the financial literature has essentially been analyzed under two central assumptions: full knowledge of the joint probability distribution of the returns of the securities that will comprise the target portfolio; and investors’ preferences are expressed through a utility function. In the real world, operators build portfolios under risk constraints which are expressed both by their clients and by regulators, and which bear on the maximal loss that may be generated over a given time period at a given confidence level (the so-called Value at Risk of the position). Interestingly, in the finance literature, a serious discussion of how much or how little is known from a probabilistic standpoint about the multi-dimensional density of the assets’ returns seems to be of limited relevance. Our approach, in contrast, is to highlight these issues and then adopt throughout a framework of entropy maximization to represent the real-world ignorance of the “true” probability distributions, both univariate and multivariate, of traded securities’ returns. In this setting, we identify the optimal portfolio under a number of downside risk constraints. Two interesting results are exhibited: (i) the left-tail constraints are sufficiently powerful to override all other considerations in the conventional theory; (ii) the “barbell portfolio” (maximal certainty/low risk in one set of holdings, maximal uncertainty in another), which is quite familiar to traders, naturally emerges in our construction.
(This article belongs to the Special Issue Information Processing in Complex Systems)
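
A back-of-the-envelope rendering of the barbell logic (a simplification of our own with hypothetical numbers, not the paper's construction): if the risky leg of the portfolio may lose everything, a hard cap on the total loss immediately bounds its weight:

```python
# Barbell sketch: weight w goes into a risky asset whose loss we refuse
# to bound short of a total wipe-out; the rest earns a riskless rate r.
# To cap the portfolio loss at K with certainty, the worst case
# (risky leg -> 0) must satisfy
#   (1 - w) * (1 + r) >= 1 - K   =>   w <= 1 - (1 - K) / (1 + r)
r, K = 0.02, 0.10            # hypothetical riskless rate and loss cap
w_max = 1 - (1 - K) / (1 + r)
print(f"at most {w_max:.1%} of wealth can go into the unbounded-risk leg")
```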

Article
Information Decomposition and Synergy
by Eckehard Olbrich, Nils Bertschinger and Johannes Rauh
Entropy 2015, 17(5), 3501-3517; https://doi.org/10.3390/e17053501 - 22 May 2015
Cited by 47 | Viewed by 8536
Abstract
Recently, a series of papers addressed the problem of decomposing the information of two random variables into shared information, unique information and synergistic information. Several measures have been proposed, although no consensus has yet been reached. Here, we compare these proposals with an older approach that defines synergistic information based on projections onto exponential families containing only up to k-th order interactions. We show that these measures are not compatible with a decomposition into unique, shared and synergistic information if one requires that all terms are always non-negative (local positivity). We illustrate the difference between the two measures for multivariate Gaussians.
(This article belongs to the Special Issue Information Processing in Complex Systems)
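
The XOR gate is the canonical example of purely synergistic information, and a few lines verify it (a sketch for intuition; the paper's comparison concerns subtler decompositions and multivariate Gaussians):

```python
from collections import defaultdict
from math import log2

# Joint distribution for Y = X1 XOR X2, inputs uniform and independent.
joint = {(x1, x2, x1 ^ x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

def mi(dist, a_idx, b_idx):
    """Mutual information I(A;B) in bits for coordinate groups a_idx, b_idx."""
    pa, pb, pab = defaultdict(float), defaultdict(float), defaultdict(float)
    for k, p in dist.items():
        a = tuple(k[i] for i in a_idx)
        b = tuple(k[i] for i in b_idx)
        pa[a] += p; pb[b] += p; pab[(a, b)] += p
    return sum(p * log2(p / (pa[a] * pb[b]))
               for (a, b), p in pab.items() if p > 0)

print(f"I(X1;Y)    = {mi(joint, (0,), (2,)):.3f} bits")    # 0.000
print(f"I(X2;Y)    = {mi(joint, (1,), (2,)):.3f} bits")    # 0.000
print(f"I(X1,X2;Y) = {mi(joint, (0, 1), (2,)):.3f} bits")  # 1.000, all synergy
```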

Article
An Information-Theoretic Perspective on Coarse-Graining, Including the Transition from Micro to Macro
by Kristian Lindgren
Entropy 2015, 17(5), 3332-3351; https://doi.org/10.3390/e17053332 - 14 May 2015
Cited by 5 | Viewed by 7352
Abstract
An information-theoretic perspective on coarse-graining is presented. It starts with an information characterization of configurations at the micro-level using a local information quantity that has a spatial average equal to a microscopic entropy. With a reversible micro dynamics, this entropy is conserved. In the micro-macro transition, it is shown how this local information quantity is transformed into a macroscopic entropy, as the local states are aggregated into macroscopic concentration variables. The information loss in this transition is identified, and the connection to the irreversibility of the macro dynamics and the second law of thermodynamics is discussed. This is then connected to a process of further coarse-graining towards higher characteristic length scales in the context of chemical reaction-diffusion dynamics capable of pattern formation. On these higher levels of coarse-graining, information flows across length scales and across space are defined. These flows obey a continuity equation for information, and they are connected to the thermodynamic constraints of the system, via an outflow of information from macroscopic to microscopic levels in the form of entropy production, as well as an inflow of information, from an external free energy source, if a spatial chemical pattern is to be maintained.
(This article belongs to the Special Issue Information Processing in Complex Systems)
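
A minimal numerical rendering of the micro-to-macro information loss (a toy case of our own with independent symmetric spins, far from the paper's reaction-diffusion setting): a block of n binary spins carries n bits, while its coarse-grained concentration variable, the number of up-spins, carries only the entropy of a Binomial(n, 1/2):

```python
from math import comb, log2

def macro_entropy(n):
    """Entropy (bits) of the up-spin count of n i.i.d. fair spins."""
    probs = [comb(n, k) / 2**n for k in range(n + 1)]
    return -sum(p * log2(p) for p in probs)

print(" n  micro  macro  lost")
for n in (1, 2, 4, 8, 16):
    hm = macro_entropy(n)       # micro entropy is exactly n bits
    print(f"{n:2d}  {n:5.2f}  {hm:5.2f}  {n - hm:5.2f}")
```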

Article
AIM for Allostery: Using the Ising Model to Understand Information Processing and Transmission in Allosteric Biomolecular Systems
by Michael V. LeVine and Harel Weinstein
Entropy 2015, 17(5), 2895-2918; https://doi.org/10.3390/e17052895 - 7 May 2015
Cited by 16 | Viewed by 8976
Abstract
In performing their biological functions, molecular machines must process and transmit information with high fidelity. Information transmission requires dynamic coupling between the conformations of discrete structural components within the protein positioned far from one another on the molecular scale. This type of biomolecular “action at a distance” is termed allostery. Although allostery is ubiquitous in biological regulation and signal transduction, its treatment in theoretical models has mostly eschewed quantitative descriptions involving the system’s underlying structural components and their interactions. Here, we show how Ising models can be used to formulate an approach to allostery in a structural context of interactions between the constitutive components by building simple allosteric constructs we termed Allosteric Ising Models (AIMs). We introduce the use of AIMs in analytical and numerical calculations that relate thermodynamic descriptions of allostery to the structural context, and then show that many fundamental properties of allostery, such as the multiplicative property of parallel allosteric channels, are revealed from the analysis of such models. The power of exploring mechanistic structural models of allosteric function in more complex systems by using AIMs is demonstrated by building a model of allosteric signaling for an experimentally well-characterized asymmetric homodimer of the dopamine D2 receptor.
(This article belongs to the Special Issue Information Processing in Complex Systems)
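
A two-site sketch in the spirit of an AIM (a minimal construct of our own; the paper's models and the dopamine D2 receptor application are far richer): two spins coupled by J under a Boltzmann distribution, with the mutual information between them measuring what the coupling, the "allosteric channel", transmits:

```python
from math import exp, log2

def mutual_info(J):
    """I(s1; s2) in bits for p(s1, s2) proportional to exp(J * s1 * s2)."""
    states = [(s1, s2) for s1 in (-1, 1) for s2 in (-1, 1)]
    w = {s: exp(J * s[0] * s[1]) for s in states}
    Z = sum(w.values())
    p = {s: w[s] / Z for s in states}
    p1 = {s1: sum(p[(s1, s2)] for s2 in (-1, 1)) for s1 in (-1, 1)}
    p2 = {s2: sum(p[(s1, s2)] for s1 in (-1, 1)) for s2 in (-1, 1)}
    return sum(p[s] * log2(p[s] / (p1[s[0]] * p2[s[1]])) for s in states)

# Stronger coupling transmits more information, saturating at 1 bit.
for J in (0.0, 0.5, 1.0, 2.0):
    print(f"J = {J:3.1f}: I(s1; s2) = {mutual_info(J):.3f} bits")
```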

Article
Uncovering Discrete Non-Linear Dependence with Information Theory
by Anton Golub, Gregor Chliamovitch, Alexandre Dupuis and Bastien Chopard
Entropy 2015, 17(5), 2606-2623; https://doi.org/10.3390/e17052606 - 23 Apr 2015
Cited by 3 | Viewed by 5328
Abstract
In this paper, we model discrete time series as discrete Markov processes of arbitrary order and derive the approximate distribution of the Kullback-Leibler divergence between a known transition probability matrix and its sample estimate. We introduce two new information-theoretic measurements: information memory loss and information codependence structure. The former measures the memory content within a Markov process and determines its optimal order. The latter assesses the codependence among Markov processes. Both measurements are evaluated on toy examples and applied to high-frequency foreign exchange data, focusing on the 2008 financial crisis and the 2010/2011 Euro crisis.
(This article belongs to the Special Issue Information Processing in Complex Systems)
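
A sketch of the paper's basic object under assumptions of our own (first-order chain, two states, a hypothetical transition matrix, add-one smoothing): simulate the chain, re-estimate the transition matrix from the sample, and compute the Kullback-Leibler divergence between truth and estimate, which shrinks as the sample grows:

```python
import numpy as np

rng = np.random.default_rng(0)

P = np.array([[0.8, 0.2],       # hypothetical true transition matrix
              [0.3, 0.7]])

def simulate(n):
    """Generate a length-n sample path of the chain."""
    x = [0]
    for _ in range(n - 1):
        x.append(rng.choice(2, p=P[x[-1]]))
    return x

def kl_to_estimate(x):
    """State-weighted KL divergence between P and its sample estimate."""
    counts = np.ones((2, 2))    # add-one smoothing keeps the estimate positive
    for a, b in zip(x, x[1:]):
        counts[a, b] += 1
    Q = counts / counts.sum(axis=1, keepdims=True)
    weights = counts.sum(axis=1) / counts.sum()
    return sum(weights[i] * P[i, j] * np.log(P[i, j] / Q[i, j])
               for i in range(2) for j in range(2))

for n in (100, 1000, 10000):
    print(f"n = {n:5d}: KL(P || P_hat) = {kl_to_estimate(simulate(n)):.5f} nats")
```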

Article
Information-Theoretic Inference of Common Ancestors
by Bastian Steudel and Nihat Ay
Entropy 2015, 17(4), 2304-2327; https://doi.org/10.3390/e17042304 - 16 Apr 2015
Cited by 38 | Viewed by 7274
Abstract
A directed acyclic graph (DAG) partially represents the conditional independence structure among observations of a system if the local Markov condition holds, that is, if every variable is independent of its non-descendants given its parents. In general, there is a whole class of DAGs that represents a given set of conditional independence relations. We are interested in properties of this class that can be derived from observations of a subsystem only. To this end, we prove an information-theoretic inequality that allows for the inference of common ancestors of observed parts in any DAG representing some unknown larger system. More explicitly, we show that a large amount of dependence, in terms of mutual information, among the observations implies the existence of a common ancestor that distributes this information. Within the causal interpretation of DAGs, our result can be seen as a quantitative extension of Reichenbach’s principle of common cause to more than two variables. Our conclusions are also valid for non-probabilistic observations, such as binary strings, since we state the proof for an axiomatized notion of “mutual information” that includes the stochastic as well as the algorithmic version.
(This article belongs to the Special Issue Information Processing in Complex Systems)
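
A Reichenbach-flavoured sketch of the effect the inequality quantifies (a toy model of our own, with hypothetical flip probabilities): a hidden common ancestor Z feeds X and Y through independent noisy channels; X and Y never interact directly, yet their mutual information is positive whenever the channels are informative:

```python
from math import log2

def mi_xy(e):
    """I(X;Y) in bits when X and Y are noisy copies (flip prob e) of a fair bit Z."""
    p = {}
    for z in (0, 1):
        for x in (0, 1):
            for y in (0, 1):
                px = (1 - e) if x == z else e
                py = (1 - e) if y == z else e
                p[(x, y)] = p.get((x, y), 0.0) + 0.5 * px * py
    pm = {v: 0.5 for v in (0, 1)}   # X and Y are both marginally uniform
    return sum(q * log2(q / (pm[x] * pm[y])) for (x, y), q in p.items() if q > 0)

# Any dependence between X and Y here is entirely "distributed" by Z.
for e in (0.0, 0.1, 0.25, 0.5):
    print(f"flip prob {e:4.2f}: I(X;Y) = {mi_xy(e):.3f} bits")
```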
