
Information in Dynamical Systems and Complex Systems

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Complexity".

Deadline for manuscript submissions: closed (28 February 2014) | Viewed by 65941

Special Issue Editors

Department of Mathematics, Clarkson University, Potsdam, NY 13699-5815, USA
Interests: dynamical systems; chaos theory; control of chaos; time-series analysis; Frobenius-Perron operators; stochastic dynamical systems; measurable dynamics; symbol dynamics; connections to information theory; image processing; data assimilation; connections between models and observed data; complex and networked coupled systems
Department of Mathematics, Clarkson University, Potsdam, NY 13699-5815, USA
Interests: dynamical systems; complex networks; information theory; time series analysis

Special Issue Information

Dear Colleagues,

On July 18–19, 2013, a workshop entitled "Information in Dynamical Systems and Complex Systems" was held in Burlington, VT, organized by Erik M. Bollt and Jie Sun (Clarkson University). Invited attendees were Erik Bollt, Ned Corron (U.S. Army), James Crutchfield (University of California, Davis), David Feldman (College of the Atlantic), Adom Giffin (Clarkson University), Kevin Knuth (University at Albany, SUNY), Ying-Cheng Lai (Arizona State University), John Mahoney (University of California, Merced), Konstantin Mischaikow (Rutgers University), Edward Ott (University of Maryland, College Park), Milan Paluš (Academy of Sciences of the Czech Republic), Shawn Pethel (U.S. Army), Maurizio Porfiri (Polytechnic Institute of New York University), Samuel Stanton (U.S. Army), Jie Sun and James Yorke (University of Maryland, College Park).

This special issue of Entropy offers a venue to collect some of the synergy, consensus and collective thoughts on the themes stated for the workshop. The topics and themes of the workshop follow below, and participants are invited to submit papers summarizing the collective discussions presented.

To that end, submissions should take one of the following forms:

  • Research articles related to presentations given at the workshop.
  • Research articles related to the workshop themes stated below.
  • Commentaries on future directions of information and complexity in large-scale systems, as related to the themes below.
  • Commentaries regarding connections and contrasts among the directions below.
  • Commentaries on wisdom, from experience and theory, concerning misuses of concepts and tools from this area (the so-called "stop the insanity" thoughts).
  • Discussions of other related themes connecting to the broader ones, such as connections between observer-based and intrinsic information and its flow.

Workshop themes were stated as follows.  Given the modern focus of dynamical systems on coupled oscillators that form complex networks, it is important to move forward and explore these problems from the perspective of information content, flow and causality. The following general areas are central in this endeavor:

Information flow. Transfer entropy, in particular, has gained a great deal of interest in recent years: future states may be conditioned on past states, both with and without access to other stochastic processes, with the Kullback–Leibler divergence serving as the test. Recent work suggests, however, that transfer entropy can be misinterpreted when used for causality inference.
Causality, and information signatures of causation. A central question in science is what causes outcomes of interest, and for affecting and controlling outcomes this question is even more important. From the perspective of information flow, causation and causal inference become particularly sharp.
Symmetries and reversibility may be exploited in special circumstances to enlighten the understanding of causality, structure, and clustering.
Scales and hierarchies lead to an understanding of relevance and of nested topological partitions when defining the scale at which information symbols are defined.
Variational and optimization principles of information in physics, in particular maximum entropy principles that lead to an understanding of underlying laws.
Randomness, structure and causality. In some sense randomness may be described as external and unmodelled effects, which we may interpret in the present context as "unknown information." This leads to:
Hidden states and hidden processes, including methods such as hidden Markov models and, more generally, Bayesian inference methods. In the context of the information content of a dynamical system, such a perspective should yield better understanding.
Measures and metrics of complexity and information content. The phrase "complexity" is commonly used for a wide variety of systems, behaviors and processes, and yet a commonly agreed description of what the phrase means is lacking.
Physical laws as information filters or algorithms. Since physical laws lead to evolution equations, which from the perspective of this discussion define evolution from one information state to a new one, physical laws may be described either as algorithms or as information filters that translate states.
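To make the information-flow theme concrete, a minimal plug-in estimator of (order-1) transfer entropy for symbol sequences can be sketched as below. This is a generic textbook-style sketch under illustrative assumptions, not code from any particular contribution; the toy sequences are constructed so that information flows from X to Y by design.

```python
# Plug-in, order-1 transfer entropy T_{X->Y} for symbol sequences.
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """T_{X->Y} = sum p(y1, y0, x0) * log2[ p(y1 | y0, x0) / p(y1 | y0) ]."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    singles = Counter(y[:-1])                       # y_t
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_full = c / pairs_yx[(y0, x0)]             # p(y_{t+1} | y_t, x_t)
        p_self = pairs_yy[(y1, y0)] / singles[y0]   # p(y_{t+1} | y_t)
        te += (c / n) * log2(p_full / p_self)
    return te

# y copies x with a one-step delay, so information flows from X to Y.
x = [0, 1, 1, 0, 1, 0, 0, 1] * 50
y = [0] + x[:-1]
te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
```

Here te_xy dominates te_yx, matching the constructed direction of influence; the misinterpretation risks noted above arise because te_yx is generally nonzero even when Y never drives X.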
Some questions to consider:
Can we develop a general mechanistic description of what renders a real complex system different from a large but perhaps simpler system (particularly from an information theory perspective)?
Can physical laws be defined in an algorithmic and information theoretic manner?
Identify engineering applications, especially those that benefit directly from information-theoretic perspectives and methods.
How can this perspective impact design?
Can specific control methods be developed that benefit?
Are piecewise impulsive systems from mechanical as well as electronic engineering design particularly well suited?
Can group behaviors and cooperative behaviors, such as those of animals and humans, be better understood in terms of information-theoretic descriptions? What role do hierarchical structures play?
Can synchronization be understood as the counterpoint to complex behavior?
Can methods designed to identify causal influences be adapted to further adjust and define control strategies for complex systems in biological, social, physical and engineering contexts?
Is there a minimal information description of a dynamical system that will facilitate engineering design? Does an approximate description of the formal language suffice for approximate modeling, leading to faster and easier design?

Discuss the validity of popular approaches that use information and entropy measures as system probes and for change detection, damage detection and system health monitoring.

Prof. Dr. Erik M. Bollt
Dr. Jie Sun
Guest Editors

Published Papers (10 papers)


Editorial

Editorial
Editorial Comment on the Special Issue of “Information in Dynamical Systems and Complex Systems”
by Erik M. Bollt and Jie Sun
Entropy 2014, 16(9), 5068-5077; https://doi.org/10.3390/e16095068 - 23 Sep 2014
Cited by 7 | Viewed by 4673
Abstract
This special issue collects contributions from the participants of the "Information in Dynamical Systems and Complex Systems" workshop, which cover a wide range of important problems and new approaches that lie at the intersection of information theory and dynamical systems. The contributions include theoretical characterization and understanding of the different types of information flow and causality in general stochastic processes, inference and identification of coupling structure and parameters of system dynamics, rigorous coarse-grained modeling of network dynamical systems, and exact statistical testing of fundamental information-theoretic quantities such as the mutual information. The collective efforts reported herein reflect a modern perspective of the intimate connection between dynamical systems and information flow, leading to the promise of better understanding and modeling of natural complex systems and better/optimal design of engineering systems. Full article
(This article belongs to the Special Issue Information in Dynamical Systems and Complex Systems)

Research

Article
Cross-Scale Interactions and Information Transfer
by Milan Paluš
Entropy 2014, 16(10), 5263-5289; https://doi.org/10.3390/e16105263 - 10 Oct 2014
Cited by 27 | Viewed by 7762
Abstract
An information-theoretic approach for detecting interactions and information transfer between two systems is extended to interactions between dynamical phenomena evolving on different time scales of a complex, multiscale process. The approach is demonstrated in the detection of an information transfer from larger to smaller time scales in a model multifractal process and applied in a study of cross-scale interactions in atmospheric dynamics. Applying a form of the conditional mutual information and a statistical test based on the Fourier transform and multifractal surrogate data to roughly century-long records of daily mean surface air temperature from various European locations, an information transfer from larger to smaller time scales has been observed as the influence of the phase of slow oscillatory phenomena with periods around 6–11 years on the amplitudes of the variability characterized by the smaller temporal scales from a few months to 4–5 years. These directed cross-scale interactions have a non-negligible effect on interannual air temperature variability in a large area of Europe. Full article
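The Fourier-transform surrogate construction underlying such statistical tests can be sketched as follows. This is the generic phase-randomization recipe (the multifractal surrogates used in the paper are a further refinement not shown here); the test signal and sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def ft_surrogate(x, rng):
    """Phase-randomized surrogate: preserves the power spectrum (linear
    autocorrelation structure) while destroying cross-scale phase relations."""
    n = len(x)
    spec = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spec))
    new = np.abs(spec) * np.exp(1j * phases)
    new[0] = spec[0]                  # keep the DC (mean) bin exactly
    if n % 2 == 0:
        new[-1] = spec[-1]            # the Nyquist bin must stay real
    return np.fft.irfft(new, n)

# An oscillation plus noise; the surrogate keeps its spectrum exactly.
t = np.linspace(0.0, 20.0 * np.pi, 512)
x = np.sin(t) + 0.1 * rng.normal(size=512)
s = ft_surrogate(x, rng)
```

A statistic such as the conditional mutual information is then compared between the data and an ensemble of such surrogates; values outside the surrogate distribution indicate structure beyond the linear null model.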

Article
Simultaneous State and Parameter Estimation Using Maximum Relative Entropy with Nonhomogenous Differential Equation Constraints
by Adom Giffin and Renaldas Urniezius
Entropy 2014, 16(9), 4974-4991; https://doi.org/10.3390/e16094974 - 17 Sep 2014
Cited by 12 | Viewed by 5629
Abstract
In this paper, we continue our efforts to show how maximum relative entropy (MrE) can be used as a universal updating algorithm. Here, our purpose is to tackle a joint state and parameter estimation problem where our system is nonlinear and in a non-equilibrium state, i.e., perturbed by varying external forces. Traditional parameter estimation can be performed by using filters, such as the extended Kalman filter (EKF). However, as shown with a toy example of a system with first order non-homogeneous ordinary differential equations, assumptions made by the EKF algorithm (such as the Markov assumption) may not be valid. The problem can be solved with exponential smoothing, e.g., exponentially weighted moving average (EWMA). Although this has been shown to produce acceptable filtering results in real exponential systems, it still cannot simultaneously estimate both the state and its parameters and has its own assumptions that are not always valid, for example when jump discontinuities exist. We show that by applying MrE as a filter, we can not only develop the closed form solutions, but we can also infer the parameters of the differential equation simultaneously with the means. This is useful in real, physical systems, where we want to not only filter the noise from our measurements, but we also want to simultaneously infer the parameters of the dynamics of a nonlinear and non-equilibrium system. Although there were many assumptions made throughout the paper to illustrate that EKF and exponential smoothing are special cases of MrE, we are not "constrained" by these assumptions. In other words, MrE is completely general and can be used in broader ways. Full article
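The exponentially weighted moving average mentioned in the abstract is simple to state; a minimal sketch follows, with an illustrative smoothing factor and toy data (not values from the paper).

```python
# EWMA exponential smoothing: s_t = alpha * x_t + (1 - alpha) * s_{t-1}.
def ewma(samples, alpha=0.3):
    out = []
    s = samples[0]                       # seed with the first observation
    for x in samples:
        s = alpha * x + (1.0 - alpha) * s
        out.append(s)
    return out

noisy = [1.0, 1.4, 0.7, 1.1, 0.9, 1.3, 0.8, 1.2]
smooth = ewma(noisy)
```

The smoothed series fluctuates less than the raw one, but, as the abstract notes, a fixed smoothing rule of this kind cannot simultaneously estimate the state and the parameters of the underlying dynamics.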

Article
Information Anatomy of Stochastic Equilibria
by Sarah Marzen and James P. Crutchfield
Entropy 2014, 16(9), 4713-4748; https://doi.org/10.3390/e16094713 - 25 Aug 2014
Cited by 14 | Viewed by 6055
Abstract
A stochastic nonlinear dynamical system generates information, as measured by its entropy rate. Some—the ephemeral information—is dissipated and some—the bound information—is actively stored and so affects future behavior. We derive analytic expressions for the ephemeral and bound information in the limit of infinitesimal time discretization for two classical systems that exhibit dynamical equilibria: first-order Langevin equations (i) where the drift is the gradient of an analytic potential function and the diffusion matrix is invertible and (ii) with a linear drift term (Ornstein–Uhlenbeck), but a noninvertible diffusion matrix. In both cases, the bound information is sensitive to the drift and diffusion, while the ephemeral information is sensitive only to the diffusion matrix and not to the drift. Notably, this information anatomy changes discontinuously as any of the diffusion coefficients vanishes, indicating that it is very sensitive to the noise structure. We then calculate the information anatomy of the stochastic cusp catastrophe and of particles diffusing in a heat bath in the overdamped limit, both examples of stochastic gradient descent on a potential landscape. Finally, we use our methods to calculate and compare approximations for the time-local predictive information for adaptive agents. Full article
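The linear-drift (Ornstein–Uhlenbeck) Langevin equation discussed above can be simulated with a basic Euler–Maruyama scheme; the sketch below uses illustrative parameter values and checks the simulated stationary variance against the analytic one.

```python
import numpy as np

rng = np.random.default_rng(1)

# Euler–Maruyama integration of dx = -theta * x dt + sigma dW.
def simulate_ou(theta=1.0, sigma=0.5, dt=1e-3, steps=200_000, x0=0.0):
    x = np.empty(steps)
    x[0] = x0
    kicks = sigma * rng.normal(0.0, np.sqrt(dt), steps - 1)
    for t in range(steps - 1):
        x[t + 1] = x[t] - theta * x[t] * dt + kicks[t]
    return x

x = simulate_ou()
# The stationary variance of this process is sigma^2 / (2 * theta) = 0.125;
# drop an initial transient before estimating it.
est_var = x[50_000:].var()
```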

Article
Identifying Chaotic FitzHugh–Nagumo Neurons Using Compressive Sensing
by Ri-Qi Su, Ying-Cheng Lai and Xiao Wang
Entropy 2014, 16(7), 3889-3902; https://doi.org/10.3390/e16073889 - 15 Jul 2014
Cited by 15 | Viewed by 6438
Abstract
We develop a completely data-driven approach to reconstructing coupled neuronal networks that contain a small subset of chaotic neurons. Such chaotic elements can be the result of parameter shift in their individual dynamical systems and may lead to abnormal functions of the network. Accurately identifying the chaotic neurons may thus be necessary and important, for example, for applying appropriate controls to bring the network to a normal state. However, due to couplings among the nodes, the measured time series, even from non-chaotic neurons, would appear random, rendering inapplicable traditional nonlinear time-series analysis, such as the delay-coordinate embedding method, which yields information about the global dynamics of the entire network. Our method is based on compressive sensing. In particular, we demonstrate that identifying chaotic elements can be formulated as a general problem of reconstructing the nodal dynamical systems, network connections and all coupling functions, as well as their weights. The working and efficiency of the method are illustrated by using networks of non-identical FitzHugh–Nagumo neurons with randomly-distributed coupling weights. Full article
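The l1-regularized least-squares problem at the core of compressive sensing can be solved, for illustration, by iterative soft thresholding (ISTA). The sketch below is generic; the problem sizes, sparsity pattern and regularization weight are assumptions for the example, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def ista(A, y, lam=0.05, iters=1000):
    """Iterative soft thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - A.T @ (A @ x - y) / L           # gradient step on the quadratic
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrinkage
    return x

n, m = 100, 40                                  # 100 unknowns, only 40 measurements
x_true = np.zeros(n)
x_true[[7, 23, 61]] = [1.5, -2.0, 1.0]          # a 3-sparse coefficient vector
A = rng.normal(size=(m, n)) / np.sqrt(m)        # random measurement matrix
y = A @ x_true
x_hat = ista(A, y)
```

Because the true coefficient vector is sparse, far fewer measurements than unknowns suffice to recover its support, which is the property the reconstruction method above exploits.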

Article
Identifying the Coupling Structure in Complex Systems through the Optimal Causation Entropy Principle
by Jie Sun, Carlo Cafaro and Erik M. Bollt
Entropy 2014, 16(6), 3416-3433; https://doi.org/10.3390/e16063416 - 20 Jun 2014
Cited by 35 | Viewed by 5919
Abstract
Inferring the coupling structure of complex systems from time series data in general by means of statistical and information-theoretic techniques is a challenging problem in applied science. The reliability of statistical inferences requires the construction of suitable information-theoretic measures that take into account both direct and indirect influences, manifest in the form of information flows, between the components within the system. In this work, we present an application of the optimal causation entropy (oCSE) principle to identify the coupling structure of a synthetic biological system, the repressilator. Specifically, when the system reaches an equilibrium state, we use a stochastic perturbation approach to extract time series data that approximate a linear stochastic process. Then, we present and jointly apply the aggregative discovery and progressive removal algorithms based on the oCSE principle to infer the coupling structure of the system from the measured data. Finally, we show that the success rate of our coupling inferences not only improves with the amount of available data, but it also increases with a higher frequency of sampling and is especially immune to false positives. Full article
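In the Gaussian (linear stochastic) setting described above, conditional mutual information reduces to log-determinants of covariance submatrices, which makes a toy version of the aggregative-discovery step easy to sketch. The threshold, network and function names below are illustrative assumptions, not the paper's algorithm verbatim.

```python
import numpy as np

rng = np.random.default_rng(3)

def gaussian_cmi(data, i, j, cond):
    """I(X_i; X_j | X_cond) for (approximately) jointly Gaussian columns,
    computed from log-determinants of empirical covariance submatrices."""
    def h(idx):                        # differential entropy up to constants
        if not idx:
            return 0.0
        c = np.cov(data[:, idx], rowvar=False).reshape(len(idx), len(idx))
        return 0.5 * np.log(np.linalg.det(c))
    cond = list(cond)
    return h([i] + cond) + h([j] + cond) - h([i, j] + cond) - h(cond)

def discover_parents(past, future_col, thresh=0.01):
    """Greedy aggregative step: keep adding the past variable with the
    largest conditional mutual information with the target's future."""
    data = np.hstack([past, future_col])
    fut = past.shape[1]                # column index of the target's future
    parents = []
    while True:
        rest = [j for j in range(past.shape[1]) if j not in parents]
        if not rest:
            return sorted(parents)
        gains = {j: gaussian_cmi(data, fut, j, parents) for j in rest}
        best = max(gains, key=gains.get)
        if gains[best] < thresh:
            return sorted(parents)
        parents.append(best)

# Toy linear stochastic system: node 2's future is driven by nodes 0 and 1.
T = 4000
X = rng.normal(size=(T, 3))
future2 = 0.8 * X[:, 0] - 0.6 * X[:, 1] + 0.3 * rng.normal(size=T)
parents = discover_parents(X, future2[:, None])
```

The conditioning on already-selected parents is what screens out indirect influences; node 2, which is statistically independent of the target's future, never clears the threshold.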

Article
Coarse Dynamics for Coarse Modeling: An Example From Population Biology
by Justin Bush and Konstantin Mischaikow
Entropy 2014, 16(6), 3379-3400; https://doi.org/10.3390/e16063379 - 19 Jun 2014
Cited by 9 | Viewed by 4631
Abstract
Networks have become a popular way to concisely represent complex nonlinear systems where the interactions and parameters are imprecisely known. One challenge is how best to describe the associated dynamics, which can exhibit complicated behavior sensitive to small changes in parameters. A recently developed computational approach that we refer to as a database for dynamics provides a robust and mathematically rigorous description of global dynamics over large ranges of parameter space. To demonstrate the potential of this approach we consider two classical age-structured population models that share the same network diagram and have a similar nonlinear overcompensatory term, but nevertheless yield different patterns of qualitative behavior as a function of parameters. Using a generalization of these models we relate the different structure of the dynamics that are observed in the context of biologically relevant questions such as stable oscillations in populations, bistability, and permanence. Full article
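A minimal example of the kind of age-structured model with an overcompensatory (Ricker-type) recruitment term discussed above can be sketched as follows; the parameter values and two-class structure are illustrative only, not the paper's models.

```python
import math

# Two age classes: adults produce recruits whose number falls off
# exponentially with total population (overcompensation), and juveniles
# survive into the adult class at the next step.
def step(juveniles, adults, fecundity=5.0, survival=0.8, crowding=0.01):
    total = juveniles + adults
    recruits = fecundity * adults * math.exp(-crowding * total)
    return recruits, survival * juveniles

j, a = 10.0, 10.0
history = []
for _ in range(500):
    j, a = step(j, a)
    history.append(j + a)
```

Depending on fecundity and survival, models of this form can settle to equilibrium, oscillate stably, or behave chaotically, which is exactly the parameter-dependent variety of global dynamics the database-for-dynamics approach is designed to catalogue.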

Article
Exact Test of Independence Using Mutual Information
by Shawn D. Pethel and Daniel W. Hahs
Entropy 2014, 16(5), 2839-2849; https://doi.org/10.3390/e16052839 - 23 May 2014
Cited by 21 | Viewed by 7755
Abstract
Using a recently discovered method for producing random symbol sequences with prescribed transition counts, we present an exact null hypothesis significance test (NHST) for mutual information between two random variables, the null hypothesis being that the mutual information is zero (i.e., independence). The exact tests reported in the literature assume that data samples for each variable are sequentially independent and identically distributed (iid). In general, time series data have dependencies (Markov structure) that violate this condition. The algorithm given in this paper is the first exact significance test of mutual information that takes into account the Markov structure. When the Markov order is not known or indefinite, an exact test is used to determine an effective Markov order. Full article
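For contrast with the exact test, the common shuffle-based permutation approximation of the null distribution is easy to sketch. Note that plain shuffling, unlike the paper's exact method, does not preserve Markov transition counts; the toy sequences below are illustrative.

```python
import random
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in mutual information of two equal-length symbol sequences."""
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * log2(c * n / (px[a] * py[b]))
               for (a, b), c in joint.items())

def permutation_pvalue(xs, ys, trials=200, seed=0):
    """Shuffle-null p-value for the hypothesis that the MI is zero."""
    rng = random.Random(seed)
    observed = mutual_information(xs, ys)
    hits = 0
    for _ in range(trials):
        perm = ys[:]
        rng.shuffle(perm)              # destroys dependence, keeps marginals
        if mutual_information(xs, perm) >= observed:
            hits += 1
    return (hits + 1) / (trials + 1)

xs = [0, 1] * 100
ys = xs[:]                             # perfectly dependent: MI is 1 bit
p = permutation_pvalue(xs, ys)
```

Because shuffling also destroys any within-sequence Markov structure, this approximation can be anti-conservative on time series, which is the gap the exact test above closes.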

Article
Infinite Excess Entropy Processes with Countable-State Generators
by Nicholas F. Travers and James P. Crutchfield
Entropy 2014, 16(3), 1396-1413; https://doi.org/10.3390/e16031396 - 10 Mar 2014
Cited by 19 | Viewed by 5869
Abstract
We present two examples of finite-alphabet, infinite excess entropy processes generated by stationary hidden Markov models (HMMs) with countable state sets. The first, simpler example is not ergodic, but the second is. These are the first explicit constructions of processes of this type. Full article

Article
Information Flow in Animal-Robot Interactions
by Sachit Butail, Fabrizio Ladu, Davide Spinello and Maurizio Porfiri
Entropy 2014, 16(3), 1315-1330; https://doi.org/10.3390/e16031315 - 28 Feb 2014
Cited by 62 | Viewed by 10682
Abstract
The nonverbal transmission of information between social animals is a primary driving force behind their actions and, therefore, an important quantity to measure in animal behavior studies. Despite its key role in social behavior, the flow of information has only been inferred by correlating the actions of individuals with a simplifying assumption of linearity. In this paper, we leverage information-theoretic tools to relax this assumption. To demonstrate the feasibility of our approach, we focus on a robotics-based experimental paradigm, which affords consistent and controllable delivery of visual stimuli to zebrafish. Specifically, we use a robotic arm to maneuver a life-sized replica of a zebrafish in a predetermined trajectory as it interacts with a focal subject in a test tank. We track the fish and the replica through time and use the resulting trajectory data to measure the transfer entropy between the replica and the focal subject, which, in turn, is used to quantify one-directional information flow from the robot to the fish. In agreement with our expectations, we find that the information flow from the replica to the zebrafish is significantly more than the other way around. Notably, such information is specifically related to the response of the fish to the replica, whereby we observe that the information flow is reduced significantly if the motion of the replica is randomly delayed in a surrogate dataset. In addition, comparison with a control experiment, where the replica is replaced by a conspecific, shows that the information flow toward the focal fish is significantly more for a robotic than a live stimulus. These findings support the reliability of using transfer entropy as a measure of information flow, while providing indirect evidence for the efficacy of a robotics-based platform in animal behavioral studies. Full article
