
Information-Theoretic Methods in Computational Neuroscience

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Entropy and Biology".

Deadline for manuscript submissions: closed (20 November 2025)

Special Issue Editors


Dr. Sarah Marzen
Guest Editor
W. M. Keck Science Department of Pitzer, Scripps, and Claremont McKenna College, Claremont, CA 91711, USA
Interests: information theory; biophysics; machine learning

Prof. Dr. John Beggs
Guest Editor
Department of Physics, Indiana University, Bloomington, IN 47405, USA
Interests: biophysics; computational neuroscience; statistical physics; neural networks

Dr. Martina Lamberti
Guest Editor Assistant
Department of Clinical Neurophysiology, University of Twente, P.O. Box 217, 7500 AE Enschede, The Netherlands
Interests: neuroscience; neurophysiology; memory; electrophysiology in vitro

Dr. Jared Salisbury
Guest Editor Assistant
Center for the Physics of Biological Function, Princeton University, Princeton, NJ 08544, USA
Interests: information theory; computational neuroscience

Special Issue Information

Dear Colleagues,

Information theory has been an invaluable tool for neuroscience since its inception in the 1940s, with successes ranging from quantifying the rate of information transmission of sensory neurons, to the highly influential normative theory of efficient coding, to characterizing interactions within neural populations via maximum entropy models. The present era of experimental neuroscience, marked by increasingly high-dimensional neural and behavioral recordings, poses a particular challenge for information-theoretic methods, which typically scale poorly with the dimensionality of the data. At the same time, these rich datasets promise to resolve decades-old questions about the nature of the neural code: information-theoretic normative theories, such as rate-distortion theory and constrained channel capacity calculations, promise to answer the tough questions of what organisms are trying to do and how they are doing it.
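To make the scaling concern concrete, here is a minimal sketch (our illustration, not part of the original call) of the classic plug-in estimator for the mutual information between a discrete stimulus and a spike-count response. The joint histogram it builds grows exponentially with the number of recorded variables, which is exactly why high-dimensional data strain such estimators; all names and parameters below are assumptions for the toy example.

```python
# A minimal sketch: plug-in (histogram) estimate of I(S; R) in bits
# between a discrete stimulus S and a spike-count response R.
import numpy as np

def mutual_information(stimuli, responses):
    """Plug-in estimate of I(S; R) in bits for discrete 1-D data."""
    s_vals, s_idx = np.unique(stimuli, return_inverse=True)
    r_vals, r_idx = np.unique(responses, return_inverse=True)
    joint = np.zeros((len(s_vals), len(r_vals)))
    np.add.at(joint, (s_idx, r_idx), 1)        # joint histogram of (s, r) pairs
    joint /= joint.sum()
    p_s = joint.sum(axis=1, keepdims=True)     # marginal P(s)
    p_r = joint.sum(axis=0, keepdims=True)     # marginal P(r)
    nz = joint > 0                             # skip empty cells in the sum
    return float((joint[nz] * np.log2(joint[nz] / (p_s * p_r)[nz])).sum())

# Toy usage: a Poisson neuron whose mean rate depends on a binary stimulus.
rng = np.random.default_rng(0)
s = rng.integers(0, 2, size=5000)
r = rng.poisson(2 + 3 * s)                     # rate 2 for s=0, rate 5 for s=1
print(f"I(S;R) = {mutual_information(s, r):.3f} bits (plug-in estimate)")
```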

We welcome original research and reviews that focus on the role of information theory in neuroscience in any way, shape, or form. Examples of topics that may be of interest include:

  • The inference and usage of maximum entropy models and stimulus-dependent maximum entropy models (see the fitting sketch after this list);
  • The estimation, usage, and interpretation of information-theoretic quantities to benchmark how well neural systems communicate information;
  • The development of novel information-theoretic quantities for understanding neural systems;
  • Explorations of criticality in neural systems, including optimal information processing at or near criticality and its connection to partial information decomposition and other novel information-theoretic quantities;
  • New methods for the estimation of information-theoretic quantities or objectives that pertain to neural systems, particularly those methods that scale to high-dimensional data;
  • Information-theoretic normative theories such as rate-distortion theory and its variants or noisy constrained channel coding to understand neural systems, especially in the style of the efficient coding hypothesis.
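As a companion to the first topic above, the following is a minimal sketch of fitting a pairwise maximum entropy (Ising) model to binarized population activity by exact gradient ascent on the likelihood. The exhaustive enumeration of all 2^N activity patterns is an illustrative assumption that is only feasible for small populations, not any particular paper's method; the exponential cost of that enumeration is precisely the scaling bottleneck that approximate inference methods in this area aim to remove.

```python
# A minimal sketch: fit fields h and couplings J of the pairwise maximum
# entropy model P(x) ~ exp(h.x + x.J.x / 2) to 0/1 spike patterns.
import numpy as np
from itertools import product

def fit_ising(data, lr=0.1, steps=2000):
    """data: (T, N) array of 0/1 patterns; returns fields h and couplings J."""
    T, N = data.shape
    states = np.array(list(product([0, 1], repeat=N)), dtype=float)  # all 2^N patterns
    emp_m = data.mean(axis=0)                      # empirical means <x_i>
    emp_c = data.T @ data / T                      # empirical pairwise <x_i x_j>
    h, J = np.zeros(N), np.zeros((N, N))
    for _ in range(steps):
        E = states @ h + np.einsum('si,ij,sj->s', states, J, states) / 2
        p = np.exp(E - E.max())
        p /= p.sum()                               # Boltzmann distribution over states
        mod_m = p @ states                         # model <x_i>
        mod_c = states.T @ (states * p[:, None])   # model <x_i x_j>
        h += lr * (emp_m - mod_m)                  # moment matching on means
        J += lr * (emp_c - mod_c)                  # moment matching on correlations
        np.fill_diagonal(J, 0.0)                   # no self-couplings
    return h, J

# Toy usage on 4 fake neurons; real input would be binarized spike trains.
rng = np.random.default_rng(1)
spikes = (rng.random((2000, 4)) < 0.3).astype(float)
h, J = fit_ising(spikes)
print(h.round(2), J.round(2), sep="\n")
```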

If there are topics we have missed, they must be especially important! We encourage all authors to submit. If you are wondering whether your work fits the scope of the Special Issue, please contact us.

Dr. Sarah Marzen
Prof. Dr. John Beggs
Guest Editors

Dr. Martina Lamberti
Dr. Jared Salisbury
Guest Editor Assistants

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • maximum entropy models
  • information theory
  • criticality in neural systems
  • information-theoretic quantities
  • information-theoretic normative theories

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies is available on the MDPI website.

Published Papers (3 papers)


Research

25 pages, 3902 KB  
Article
Perceived Complexity as Normalized, Integrated, Localized Shannon Entropy
by Sébastien Berquet and Norberto M. Grzywacz
Entropy 2026, 28(3), 279; https://doi.org/10.3390/e28030279 - 1 Mar 2026
Abstract
Perceived complexity is a key component of sensory brain function as it indicates the number of resources necessary to process incoming information. A recently proposed measure of perceived complexity defined it as normalized Shannon entropy. However, the proposal used probability distributions estimated from the entire sensory signal at once. Here, we first used synthetically created images and abstract expressionism art to show that using such distributions seemed incompatible with perceived complexity. This incompatibility persisted even if we performed the calculations at different scales, that is, in multiple image resolutions. We then proposed an alternate theory that postulated that perceived complexity arose from the integration of localized Shannon entropy. The outcome of this integration was then normalized to define an index of complexity. We measured this index and integrated Shannon entropy in 704 images obtained from natural and urban settings, painted by well-known artists, or created synthetically. Moreover, we studied the dependence of these measurements on the spatial scale used to measure Localized Shannon Entropy. We found that normalized, integrated, localized Shannon entropy at low spatial scales is consistent with the phenomenology of perceived complexity and illustrates interesting aesthetic choices of different artists.
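As a generic illustration of the quantity at the heart of this abstract, the sketch below computes Shannon entropy in local image patches, averages (integrates) it over the image, and normalizes by the maximum attainable entropy. The patch size and bin count stand in for the paper's spatial scale and are assumptions, not the authors' pipeline.

```python
# A generic sketch of normalized, integrated, localized Shannon entropy
# for a grayscale image with values in [0, 1].
import numpy as np

def local_entropy(patch, bins=16):
    """Shannon entropy (bits) of the gray-level histogram of one patch."""
    hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def normalized_integrated_local_entropy(image, patch=8, bins=16):
    """Average local entropy over non-overlapping patches, scaled to [0, 1]."""
    H, W = image.shape
    vals = [local_entropy(image[i:i + patch, j:j + patch], bins)
            for i in range(0, H - patch + 1, patch)
            for j in range(0, W - patch + 1, patch)]
    return np.mean(vals) / np.log2(bins)   # log2(bins) is the entropy ceiling

# Toy comparison: uniform noise scores high, a flat image scores zero.
rng = np.random.default_rng(0)
print(normalized_integrated_local_entropy(rng.random((64, 64))))  # close to 1
print(normalized_integrated_local_entropy(np.zeros((64, 64))))    # 0.0
```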

25 pages, 2936 KB  
Article
Understanding Schizophrenia Pathophysiology via fMRI-Based Information Theory and Multiplex Network Analysis
by Fabrizio Parente
Entropy 2026, 28(1), 83; https://doi.org/10.3390/e28010083 - 10 Jan 2026
Abstract
This work investigates the mechanisms of information transfer underlying causal relationships between brain regions during resting-state conditions in patients with schizophrenia (SCZ). A large fMRI dataset including healthy controls and SCZ patients was analyzed to estimate directed information flow using local Transfer Entropy (TE). Four functional interaction patterns—referred to as rules—were identified between brain regions: activation in the same state (ActS), activation in the opposite state (ActO), turn-off in the same state (TfS), and turn-off in the opposite state (TfO), indicating dynamics toward converging (ActS/TfS = S) and diverging (ActO/TfO = O) states of brain regions. These interactions were integrated within a multiplex network framework, in which each rule was represented as a directed network layer. Our results reveal widespread alterations in the functional architecture of SCZ brain networks, particularly affecting schizophrenia-related systems such as bottom-up sensory pathways and associative cortical dynamics. An imbalance between S and O rules was observed, leading to reduced network stability. This shift results in a more randomized functional network organization. These findings provide a mechanistic link between excitation/inhibition (E/I) imbalance and mesoscopic network dysconnectivity, in agreement with previous dynamic functional connectivity and Dynamic Causal Modeling (DCM) studies. Overall, our approach offers an integrated framework for characterizing directed brain communication patterns and psychiatric phenotypes. Future work will focus on systematic comparisons with DCM and other functional connectivity methods.
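For orientation, the sketch below computes a standard plug-in transfer entropy TE(X→Y) for discrete time series with history length 1; the local (pointwise) TE used in the paper corresponds to the log term inside this sum. The binarization and history length are illustrative assumptions, not the author's exact pipeline.

```python
# A minimal sketch: plug-in transfer entropy TE(X -> Y) in bits for
# discrete time series, with history length 1.
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """How much x_t helps predict y_{t+1} beyond y_t alone (bits)."""
    yn, yp, xp = y[1:], y[:-1], x[:-1]      # y_next, y_past, x_past
    n = len(yn)
    c_xyz = Counter(zip(yn, yp, xp))        # counts of (y_next, y_past, x_past)
    c_yz = Counter(zip(yp, xp))             # counts of (y_past, x_past)
    c_xy = Counter(zip(yn, yp))             # counts of (y_next, y_past)
    c_y = Counter(yp)                       # counts of y_past
    te = 0.0
    for (a, b, c), cnt in c_xyz.items():
        # log ratio p(y_next | y_past, x_past) / p(y_next | y_past);
        # this term is the "local TE" contribution of each configuration
        te += (cnt / n) * np.log2((cnt / c_yz[(b, c)]) / (c_xy[(a, b)] / c_y[b]))
    return te

# Toy usage: y copies x with a one-step lag, so TE(X -> Y) is near 1 bit
# while TE(Y -> X) stays near zero.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=10000)
y = np.roll(x, 1)                           # y_t = x_{t-1}
print(transfer_entropy(x, y), transfer_entropy(y, x))
```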

18 pages, 2710 KB  
Article
Eye Gaze Entropy Reflects Individual Experience in the Context of Driving
by Karina Arutyunova, Evgenii Burashnikov, Nikita Timakin, Ivan Shishalov, Andrei Filimonov and Anastasiia Bakhchina
Entropy 2026, 28(1), 8; https://doi.org/10.3390/e28010008 - 20 Dec 2025
Abstract
Eye gaze plays an essential role in the organisation of human goal-directed behaviour. Stationary gaze entropy and gaze transition entropy are two informative measures of visual scanning in different tasks. In this work, we discuss the benefits of these eye gaze entropy measures in the context of driving behaviour. In our large-scale study, participants performed driving tasks in a simulator (N = 380, 44% female, age: 20–73 years old) and in on-road urban environments (N = 241, 44% female, age: 19–74 years old). We analysed measures of eye gaze entropy in relation to driving experience and compared their dynamics between the simulator and on-road driving. The results demonstrate that, in both driving conditions, gaze transition entropy is higher, whereas stationary gaze entropy is lower, in more experienced drivers of both genders. This suggests that gaining driving experience may be accompanied by a decrease in overall gaze dispersion and an increased unpredictability of visual scanning behaviour. These results are in line with previously reported trends on experience-related dynamics of eye gaze entropy measures. We discuss our findings in the framework of the system-evolutionary theory, which explains the organisation of behaviour through the history of individual development, corresponding to the growing complexity of individual–environment interactions. Experience-related dynamics of eye gaze complexity can be a useful factor in the development of practical applications, such as driver monitoring systems and other human–machine interfaces.
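For readers unfamiliar with the two measures, the sketch below computes one common formulation of stationary gaze entropy (SGE) and gaze transition entropy (GTE) from a sequence of fixated areas of interest (AOIs); the AOI coding and toy sequences are assumptions, not the authors' pipeline.

```python
# A minimal sketch: stationary gaze entropy (SGE) and gaze transition
# entropy (GTE), both in bits, from a 1-D sequence of discrete AOI labels.
import numpy as np
from collections import Counter

def gaze_entropies(aoi_sequence):
    """Return (SGE, GTE) for a sequence of fixated AOI labels."""
    seq = list(aoi_sequence)
    n = len(seq)
    pi = {a: c / n for a, c in Counter(seq).items()}    # stationary distribution
    sge = -sum(p * np.log2(p) for p in pi.values())     # entropy of dwell locations
    trans = Counter(zip(seq[:-1], seq[1:]))             # AOI-to-AOI transition counts
    row = Counter(seq[:-1])                             # counts of current AOI
    gte = 0.0
    for (a, b), c in trans.items():
        p_ab = c / row[a]                               # p(next = b | current = a)
        gte += -(row[a] / (n - 1)) * p_ab * np.log2(p_ab)
    return sge, gte

# Toy usage: a predictable cyclic scan has zero GTE; random jumping among
# the same AOIs has high GTE, with the same SGE in both cases.
rng = np.random.default_rng(0)
cyclic = [0, 1, 2, 3] * 250
rand_seq = list(rng.integers(0, 4, size=1000))
print(gaze_entropies(cyclic))     # SGE = 2.0, GTE = 0.0
print(gaze_entropies(rand_seq))   # SGE close to 2.0, GTE close to 2.0
```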
