
Neural Dynamics and Information Processing

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (28 April 2023)

Special Issue Editors

Dr. Arno Onken
Guest Editor
Institute for Adaptive & Neural Computation, School of Informatics, University of Edinburgh, Edinburgh EH8 9AB, UK
Interests: computational neuroscience; machine learning; information theory; statistics

Dr. Nina Kudryashova
Guest Editor
Institute for Adaptive & Neural Computation, School of Informatics, University of Edinburgh, Edinburgh EH8 9AB, UK
Interests: theoretical biology; computational neuroscience; nonlinear dynamics

Special Issue Information

Dear Colleagues,

Information theory has emerged as a powerful framework for investigating how neural populations operate. However, information-theoretic methods are difficult to scale to large neural populations or long recordings. At the same time, progress in recording techniques yields ever-increasing numbers of simultaneously recorded neurons, providing rich datasets for the analysis of the underlying neural population dynamics. New modeling and methodological approaches are being developed to investigate neural dynamics and to exploit the advantages of information-theoretic measures in this context, while dealing with the challenges of computational and sample complexity.

This Special Issue is concerned with the latest contributions to modeling neural dynamics, in conjunction with approaches from information theory that are applied to tuning and interpreting dynamical models. We also invite research at the intersection of machine learning and neuroscience that aims to scale dynamical or information-theoretic approaches to larger neural populations.

Dr. Arno Onken
Dr. Nina Kudryashova
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (4 papers)


Research

8 pages, 322 KiB  
Article
Estimating Mutual Information for Spike Trains: A Bird Song Example
by Jake Witter and Conor Houghton
Entropy 2023, 25(10), 1413; https://doi.org/10.3390/e25101413 - 3 Oct 2023
Abstract
Zebra finches are a model animal used in the study of audition. They are adept at recognizing zebra finch songs, and the neural pathway involved in song recognition is well studied. Here, this example is used to illustrate the estimation of mutual information between stimuli and responses using a Kozachenko–Leonenko estimator. The challenge in calculating mutual information for spike trains is that there are no obvious coordinates for the data. The Kozachenko–Leonenko estimator does not require coordinates; it relies only on the distance between data points. In the case of bird songs, estimating the mutual information demonstrates that the information content of spiking does not diminish as the song progresses.
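The estimator described in this abstract needs only pairwise distances between responses, which makes the idea easy to sketch in a few lines of code. The snippet below is a rough, self-contained illustration rather than the authors' implementation: it uses a discrete–continuous nearest-neighbour form of the estimator (stimulus identity is discrete, responses are continuous), a plain Euclidean distance between binned spike counts as a placeholder metric, and synthetic Poisson data. All of these choices are assumptions made for illustration only.

```python
# Sketch of a nearest-neighbour (Kozachenko-Leonenko-style) mutual information
# estimate between a discrete stimulus (e.g. song identity) and spike-train
# responses. Only pairwise distances between responses are used. Assumes every
# stimulus has more than k responses. Not the authors' code or metric.
import numpy as np
from scipy.special import digamma

def spike_train_distance(r1, r2):
    # Placeholder metric: Euclidean distance between binned spike counts.
    return np.linalg.norm(np.asarray(r1) - np.asarray(r2))

def mi_discrete_continuous(stimuli, responses, k=3):
    """Estimate I(stimulus; response) in nats from labelled responses."""
    stimuli = np.asarray(stimuli)
    n = len(responses)
    d = np.array([[spike_train_distance(a, b) for b in responses] for a in responses])
    mi = 0.0
    for i in range(n):
        same = np.where(stimuli == stimuli[i])[0]
        same = same[same != i]                # other responses to the same stimulus
        eps = np.sort(d[i, same])[k - 1]      # distance to k-th same-stimulus neighbour
        m_i = np.sum(d[i] <= eps) - 1         # neighbours of any label within eps (excl. self)
        mi += digamma(n) - digamma(len(same) + 1) + digamma(k) - digamma(m_i)
    return max(mi / n, 0.0)

# Toy usage: two "songs", responses as 10-bin spike-count vectors.
rng = np.random.default_rng(0)
labels = np.repeat([0, 1], 50)
rates = np.where(labels[:, None] == 0, 2.0, 4.0)
responses = rng.poisson(lam=rates, size=(100, 10))
print(f"estimated MI: {mi_discrete_continuous(labels, responses):.3f} nats")
```

A spike-train metric such as the van Rossum or Victor–Purpura distance could be substituted for the placeholder Euclidean distance without changing the rest of the estimator.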

16 pages, 5857 KiB  
Article
Discovering Low-Dimensional Descriptions of Multineuronal Dependencies
by Lazaros Mitskopoulos and Arno Onken
Entropy 2023, 25(7), 1026; https://doi.org/10.3390/e25071026 - 6 Jul 2023
Abstract
Coordinated activity in neural populations is crucial for information processing. Shedding light on the multivariate dependencies that shape multineuronal responses is important to understand neural codes. However, existing approaches based on pairwise linear correlations are inadequate at capturing complicated interaction patterns and miss features that shape aspects of the population function. Copula-based approaches address these shortcomings by extracting the dependence structures in the joint probability distribution of population responses. In this study, we aimed to dissect neural dependencies with a C-Vine copula approach coupled with normalizing flows for estimating copula densities. While this approach allows for more flexibility compared to fitting parametric copulas, drawing insights on the significance of these dependencies from large sets of copula densities is challenging. To alleviate this challenge, we used a weighted non-negative matrix factorization procedure to leverage shared latent features in neural population dependencies. We validated the method on simulated data and applied it to copulas we extracted from recordings of neurons in the mouse visual cortex as well as in the macaque motor cortex. Our findings reveal that neural dependencies occupy low-dimensional subspaces, but distinct modules are synergistically combined to give rise to diverse interaction patterns that may serve the population function.
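As a much-simplified, hypothetical illustration of the pipeline outlined above, the snippet below rank-transforms simulated spike counts to obtain empirical pairwise copula densities and then factorizes the stack of densities with non-negative matrix factorization. The paper instead uses C-vine copulas estimated with normalizing flows and a weighted NMF; the histogram copulas, plain scikit-learn NMF, and synthetic data here are stand-ins, not the authors' method.

```python
# Simplified stand-in for the copula + matrix-factorization idea: empirical
# pairwise copula densities (via rank transforms and 2D histograms) stacked
# into a non-negative matrix and decomposed into a few shared "modules".
import numpy as np
from scipy.stats import rankdata
from sklearn.decomposition import NMF

def empirical_copula_density(x, y, bins=10):
    """Histogram estimate of the copula density of two response vectors."""
    u = rankdata(x) / (len(x) + 1)      # map margins to (0, 1) by ranks
    v = rankdata(y) / (len(y) + 1)
    h, _, _ = np.histogram2d(u, v, bins=bins, range=[[0, 1], [0, 1]], density=True)
    return h.ravel()

# Synthetic population: a shared latent drive induces dependencies between neurons.
rng = np.random.default_rng(1)
n_trials, n_neurons = 500, 8
latent = rng.normal(size=n_trials)
rates = np.exp(0.5 * latent[:, None] + rng.normal(scale=0.5, size=(n_trials, n_neurons)))
counts = rng.poisson(rates)

# One flattened copula density per neuron pair.
pairs = [(i, j) for i in range(n_neurons) for j in range(i + 1, n_neurons)]
C = np.array([empirical_copula_density(counts[:, i], counts[:, j]) for i, j in pairs])

nmf = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
loadings = nmf.fit_transform(C)   # how strongly each pair expresses each module
modules = nmf.components_         # low-dimensional dependence "modules"
print(loadings.shape, modules.shape)
```

The number of modules (three here) is an arbitrary choice for the toy data; in practice it would be selected by cross-validation or a similar criterion.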

21 pages, 4490 KiB  
Article
Information Encoding in Bursting Spiking Neural Network Modulated by Astrocytes
by Sergey V. Stasenko and Victor B. Kazantsev
Entropy 2023, 25(5), 745; https://doi.org/10.3390/e25050745 - 1 May 2023
Abstract
We investigated a mathematical model composed of a spiking neural network (SNN) interacting with astrocytes. We analysed how information content in the form of two-dimensional images can be represented by an SNN in the form of a spatiotemporal spiking pattern. The SNN includes excitatory and inhibitory neurons in some proportion, sustaining the excitation–inhibition balance of autonomous firing. The astrocytes accompanying each excitatory synapse provide a slow modulation of synaptic transmission strength. An information image was uploaded to the network in the form of excitatory stimulation pulses distributed in time, reproducing the shape of the image. We found that astrocytic modulation prevented stimulation-induced SNN hyperexcitation and non-periodic bursting activity. Such homeostatic astrocytic regulation of neuronal activity makes it possible to restore the image supplied during stimulation and lost in the raster diagram of neuronal activity due to non-periodic neuronal firing. From a biological point of view, our model shows that astrocytes can act as an additional adaptive mechanism for regulating neural activity, which is crucial for sensory cortical representations.
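The following toy model is a hedged sketch of the general mechanism described above, not the published model: a small leaky integrate-and-fire network in which a slow astrocyte-like variable tracks excitatory spiking and damps excitatory output, illustrating how slow modulation can counteract stimulation-induced hyperexcitation. All parameters, equations, and the step-input "image" are assumptions made for illustration.

```python
# Toy sketch: LIF network with a slow per-synapse astrocyte-like variable that
# integrates excitatory spiking and depresses excitatory output (assumed model).
import numpy as np

rng = np.random.default_rng(2)
n_exc, n_inh = 80, 20
n = n_exc + n_inh
dt, t_steps = 1.0, 2000                 # time step (ms) and number of steps
tau_m, v_thresh, v_reset = 20.0, 1.0, 0.0
tau_astro, k_astro = 500.0, 0.02        # slow astrocytic time constant and gain (assumed)

w = rng.uniform(0.0, 0.05, size=(n, n)) # random connectivity
w[:, n_exc:] *= -4.0                    # columns beyond n_exc are inhibitory
v = rng.uniform(0.0, 1.0, size=n)
astro = np.zeros(n_exc)                 # one slow astrocyte variable per excitatory neuron
activity = []

for t in range(t_steps):
    stim = 0.4 if 500 <= t < 1500 else 0.1      # the "image" replaced by a simple step input
    spikes = v >= v_thresh
    v[spikes] = v_reset
    # Astrocytes slowly integrate excitatory spiking and depress excitatory output.
    astro += dt * (-astro / tau_astro + k_astro * spikes[:n_exc])
    gain = np.ones(n)
    gain[:n_exc] = 1.0 / (1.0 + astro)
    syn_input = w @ (spikes * gain)
    v += dt / tau_m * (-v + stim) + syn_input + 0.02 * rng.normal(size=n)
    activity.append(spikes.mean())

print("mean firing probability per step:", np.mean(activity))
```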

18 pages, 4202 KiB  
Article
Synchrony-Division Neural Multiplexing: An Encoding Model
by Mohammad R. Rezaei, Reza Saadati Fard, Milos R. Popovic, Steven A. Prescott and Milad Lankarany
Entropy 2023, 25(4), 589; https://doi.org/10.3390/e25040589 - 30 Mar 2023
Abstract
Cortical neurons receive mixed information from the collective spiking activities of primary sensory neurons in response to a sensory stimulus. A recent study demonstrated that an abrupt increase or decrease in stimulus intensity and the stimulus intensity itself can be represented by the synchronous and asynchronous spikes of S1 neurons in rats, respectively. This evidence capitalized on the ability of an ensemble of homogeneous neurons to multiplex, a coding strategy that was referred to as synchrony-division multiplexing (SDM). Although neural multiplexing can arise from distinct functions of individual neurons in a heterogeneous neural ensemble, the extent to which nearly identical neurons in a homogeneous neural ensemble encode multiple features of a mixed stimulus remains unknown. Here, we present a computational framework to provide a system-level understanding of how an ensemble of homogeneous neurons enables SDM. First, we simulate SDM with an ensemble of homogeneous conductance-based model neurons receiving a mixed stimulus comprising slow and fast features. Using feature-estimation techniques, we show that both features of the stimulus can be inferred from the generated spikes. Second, we utilize linear–nonlinear (LNL) cascade models and calculate temporal filters and static nonlinearities of differentially synchronized spikes. We demonstrate that these filters and nonlinearities are distinct for synchronous and asynchronous spikes. Finally, we develop an augmented LNL cascade model as an encoding model for the SDM by combining individual LNLs calculated for each type of spike. The augmented LNL model reveals that a homogeneous neural ensemble model can perform two different functions, namely, temporal- and rate-coding, simultaneously.
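As a self-contained toy of the synchrony-division idea (not the paper's conductance-based simulation or its augmented LNL fit), the sketch below drives identical leaky integrate-and-fire neurons with a shared stimulus containing a slow component and brief fast transients, splits the resulting spikes into synchronous and asynchronous classes by a population-count threshold (an assumed criterion), and computes simple spike-triggered averages to show that synchronous spikes preferentially reflect the fast feature while asynchronous spikes do not.

```python
# Toy synchrony-division sketch: identical LIF neurons share a stimulus with a
# slow component plus brief fast transients; spikes are split into synchronous
# and asynchronous classes and compared via spike-triggered averages.
import numpy as np

rng = np.random.default_rng(3)
n_neurons, t_steps, dt = 30, 20000, 1.0     # 30 identical neurons, 20 s at 1 ms resolution
tau_m, v_th = 10.0, 1.0

slow = np.convolve(rng.normal(size=t_steps), np.ones(200) / 200, mode="same")
fast = np.zeros(t_steps)
fast[rng.choice(t_steps, 60, replace=False)] = 8.0   # brief shared transients
stimulus = 0.9 + 4.0 * slow + fast

v = rng.uniform(0.0, 1.0, size=n_neurons)
spikes = np.zeros((t_steps, n_neurons), dtype=bool)
for t in range(t_steps):
    # Shared stimulus plus private noise, so baseline firing is only loosely correlated.
    v += dt / tau_m * (-v + stimulus[t]) + 0.3 * rng.normal(size=n_neurons)
    spikes[t] = v >= v_th
    v[spikes[t]] = 0.0

pop = spikes.sum(axis=1)
sync_times = np.where(pop >= n_neurons // 2)[0]                  # assumed synchrony criterion
async_times = np.where((pop > 0) & (pop < n_neurons // 2))[0]

def spike_triggered_average(times, signal, lags=50):
    times = times[times > lags]
    return np.mean([signal[t - lags:t + 1] for t in times], axis=0)

print("fast feature at synchronous spikes:  ", spike_triggered_average(sync_times, fast)[-1])
print("fast feature at asynchronous spikes: ", spike_triggered_average(async_times, fast)[-1])
```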
