Reprint

Information Theory in Neuroscience

Edited by
March 2019
280 pages
  • ISBN 978-3-03897-664-6 (Paperback)
  • ISBN 978-3-03897-665-3 (PDF)

This book is a reprint of the Special Issue Information Theory in Neuroscience that was published in Entropy.

Chemistry & Materials Science
Computer Science & Mathematics
Physical Sciences
Summary

As the ultimate information-processing device, the brain naturally lends itself to study with information theory. The application of information theory to neuroscience has spurred the development of principled theories of brain function, advanced the study of consciousness, and produced analytical techniques for cracking the neural code, that is, for unveiling the language neurons use to encode and process information. In particular, experimental techniques that allow precise, large-scale recording and manipulation of neural activity now make it possible, for the first time, to formulate and quantitatively test hypotheses about how the brain encodes the information used for specific functions and transmits it across areas.

This Special Issue presents twelve original contributions on novel information-theoretic approaches in neuroscience, and on new information-theoretic results inspired by problems in neuroscience.
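Many of the analyses discussed in these contributions rest on estimating information-theoretic quantities such as the mutual information between a stimulus and the neural response. As an illustration only, and not code from any chapter of the book, the following is a minimal sketch of a plug-in mutual information estimate applied to simulated, discretized spike counts; all data, rates, and variable names are assumptions made for this example.

```python
# Minimal sketch: plug-in estimate of the mutual information I(S; R) between a
# discrete stimulus S and a discretized neural response R (e.g., binned spike
# counts). All data below are simulated for illustration only.
import numpy as np

def mutual_information(stimuli, responses):
    """Plug-in (maximum-likelihood) estimate of I(S; R) in bits."""
    s_vals, s_idx = np.unique(stimuli, return_inverse=True)
    r_vals, r_idx = np.unique(responses, return_inverse=True)
    joint = np.zeros((len(s_vals), len(r_vals)))
    for i, j in zip(s_idx, r_idx):
        joint[i, j] += 1
    joint /= joint.sum()                      # joint probability p(s, r)
    p_s = joint.sum(axis=1, keepdims=True)    # marginal p(s)
    p_r = joint.sum(axis=0, keepdims=True)    # marginal p(r)
    nz = joint > 0                            # avoid log(0)
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (p_s @ p_r)[nz])))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_trials = 5000
    stimuli = rng.integers(0, 4, size=n_trials)       # 4 stimulus classes
    # Responses: Poisson spike counts whose rate depends on the stimulus.
    responses = rng.poisson(lam=2 + 3 * stimuli)
    print(f"I(S; R) ~ {mutual_information(stimuli, responses):.3f} bits")
```

Plug-in estimates of this kind are biased upward when the number of trials is limited, which is one reason the neural-coding literature relies on bias corrections and more sophisticated estimators of the sort developed and applied in several of the contributions.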

Format
  • Paperback
License
© 2019 by the authors; CC BY-NC-ND license
Keywords
neural network; Potts model; latching; recursion; functional connectome; graph theoretical analysis; eigenvector centrality; orderness; network eigen-entropy; information entropy production; discrete Markov chains; spike train statistics; Gibbs measures; maximum entropy principle; pulse-gating; channel capacity; neural coding; feedforward networks; neural information propagation; information theory; mutual information decomposition; synergy; redundancy; integrated information theory; integrated information; minimum information partition; submodularity; Queyranne’s algorithm; consciousness; maximum entropy; higher-order correlations; neural population coding; Ising model; brain network; complex networks; connectome; information theory; graph theory; free-energy principle; internal model hypothesis; unconscious inference; infomax principle; independent component analysis; principal component analysis; goodness; categorical perception; perceptual magnet; information theory; perceived similarity; mutual information; synergy; redundancy; neural code; hippocampus; entorhinal cortex; navigation; neural code; representation; decoding; spike-time precision; discrimination; noise correlations; information theory; mismatched decoding; information theory; neuroscience