Entropy and Information in Biological Systems

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Entropy and Biology".

Deadline for manuscript submissions: closed (20 May 2024) | Viewed by 2622

Special Issue Editor


Prof. Dr. Richard Summers
Guest Editor
Department of Physiology & Biophysics, University of Mississippi Medical Center, 2500 North State Street, Jackson, MS 39216, USA
Interests: systems physiology and theoretical biology

Special Issue Information

Dear Colleagues,

In 1943, Erwin Schrödinger proposed that understanding the true nature of living systems first requires a grasp of how they control entropy dynamics within their environment. Claude Shannon's development of information theory for communications was subsequently linked to the concept of entropy. Living organisms utilize and exchange information as a form of biological currency as they adapt to their environmental conditions. The mechanics of information flow in open living systems have not been deeply explored in the literature and deserve attention. Describing biological systems from an information/entropy perspective could provide considerable insight into their functioning and the fundamental nature of entropy dynamics, and could lay the foundation for a comprehensive theoretical biology.

Prof. Dr. Richard Summers
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • entropy
  • information theory
  • entropy dynamics
  • biological systems
  • theoretical biology

Published Papers (4 papers)


Research

16 pages, 673 KiB  
Article
Data-Driven Identification of Stroke through Machine Learning Applied to Complexity Metrics in Multimodal Electromyography and Kinematics
by Francesco Romano, Damiano Formenti, Daniela Cardone, Emanuele Francesco Russo, Paolo Castiglioni, Giampiero Merati, Arcangelo Merla and David Perpetuini
Entropy 2024, 26(7), 578; https://doi.org/10.3390/e26070578 - 7 Jul 2024
Abstract
A stroke represents a significant medical condition characterized by the sudden interruption of blood flow to the brain, leading to cellular damage or death. The impact of stroke on individuals can vary from mild impairments to severe disability. Treatment for stroke often focuses on gait rehabilitation. Notably, assessing muscle activation and kinematic patterns during walking using electromyography (EMG) and stereophotogrammetry, respectively, can provide information regarding pathological gait conditions. The concurrent measurement of EMG and kinematics can help in understanding dysfunction in the contribution of specific muscles to different phases of gait. To this aim, complexity metrics (e.g., sample entropy, approximate entropy, and spectral entropy) applied to EMG and kinematics have been demonstrated to be effective in identifying abnormal conditions. Moreover, the conditional entropy between EMG and kinematics can identify the relationship between gait data and muscle activation patterns. This study utilizes several machine learning classifiers to distinguish individuals with stroke from healthy controls based on kinematic and EMG complexity measures. The cubic support vector machine applied to the EMG metrics delivered the best classification results, reaching an accuracy of 99.85%. This method could assist clinicians in monitoring the recovery from motor impairments in stroke patients.
(This article belongs to the Special Issue Entropy and Information in Biological Systems)
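Since the abstract lists sample entropy among the complexity metrics applied to the EMG and kinematic signals, a minimal sketch of how such a metric can be computed may be helpful. The snippet below is a generic NumPy implementation of sample entropy applied to a synthetic signal; it is not the authors' code, and the tolerance r = 0.2·std(x) is an assumed, commonly used default rather than the paper's setting.

    import numpy as np

    def sample_entropy(x, m=2, r=None):
        # Sample entropy of a 1-D signal: count template vectors of length m
        # and m + 1 that match within tolerance r (Chebyshev distance),
        # excluding self-matches, and return -ln(A / B).
        x = np.asarray(x, dtype=float)
        N = len(x)
        if r is None:
            r = 0.2 * np.std(x)  # assumed common default tolerance

        def match_count(length):
            templates = np.array([x[i:i + length] for i in range(N - m)])
            total = 0
            for i in range(len(templates)):
                dist = np.max(np.abs(templates - templates[i]), axis=1)
                total += np.sum(dist <= r) - 1  # exclude the self-match
            return total

        B = match_count(m)
        A = match_count(m + 1)
        return -np.log(A / B) if A > 0 and B > 0 else np.inf

    # Toy usage on a synthetic, noisy "EMG-like" signal
    rng = np.random.default_rng(0)
    signal = rng.standard_normal(3000)
    print(sample_entropy(signal, m=2))

In a study like the one above, such values (computed per muscle or joint signal) would then serve as features for the machine learning classifiers.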

58 pages, 131141 KiB  
Article
Neural Activity in Quarks Language: Lattice Field Theory for a Network of Real Neurons
by Giampiero Bardella, Simone Franchini, Liming Pan, Riccardo Balzan, Surabhi Ramawat, Emiliano Brunamonti, Pierpaolo Pani and Stefano Ferraina
Entropy 2024, 26(6), 495; https://doi.org/10.3390/e26060495 - 6 Jun 2024
Viewed by 692
Abstract
Brain–computer interfaces have seen extraordinary surges in development in recent years, and a significant discrepancy now exists between the abundance of available data and the limited headway made in achieving a unified theoretical framework. This discrepancy becomes particularly pronounced when examining collective neural activity at the micro- and mesoscales, where a coherent formalization that adequately describes neural interactions is still lacking. Here, we introduce a mathematical framework to analyze systems of natural neurons and interpret the related empirical observations in terms of lattice field theory, an established paradigm from theoretical particle physics and statistical mechanics. Our methods are tailored to interpret data from chronic neural interfaces, especially spike rasters from measurements of single-neuron activity, and generalize the maximum entropy model for neural networks so that the time evolution of the system is also taken into account. This is obtained by bridging particle physics and neuroscience, paving the way for particle physics-inspired models of the neocortex.
(This article belongs to the Special Issue Entropy and Information in Biological Systems)
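The abstract describes generalizing the maximum entropy model for neural networks so that time evolution is taken into account. As a rough illustration of the kind of data such models are fit to, the sketch below computes, from a synthetic binary spike raster, the empirical statistics a pairwise maximum entropy model is constrained to reproduce (mean activities and equal-time correlations), plus one-step lagged correlations as the natural extra constraint for a time-dependent extension. This is a didactic sketch, not the lattice field theory formalism of the paper.

    import numpy as np

    # Synthetic spike raster: 10 neurons x 5000 time bins of binary activity
    rng = np.random.default_rng(1)
    raster = (rng.random((10, 5000)) < 0.05).astype(float)
    T = raster.shape[1]

    mean_activity = raster.mean(axis=1)                    # <s_i>, one value per neuron
    equal_time = (raster @ raster.T) / T                   # <s_i s_j> within the same time bin
    lagged = (raster[:, :-1] @ raster[:, 1:].T) / (T - 1)  # <s_i(t) s_j(t+1)>

    print(mean_activity[:3])
    print(equal_time[0, 1], lagged[0, 1])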

19 pages, 3690 KiB  
Article
Embedded Complexity of Evolutionary Sequences
by Jonathan D. Phillips
Entropy 2024, 26(6), 458; https://doi.org/10.3390/e26060458 - 28 May 2024
Viewed by 340
Abstract
Multiple pathways and outcomes are common in evolutionary sequences for biological and other environmental systems due to nonlinear complexity, historical contingency, and disturbances. From any starting point, multiple evolutionary pathways are possible. From an endpoint or observed state, multiple possibilities exist for the sequence of events that created it. However, for any observed historical sequence (e.g., ecological or soil chronosequences, stratigraphic records, or lineages), only one historical sequence actually occurred. Here, a measure of the embedded complexity of historical sequences based on algebraic graph theory is introduced. Sequences are represented as system states S(t), such that S(t − 1) ≠ S(t) ≠ S(t + 1). Each sequence of N states contains nested subgraph sequences of length 2, 3, …, N − 1. The embedded complexity index (which can also be interpreted in terms of embedded information) compares the complexity (based on the spectral radius λ₁) of the entire sequence to the cumulative complexity of the constituent subsequences. The spectral radius is closely linked to graph entropy, so the index also reflects information in the sequence. The analysis is also applied to ecological state-and-transition models (STMs), which represent observed transitions along with information on their causes or triggers. As historical sequences are lengthened (by the passage of time and additional transitions, or by improved resolution or new observations of historical changes), the overall complexity asymptotically approaches λ₁ = 2, while the embedded complexity increases as N^2.6. Four case studies are presented, representing coastal benthic community shifts determined from biostratigraphy, ecological succession on glacial forelands, vegetation community changes in longleaf pine woodlands, and habitat changes in a delta.
(This article belongs to the Special Issue Entropy and Information in Biological Systems)
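The abstract defines the embedded complexity index by comparing the spectral radius λ₁ of the graph of the full sequence to the cumulative spectral radii of its nested subsequences. Under the assumption that a sequence of N states is represented as a simple path graph (the exact graph construction and normalization used in the paper are not given in the abstract), the sketch below illustrates the ingredients, including λ₁ approaching 2 as N grows, consistent with the abstract.

    import numpy as np

    def path_spectral_radius(n):
        # Largest eigenvalue of the adjacency matrix of a path graph on n nodes;
        # for a path this equals 2*cos(pi / (n + 1)) and approaches 2 as n grows.
        A = np.zeros((n, n))
        idx = np.arange(n - 1)
        A[idx, idx + 1] = A[idx + 1, idx] = 1
        return float(np.max(np.linalg.eigvalsh(A)))

    def embedded_complexity(N):
        # Illustrative ratio of full-sequence complexity to the cumulative
        # complexity of nested subsequences of length 2, ..., N - 1
        # (the paper's index may be normalized differently).
        full = path_spectral_radius(N)
        nested = sum(path_spectral_radius(k) for k in range(2, N))
        return full / nested

    for N in (4, 8, 16, 32):
        print(N, round(path_spectral_radius(N), 4), round(embedded_complexity(N), 4))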

11 pages, 2914 KiB  
Article
Entropic Dynamics of Mutations in SARS-CoV-2 Genomic Sequences
by Marco Favretti
Entropy 2024, 26(2), 163; https://doi.org/10.3390/e26020163 - 14 Feb 2024
Viewed by 945
Abstract
In this paper, we investigate a certain class of mutations in genomic sequences by studying the evolution of the entropy and relative entropy associated with the base frequencies of a given genomic sequence. Although the method is, in principle, applicable to any randomly varying sequence, the case of the SARS-CoV-2 RNA genome is particularly interesting to analyze due to the richness of the available sequence database, which contains more than a million sequences. Our model is able to track known features of the mutation dynamics, such as the Cytosine–Thymine bias, but also reveals new features of the virus mutation dynamics. We show that these new findings can be studied using an approach that combines the mean-field approximation of a Markov dynamics with a stochastic thermodynamics framework.
(This article belongs to the Special Issue Entropy and Information in Biological Systems)
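The analysis described in the abstract tracks the entropy and relative entropy of the base frequencies of a genomic sequence. A minimal sketch of those two quantities is given below, using short toy sequences in place of real SARS-CoV-2 data and natural logarithms; it is illustrative only and does not reproduce the paper's mean-field Markov or stochastic thermodynamics analysis.

    import numpy as np
    from collections import Counter

    def base_frequencies(seq):
        # Empirical frequencies of the four bases A, C, G, T in a sequence
        counts = Counter(seq.upper())
        total = sum(counts[b] for b in "ACGT")
        return np.array([counts[b] / total for b in "ACGT"])

    def shannon_entropy(p):
        p = p[p > 0]
        return float(-np.sum(p * np.log(p)))

    def relative_entropy(p, q):
        # Kullback-Leibler divergence D(p || q); assumes q > 0 wherever p > 0
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    reference = base_frequencies("ATGCGTACGTTAGCCA")  # toy stand-in for a reference genome
    variant   = base_frequencies("ATGTGTACTTTAGCCA")  # toy stand-in for a mutated sequence

    print(shannon_entropy(variant), relative_entropy(variant, reference))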
