
Bayesian Inference and Mathematical Modeling in Complex Biological Systems

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Entropy and Biology".

Deadline for manuscript submissions: 15 June 2025 | Viewed by 4298

Special Issue Editor


Guest Editor
Department of Applied Mathematics and Computer Science, Technical University of Denmark, 2800 Kongens Lyngby, Denmark
Interests: machine learning; data science; complex networks; non-parametric Bayesian inference; neuroimaging

Special Issue Information

Dear Colleagues,

Bayesian inference provides a principled foundation for modeling biological systems while accounting for uncertainty. In particular, Bayesian modeling procedures provide a means of imposing prior knowledge and, by explicitly accounting for noise and parameter uncertainty, offer added robustness compared to conventional maximum likelihood-based estimation procedures. Furthermore, Bayesian inference provides principled tools for model assessment, guiding model selection and checking.

The aim of this Special Issue is to highlight the use of Bayesian inference for the modeling of complex biological systems. Data pertaining to complex biological systems are typically noisy, and the parameters of the mathematical models used to characterize these systems are subject to uncertainty. This Special Issue will highlight how mathematical modeling procedures endowed with uncertainty quantification through Bayesian inference can provide important tools for characterizing the structure of complex biological systems and furthering our understanding of them in the face of uncertainty.

Mathematical models of complex biological systems include matrix and tensor factorization-based techniques, statistical network modeling approaches, simulation models, and deep learning methodologies. Furthermore, complex biological datasets may be only partially observed, heterogeneous, and composed of multiple sources of information that must be combined. This Special Issue will highlight how Bayesian inference procedures are useful across these disparate contexts of mathematical modeling of complex biological systems.
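As a toy illustration of the added robustness the paragraphs above describe (our own example, not drawn from any paper in this issue): with a conjugate Beta prior on a success probability, the Bayesian posterior both regularizes the estimate from few noisy observations and quantifies the remaining uncertainty, whereas maximum likelihood yields only a point estimate.

```python
import numpy as np

# Hypothetical setting: estimating a success probability from only a few
# noisy binary observations (e.g., whether a gene is expressed in a sample).
rng = np.random.default_rng(0)
data = rng.binomial(1, 0.7, size=5)  # just 5 observations
k, n = int(data.sum()), data.size

# Maximum likelihood: a point estimate with no uncertainty attached.
p_mle = k / n

# Bayesian: a weak Beta(2, 2) prior; conjugacy gives a closed-form
# Beta(2 + k, 2 + n - k) posterior that quantifies uncertainty.
a_post, b_post = 2 + k, 2 + (n - k)
p_mean = a_post / (a_post + b_post)
p_var = (a_post * b_post) / ((a_post + b_post) ** 2 * (a_post + b_post + 1))

print(f"MLE: {p_mle:.2f}")
print(f"Posterior mean: {p_mean:.2f}, posterior sd: {p_var**0.5:.2f}")
```

The posterior mean is shrunk away from the extremes relative to the MLE, and the posterior variance makes the residual uncertainty explicit — the qualitative advantage over maximum likelihood noted above.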

The topics of this Special Issue include but are not limited to:

  • Bayesian inference in matrix and tensor-based decomposition methods for biological data;
  • Bayesian inference for the modeling of complex biological networks;
  • Bayesian inference for data fusion of complex biological data;
  • Bayesian deep learning for the modeling of biological data;
  • Bayesian model assessment in biological data modeling;
  • Bayesian optimization in complex biological systems.

Prof. Dr. Morten Mørup
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies is available on the MDPI website.

Published Papers (3 papers)


Research

17 pages, 5027 KiB  
Article
Ornstein–Uhlenbeck Adaptation as a Mechanism for Learning in Brains and Machines
by Jesús García Fernández, Nasir Ahmad and Marcel van Gerven
Entropy 2024, 26(12), 1125; https://doi.org/10.3390/e26121125 - 22 Dec 2024
Viewed by 1041
Abstract
Learning is a fundamental property of intelligent systems, observed across biological organisms and engineered systems. While modern intelligent systems typically rely on gradient descent for learning, the need for exact gradients and complex information flow makes its implementation in biological and neuromorphic systems challenging. This has motivated the exploration of alternative learning mechanisms that can operate locally and do not rely on exact gradients. In this work, we introduce a novel approach that leverages noise in the parameters of the system and global reinforcement signals. Using an Ornstein–Uhlenbeck process with adaptive dynamics, our method balances exploration and exploitation during learning, driven by deviations from error predictions, akin to reward prediction error. Operating in continuous time, Ornstein–Uhlenbeck adaptation (OUA) is proposed as a general mechanism for learning in dynamic, time-evolving environments. We validate our approach across a range of different tasks, including supervised learning and reinforcement learning in feedforward and recurrent systems. Additionally, we demonstrate that it can perform meta-learning, adjusting hyper-parameters autonomously. Our results indicate that OUA provides a promising alternative to traditional gradient-based methods, with potential applications in neuromorphic computing. It also hints at a possible mechanism for noise-driven learning in the brain, where stochastic neurotransmitter release may guide synaptic adjustments.
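The gist of the adaptive Ornstein–Uhlenbeck dynamics the abstract describes can be caricatured in a few lines. The sketch below is our own loose reading of that description — all coefficients and the error-prediction update are invented for illustration, and it is not the paper's algorithm.

```python
import numpy as np

# Illustrative sketch: a scalar weight w learns y = w_true * x via OU
# dynamics on w, guided by a global reward-prediction-error-like signal.
rng = np.random.default_rng(1)
w_true, dt = 2.0, 0.1
w, mu = 0.0, 0.0          # current weight and the OU mean (drift target)
err_pred = 1.0            # running prediction of the error level
theta, sigma, eta = 1.0, 0.3, 0.5   # invented coefficients

for _ in range(2000):
    x = rng.normal()
    err = (w * x - w_true * x) ** 2      # instantaneous global error signal
    rpe = err_pred - err                 # positive -> better than expected
    mu += eta * rpe * (w - mu) * dt      # pull the OU mean toward good weights
    err_pred += 0.1 * (err - err_pred)   # update the error prediction
    # OU step: drift toward mu plus exploratory noise in the parameter
    w += theta * (mu - w) * dt + sigma * np.sqrt(dt) * rng.normal()

print(f"learned w after OU adaptation: {w:.2f} (target {w_true})")
```

The noise term explores parameter space; when a perturbed weight yields lower error than predicted, the drift target is pulled toward it — the exploration/exploitation balance the abstract attributes to the reward-prediction-error signal.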

24 pages, 31769 KiB  
Article
Probabilistic PARAFAC2
by Philip J. H. Jørgensen, Søren F. Nielsen, Jesper L. Hinrich, Mikkel N. Schmidt, Kristoffer H. Madsen and Morten Mørup
Entropy 2024, 26(8), 697; https://doi.org/10.3390/e26080697 - 17 Aug 2024
Cited by 2 | Viewed by 1000
Abstract
The Parallel Factor Analysis 2 (PARAFAC2) is a multimodal factor analysis model suitable for analyzing multi-way data when one of the modes has incomparable observation units, for example, because of differences in signal sampling or batch sizes. A fully probabilistic treatment of the PARAFAC2 is desirable to improve robustness to noise and provide a principled approach for determining the number of factors, but challenging because direct model fitting requires that factor loadings be decomposed into a shared matrix specifying how the components are consistently co-expressed across samples and sample-specific orthogonality-constrained component profiles. We develop two probabilistic formulations of the PARAFAC2 model along with variational Bayesian procedures for inference: In the first approach, the mean values of the factor loadings are orthogonal leading to closed form variational updates, and in the second, the factor loadings themselves are orthogonal using a matrix Von Mises–Fisher distribution. We contrast our probabilistic formulations to the conventional direct fitting algorithm based on maximum likelihood on synthetic data and real fluorescence spectroscopy and gas chromatography–mass spectrometry data showing that the probabilistic formulations are more robust to noise and model order misspecification. The probabilistic PARAFAC2, thus, forms a promising framework for modeling multi-way data accounting for uncertainty.
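The orthogonality-constrained loading structure the abstract refers to can be made concrete with a small synthetic-data sketch (notation and sizes are ours, not the paper's): each slice is X_k ≈ U_k diag(c_k) Vᵀ with U_k = Q_k H for an orthonormal Q_k, so the cross-product U_kᵀU_k = HᵀH is shared across slices even when the slices have different numbers of rows.

```python
import numpy as np

rng = np.random.default_rng(2)
R, J = 3, 6                       # number of components, second-mode size
H = rng.normal(size=(R, R))       # shared coupling matrix
V = rng.normal(size=(J, R))       # shared loading matrix
U_list, X_list = [], []
for n_k in (8, 10, 12):           # incomparable first-mode sizes per slice
    Q, _ = np.linalg.qr(rng.normal(size=(n_k, R)))  # orthonormal columns
    c_k = rng.uniform(0.5, 2.0, size=R)             # slice-specific weights
    U_k = Q @ H                   # slice loadings share structure via H
    U_list.append(U_k)
    X_list.append(U_k @ np.diag(c_k) @ V.T)

# The defining PARAFAC2 invariant: U_k^T U_k is identical for every slice.
for U_k in U_list:
    assert np.allclose(U_k.T @ U_k, H.T @ H)
```

Fitting the model (whether by direct maximum likelihood or the variational Bayesian procedures developed in the paper) amounts to recovering H, V, the c_k, and the orthonormal Q_k from noisy observations of such slices.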

27 pages, 5652 KiB  
Article
Robust Inference of Dynamic Covariance Using Wishart Processes and Sequential Monte Carlo
by Hester Huijsdens, David Leeftink, Linda Geerligs and Max Hinne
Entropy 2024, 26(8), 695; https://doi.org/10.3390/e26080695 - 16 Aug 2024
Viewed by 1245
Abstract
Several disciplines, such as econometrics, neuroscience, and computational psychology, study the dynamic interactions between variables over time. A Bayesian nonparametric model known as the Wishart process has been shown to be effective in this situation, but its inference remains highly challenging. In this work, we introduce a Sequential Monte Carlo (SMC) sampler for the Wishart process, and show how it compares to conventional inference approaches, namely MCMC and variational inference. Using simulations, we show that SMC sampling results in the most robust estimates and out-of-sample predictions of dynamic covariance. SMC especially outperforms the alternative approaches when using composite covariance functions with correlated parameters. We further demonstrate the practical applicability of our proposed approach on a dataset of clinical depression (n=1), and show how using an accurate representation of the posterior distribution can be used to test for dynamics in covariance. Full article
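The Wishart process construction underlying the paper can be sketched briefly (our notation and parameter choices, not the paper's code): a time-varying covariance Σ(t) = Σ_d f_d(t) f_d(t)ᵀ built from Gaussian process draws is positive semi-definite at every t by construction.

```python
import numpy as np

rng = np.random.default_rng(3)
T, D, nu = 50, 2, 3                 # time points, dimension, degrees of freedom
t = np.linspace(0, 1, T)
# Squared-exponential GP kernel over time (lengthscale 0.1, chosen arbitrarily)
K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 0.1**2)
L = np.linalg.cholesky(K + 1e-8 * np.eye(T))   # jitter for numerical stability
F = L @ rng.normal(size=(T, D * nu))           # i.i.d. GP draws, one per column
F = F.reshape(T, nu, D)
# Sigma[t] = sum_n f_n(t) f_n(t)^T  -- a draw from a Wishart process
Sigma = np.einsum('tnd,tne->tde', F, F)

# Positive semi-definiteness holds at every time point by construction.
print("min eigenvalue over time:", np.linalg.eigvalsh(Sigma).min())
```

Inference — by MCMC, variational methods, or the SMC sampler the paper proposes — then targets the posterior over the latent GP draws (and kernel parameters) given observed multivariate time series.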
