New Advances in Probabilistic Machine Learning and Bayesian Predictive Methods

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "Probability and Statistics".

Deadline for manuscript submissions: closed (30 April 2024)

Special Issue Editors

Business School, University of Edinburgh, Edinburgh EH8 9JS, UK
Interests: Gaussian processes; Bayesian non-parametrics; time series; human mobility; algorithmic fairness
School of Computing and Mathematical Sciences, University of Leicester, Leicester LE1 7RH, UK
Interests: functional data analysis; longitudinal data analysis; Gaussian process modelling; computational statistics; mortality modelling and forecasting

Special Issue Information

Dear Colleagues,

We are delighted to welcome you to this Special Issue of Mathematics focusing on “New Advances in Probabilistic Machine Learning and Bayesian Predictive Methods.” This Issue aims to showcase cutting-edge research and novel developments in the ever-evolving field of probabilistic machine learning, while highlighting the growing importance of Bayesian approaches in tackling complex, real-world problems.

In this Special Issue, we will explore a wide range of topics, including but not limited to scalable Bayesian inference algorithms, Bayesian deep learning, deep generative models, Gaussian processes, uncertainty quantification, computational methods for Bayesian inference, and Bayesian optimization. We aim to bring together a diverse array of interdisciplinary perspectives, fostering fruitful discussions and collaborations among researchers from academia and industry.

We cordially invite researchers to contribute their latest findings and insights to this Special Issue, pushing the boundaries of our understanding in this fascinating domain. To the readers, we hope that this Special Issue will serve as an invaluable resource for staying up-to-date with the most recent developments in probabilistic machine learning and Bayesian predictive methods.

We eagerly look forward to your contributions and engagement in this intellectual journey!

Dr. Zexun Chen
Dr. Bo Wang
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • scalable Bayesian inference
  • Gaussian process
  • Bayesian optimization
  • uncertainty quantification
  • probabilistic machine learning
  • non-parametric Bayesian modelling

Published Papers (1 paper)


Research

18 pages, 2186 KiB  
Article
Bayesian Network Structure Learning Using Improved A* with Constraints from Potential Optimal Parent Sets
by Chuchao He, Ruohai Di and Xiangyuan Tan
Mathematics 2023, 11(15), 3344; https://doi.org/10.3390/math11153344 - 30 Jul 2023
Abstract
Learning the structure of a Bayesian network while balancing the efficiency and accuracy of learning has long been an active research topic. This paper proposes two constraints to address the inefficiency of the A* algorithm, an exact learning algorithm, when searching larger networks. On the one hand, the parent–child set constraints reduce the number of potential optimal parent sets. On the other hand, the path constraints, derived from the potential optimal parent sets, restrict the search process of the A* algorithm. Both constraints are based on the potential optimal parent sets. Experiments show that the two constraints significantly improve the time efficiency of the A* algorithm and its ability to search larger Bayesian networks. In addition, compared with the state-of-the-art globally optimal Bayesian network learning using integer linear programming (GOBNILP) algorithm and the max–min hill-climbing (MMHC) algorithm, the constraint-enhanced A* algorithm still performs well in most cases.
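The order-graph A* search that the paper builds on can be sketched as follows. This is an illustrative sketch only, not the authors' constrained algorithm: a state is the set of variables whose parent sets have already been fixed, edge costs come from precomputed local scores (lower is better), and the admissible heuristic lets each remaining variable take its globally cheapest parent set while ignoring acyclicity. All names here (`a_star_order_graph`, the toy score tables) are hypothetical.

```python
import heapq
from itertools import count

def a_star_order_graph(variables, local_scores):
    """Exact BN structure learning as a shortest path in the order graph.

    A state is the frozenset of variables whose parent sets are already
    fixed; the goal is the full variable set.  local_scores[x] maps
    frozenset(parents) -> cost (lower is better) and must contain the
    empty parent set so every variable has at least one option.
    Returns the total score of an optimal network.
    """
    def best_score(x, allowed):
        # cheapest parent set for x drawn only from already-placed variables
        return min(c for p, c in local_scores[x].items() if p <= allowed)

    def heuristic(state):
        # admissible: each remaining variable takes its globally cheapest
        # parent set, ignoring acyclicity -- never overestimates the cost
        return sum(min(local_scores[x].values())
                   for x in variables if x not in state)

    goal, start = frozenset(variables), frozenset()
    tie = count()  # tie-breaker so the heap never compares frozensets
    frontier = [(heuristic(start), next(tie), 0.0, start)]
    best_g = {start: 0.0}
    while frontier:
        _, _, g, state = heapq.heappop(frontier)
        if g > best_g.get(state, float("inf")):
            continue  # stale heap entry
        if state == goal:
            return g
        for x in variables:
            if x in state:
                continue
            g2 = g + best_score(x, state)  # place x after everything in state
            nxt = state | {x}
            if g2 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g2
                heapq.heappush(
                    frontier, (g2 + heuristic(nxt), next(tie), g2, nxt))
    return None
```

The paper's contribution can be read against this baseline: its parent–child set constraints would shrink each `local_scores[x]` table before the search, and its path constraints would prune successors inside the expansion loop.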
