Reprint

Approximate Bayesian Inference

Edited by
May 2022
508 pages
  • ISBN 978-3-0365-3789-4 (Hardback)
  • ISBN 978-3-0365-3790-0 (PDF)

This book is a reprint of the Special Issue Approximate Bayesian Inference.

Summary

Extremely popular for statistical inference, Bayesian methods are also becoming increasingly popular in machine learning and artificial intelligence. Bayesian estimators are often implemented by Monte Carlo methods, such as the Metropolis–Hastings algorithm or the Gibbs sampler. These algorithms target the exact posterior distribution. However, many modern statistical models are simply too complex for such methodologies. In machine learning, the volume of the data used in practice makes Monte Carlo methods too slow to be useful. On the other hand, these applications often do not require exact knowledge of the posterior. This has motivated the development of a new generation of algorithms that are fast enough to handle huge datasets but that often target only an approximation of the posterior. This book gathers 18 research papers written by Approximate Bayesian Inference specialists and provides an overview of the recent advances in these algorithms. This includes optimization-based methods (such as variational approximations) and simulation-based methods (such as ABC or Monte Carlo algorithms). The theoretical aspects of Approximate Bayesian Inference are also covered, in particular PAC–Bayes bounds and regret analysis. Applications to challenging computational problems in astrophysics, finance, medical data analysis, and computer vision are also presented.
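
To make the distinction between exact and approximate targets concrete, the following Python sketch (purely illustrative, not code from any chapter of the book) implements the simplest simulation-based method named above, rejection ABC, on a toy Gaussian-mean problem. The model, the tolerance epsilon, and the helper abc_rejection are assumptions chosen only for illustration.

    # Illustrative sketch only: rejection ABC for the mean of a Gaussian with
    # known variance. Prior draws are kept only when data simulated under them
    # reproduce the observed summary statistic to within a tolerance, so the
    # accepted draws target an approximation of the posterior, not the exact one.
    import numpy as np

    rng = np.random.default_rng(0)

    # "Observed" data: 100 points from N(true_mu, 1); the summary statistic is the sample mean.
    true_mu = 2.0
    observed = rng.normal(true_mu, 1.0, size=100)
    s_obs = observed.mean()

    def abc_rejection(n_samples=1000, epsilon=0.05):
        accepted = []
        while len(accepted) < n_samples:
            mu = rng.normal(0.0, 5.0)                    # draw mu from the N(0, 5^2) prior
            simulated = rng.normal(mu, 1.0, size=100)    # simulate data under mu
            if abs(simulated.mean() - s_obs) < epsilon:  # accept if summaries are close
                accepted.append(mu)
        return np.array(accepted)

    draws = abc_rejection()
    print(f"approximate posterior mean: {draws.mean():.3f}, sd: {draws.std():.3f}")

Shrinking the tolerance epsilon tightens the approximation to the true posterior but increases the number of rejected simulations; this accuracy-versus-cost trade-off is exactly what motivates the faster approximate algorithms surveyed in this book.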

Format
  • Hardback
License
© 2022 by the authors; CC BY-NC-ND license
Keywords
bifurcation; dynamical systems; Edwards–Sokal coupling; mean-field; Kullback–Leibler divergence; variational inference; Bayesian statistics; machine learning; variational approximations; PAC-Bayes; expectation propagation; Markov chain Monte Carlo; Langevin Monte Carlo; sequential Monte Carlo; Laplace approximations; approximate Bayesian computation (ABC); Gibbs posterior; MCMC; stochastic gradients; neural networks; differential evolution; Markov kernels; discrete state space; ergodicity; Markov chain; probably approximately correct; variational Bayes; Bayesian inference; Riemann manifold Hamiltonian Monte Carlo; integrated nested Laplace approximation; fixed-form variational Bayes; stochastic volatility; network modeling; network variability; Stiefel manifold; MCMC-SAEM; data imputation; Bethe free energy; factor graphs; message passing; variational free energy; variational message passing; differential privacy (DP); sparse vector technique (SVT); Gaussian; particle flow; variable flow; Langevin dynamics; Hamiltonian Monte Carlo; non-reversible dynamics; control variates; thinning; meta-learning; hyperparameters; priors; online learning; online optimization; gradient descent; statistical learning theory; PAC-Bayes theory; deep learning; generalisation bounds; Bayesian sampling; Monte Carlo integration; no free lunch theorems; sequential learning; principal curves; data streams; regret bounds; greedy algorithm; sleeping experts; entropy; robustness; statistical mechanics; complex systems