Article

Entropy and the Kullback–Leibler Divergence for Bayesian Networks: Computational Complexity and Efficient Implementation

Istituto Dalle Molle di Studi Sull’Intelligenza Artificiale (IDSIA), 6900 Lugano, Switzerland
Algorithms 2024, 17(1), 24; https://doi.org/10.3390/a17010024
Submission received: 29 November 2023 / Revised: 31 December 2023 / Accepted: 4 January 2024 / Published: 6 January 2024
(This article belongs to the Special Issue Bayesian Networks and Causal Reasoning)

Abstract

Bayesian networks (BNs) are a foundational model in machine learning and causal inference. Their graphical structure makes high-dimensional problems tractable by dividing them into a sparse collection of smaller ones; it also underlies Judea Pearl's formalization of causality and determines the explainability and interpretability of BNs. Despite their popularity, the literature offers almost no guidance on how to compute Shannon's entropy and the Kullback–Leibler (KL) divergence for BNs under their most common distributional assumptions. In this paper, we provide computationally efficient algorithms for both by leveraging the graphical structure of BNs, and we illustrate them with a complete set of numerical examples. In the process, we show that it is possible to reduce the computational complexity of the KL divergence from cubic to quadratic for Gaussian BNs.
Keywords: Bayesian networks; Shannon entropy; Kullback–Leibler divergence
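As a concrete illustration of the decomposition the abstract alludes to, Shannon's entropy of a discrete BN factorizes along the graph: for a two-node network A → B, H(A, B) = H(A) + Σ_a P(a) H(B | A = a), so the joint entropy never needs to be enumerated explicitly. The sketch below uses small hypothetical CPTs (all numbers and variable names are illustrative, not taken from the paper) and checks the factorized computation against the brute-force joint entropy.

```python
import math
from itertools import product

# Hypothetical CPTs for a toy discrete BN A -> B with binary variables.
p_a = {0: 0.3, 1: 0.7}                       # P(A)
p_b_given_a = {0: {0: 0.9, 1: 0.1},          # P(B | A = 0)
               1: {0: 0.4, 1: 0.6}}          # P(B | A = 1)

def h(dist):
    """Shannon entropy (in nats) of a dict mapping outcomes to probabilities."""
    return -sum(p * math.log(p) for p in dist.values() if p > 0)

# Entropy via the BN factorization:
#   H(A, B) = H(A) + sum_a P(a) * H(B | A = a)
h_factorized = h(p_a) + sum(p_a[a] * h(p_b_given_a[a]) for a in p_a)

# Brute-force joint entropy over all four outcomes, for comparison.
joint = {(a, b): p_a[a] * p_b_given_a[a][b] for a, b in product((0, 1), repeat=2)}
h_joint = h(joint)

assert abs(h_factorized - h_joint) < 1e-12
print(h_factorized)
```

The same pattern extends to larger networks: each node contributes its conditional entropy averaged over its parents' configurations, which is what makes entropy computation scale with the sparsity of the graph rather than with the size of the joint distribution.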

Share and Cite

MDPI and ACS Style

Scutari, M. Entropy and the Kullback–Leibler Divergence for Bayesian Networks: Computational Complexity and Efficient Implementation. Algorithms 2024, 17, 24. https://doi.org/10.3390/a17010024


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
