*Article* **A Generic Formula and Some Special Cases for the Kullback–Leibler Divergence between Central Multivariate Cauchy Distributions**

**Nizar Bouhlel <sup>1,\*</sup> and David Rousseau <sup>2</sup>**


**Abstract:** This paper introduces a closed-form expression for the Kullback–Leibler divergence (KLD) between two central multivariate Cauchy distributions (MCDs), which have recently been used in various signal and image processing applications where non-Gaussian models are needed. In this overview, the MCDs are surveyed, and some new results and properties of the KLD are derived and discussed. In addition, the KLD for MCDs is shown to be expressible in terms of the Lauricella D-hypergeometric series *F*<sub>*D*</sub><sup>(*p*)</sup>. Finally, a comparison is made between the Monte Carlo sampling approximation of the KLD and the numerical value of its closed-form expression. The Monte Carlo approximation of the KLD is shown to converge to its theoretical value as the number of samples goes to infinity.

**Keywords:** Multivariate Cauchy distribution (MCD); Kullback–Leibler divergence (KLD); multiple power series; Lauricella D-hypergeometric series
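The Monte Carlo approximation mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's code: it assumes the standard density of a central MCD (a multivariate Student-*t* with one degree of freedom and scatter matrix Σ) and estimates KLD(*f*<sub>1</sub> ∥ *f*<sub>2</sub>) as the sample mean of the log-density ratio under samples drawn from *f*<sub>1</sub>. All function names here (`log_mcd_pdf`, `sample_mcd`, `kld_monte_carlo`) are hypothetical.

```python
import numpy as np
from scipy.special import gammaln


def log_mcd_pdf(x, sigma):
    """Log-density of a central multivariate Cauchy distribution.

    x: (n, p) array of sample points; sigma: (p, p) scatter matrix.
    Density: Gamma((1+p)/2) / (Gamma(1/2) pi^{p/2} |sigma|^{1/2})
             * (1 + x' sigma^{-1} x)^{-(1+p)/2}
    """
    p = sigma.shape[0]
    _, logdet = np.linalg.slogdet(sigma)
    # Quadratic form x' sigma^{-1} x for each row of x
    q = np.einsum('ni,ij,nj->n', x, np.linalg.inv(sigma), x)
    return (gammaln((1 + p) / 2) - gammaln(0.5) - (p / 2) * np.log(np.pi)
            - 0.5 * logdet - ((1 + p) / 2) * np.log1p(q))


def sample_mcd(n, sigma, rng):
    """Draw n samples from a central MCD via the normal/chi-square mixture
    representation of a multivariate t with 1 degree of freedom."""
    p = sigma.shape[0]
    z = rng.multivariate_normal(np.zeros(p), sigma, size=n)
    u = rng.chisquare(1, size=n)
    return z / np.sqrt(u)[:, None]


def kld_monte_carlo(sigma1, sigma2, n=100_000, seed=0):
    """Monte Carlo estimate of KLD(f1 || f2) between two central MCDs:
    the average of log f1(x) - log f2(x) over samples x ~ f1."""
    rng = np.random.default_rng(seed)
    x = sample_mcd(n, sigma1, rng)
    return np.mean(log_mcd_pdf(x, sigma1) - log_mcd_pdf(x, sigma2))
```

As the abstract notes, this estimator converges to the closed-form value as `n` grows; in practice the estimate stabilizes but retains sampling noise of order 1/√n.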
