**2. Multivariate Cauchy Distribution and Kullback–Leibler Divergence**

Let $\mathbf{X}$ be a random vector of $\mathbb{R}^p$ that follows the MCD, characterized by the following probability density function (pdf) [2]

$$f_{\mathbf{X}}(\mathbf{x}|\boldsymbol{\mu}, \boldsymbol{\Sigma}, p) = \frac{\Gamma(\frac{1+p}{2})}{\pi^{\frac{p}{2}}\Gamma(\frac{1}{2})} \frac{1}{|\boldsymbol{\Sigma}|^{\frac{1}{2}}} \frac{1}{[1 + (\mathbf{x} - \boldsymbol{\mu})^{T}\boldsymbol{\Sigma}^{-1}(\mathbf{x} - \boldsymbol{\mu})]^{\frac{1+p}{2}}}.\tag{1}$$
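As a side note, the MCD coincides with the multivariate Student-$t$ distribution with one degree of freedom, so (1) can be cross-checked numerically against SciPy's `multivariate_t`. The following sketch (an illustration, not part of the derivation; the test point and scale matrix are arbitrary examples) evaluates the log of (1) directly:

```python
import numpy as np
from scipy.special import gammaln
from scipy.stats import multivariate_t  # MCD = multivariate t with df = 1

def mcd_logpdf(x, mu, Sigma):
    """Log-density of the multivariate Cauchy distribution, Eq. (1)."""
    p = len(mu)
    d = x - mu
    quad = d @ np.linalg.solve(Sigma, d)  # (x - mu)^T Sigma^{-1} (x - mu)
    _, logdet = np.linalg.slogdet(Sigma)  # ln |Sigma|, numerically stable
    return (gammaln((1 + p) / 2) - (p / 2) * np.log(np.pi) - gammaln(0.5)
            - 0.5 * logdet - ((1 + p) / 2) * np.log1p(quad))

# Arbitrary example: p = 2, a valid (symmetric positive-definite) scale matrix.
mu = np.zeros(2)
Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
x = np.array([0.3, -1.2])
ref = multivariate_t(loc=mu, shape=Sigma, df=1).logpdf(x)
assert np.isclose(mcd_logpdf(x, mu, Sigma), ref)
```

The `slogdet`/`solve` calls avoid forming $\boldsymbol{\Sigma}^{-1}$ or $|\boldsymbol{\Sigma}|$ explicitly, which is the numerically preferred route for evaluating densities of this form.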

This holds for any $\mathbf{x} \in \mathbb{R}^p$, where $p$ is the dimensionality of the sample space, $\boldsymbol{\mu}$ is the location vector, $\boldsymbol{\Sigma}$ is a symmetric, positive-definite $(p \times p)$ scale matrix and $\Gamma(\cdot)$ is the Gamma function. Let $\mathbf{X}^1$ and $\mathbf{X}^2$ be two random vectors that follow central MCDs with pdfs $f_{\mathbf{X}^1}(\mathbf{x}|\boldsymbol{\Sigma}_1, p) = f_{\mathbf{X}^1}(\mathbf{x}|\mathbf{0}, \boldsymbol{\Sigma}_1, p)$ and $f_{\mathbf{X}^2}(\mathbf{x}|\boldsymbol{\Sigma}_2, p) = f_{\mathbf{X}^2}(\mathbf{x}|\mathbf{0}, \boldsymbol{\Sigma}_2, p)$ given by (1). The KLD provides an asymmetric measure of the similarity of two pdfs. Indeed, the KLD between the two central MCDs is given by

$$\text{KL}(\mathbf{X}^1||\mathbf{X}^2) = \int_{\mathbb{R}^p} \ln \left( \frac{f_{\mathbf{X}^1}(\mathbf{x}|\boldsymbol{\Sigma}_1, p)}{f_{\mathbf{X}^2}(\mathbf{x}|\boldsymbol{\Sigma}_2, p)} \right) f_{\mathbf{X}^1}(\mathbf{x}|\boldsymbol{\Sigma}_1, p)\, \text{d}\mathbf{x} \tag{2}$$

$$= E_{\mathbf{X}^1} \{ \ln f_{\mathbf{X}^1}(\mathbf{X}) \} - E_{\mathbf{X}^1} \{ \ln f_{\mathbf{X}^2}(\mathbf{X}) \}. \tag{3}$$
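The expectation form (3) can be approximated numerically by a plain Monte Carlo average, which is useful as a sanity check on any closed-form result. A minimal sketch, assuming sampling via SciPy's `multivariate_t` with one degree of freedom (the scale matrices below are arbitrary examples):

```python
import numpy as np
from scipy.stats import multivariate_t  # MCD = multivariate t with df = 1

rng = np.random.default_rng(0)
Sigma1 = np.array([[2.0, 0.5], [0.5, 1.0]])  # example scale matrices
Sigma2 = np.eye(2)
f1 = multivariate_t(loc=np.zeros(2), shape=Sigma1, df=1)
f2 = multivariate_t(loc=np.zeros(2), shape=Sigma2, df=1)

# Eq. (3): KL = E_{X~f1}[ln f1(X)] - E_{X~f1}[ln f2(X)], as a sample mean.
X = f1.rvs(size=100_000, random_state=rng)
kl_hat = np.mean(f1.logpdf(X) - f2.logpdf(X))
assert kl_hat > 0.0  # KLD is non-negative, strictly positive for Sigma1 != Sigma2
```

Note that although the MCD has no finite mean or covariance, the log-density ratio has finite moments, so the Monte Carlo estimator is well behaved.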

Since the KLD is the relative entropy defined as the difference between the cross-entropy and the entropy, we have the following relation:

$$\text{KL}(\mathbf{X}^1||\mathbf{X}^2) = H(f_{\mathbf{X}^1}, f_{\mathbf{X}^2}) - H(f_{\mathbf{X}^1}) \tag{4}$$

where $H(f_{\mathbf{X}^1}, f_{\mathbf{X}^2}) = -E_{\mathbf{X}^1}\{\ln f_{\mathbf{X}^2}(\mathbf{X})\}$ denotes the cross-entropy and $H(f_{\mathbf{X}^1}) = -E_{\mathbf{X}^1}\{\ln f_{\mathbf{X}^1}(\mathbf{X})\}$ the entropy. Therefore, the determination of the KLD requires the expressions of the entropy and the cross-entropy. It should be noted that the smaller $\text{KL}(\mathbf{X}^1||\mathbf{X}^2)$, the more similar are $f_{\mathbf{X}^1}(\mathbf{x}|\boldsymbol{\Sigma}_1, p)$ and $f_{\mathbf{X}^2}(\mathbf{x}|\boldsymbol{\Sigma}_2, p)$. The symmetric KL similarity measure between $\mathbf{X}^1$ and $\mathbf{X}^2$ is $d_{\text{KL}}(\mathbf{X}^1, \mathbf{X}^2) = \text{KL}(\mathbf{X}^1||\mathbf{X}^2) + \text{KL}(\mathbf{X}^2||\mathbf{X}^1)$. In order to compute the KLD, we have to derive the analytical expressions of $E_{\mathbf{X}^1}\{\ln f_{\mathbf{X}^1}(\mathbf{X})\}$ and $E_{\mathbf{X}^1}\{\ln f_{\mathbf{X}^2}(\mathbf{X})\}$, which depend, respectively, on $E_{\mathbf{X}^1}\{\ln[1 + \mathbf{X}^T\boldsymbol{\Sigma}_1^{-1}\mathbf{X}]\}$ and $E_{\mathbf{X}^1}\{\ln[1 + \mathbf{X}^T\boldsymbol{\Sigma}_2^{-1}\mathbf{X}]\}$. Consequently, the closed-form expression of the KLD between two zero-mean MCDs is given by

$$\text{KL}(\mathbf{X}^{1}||\mathbf{X}^{2}) = \frac{1}{2}\log\frac{|\boldsymbol{\Sigma}_{2}|}{|\boldsymbol{\Sigma}_{1}|} - \frac{1+p}{2}\left(E_{\mathbf{X}^{1}}\{\ln[1+\mathbf{X}^{T}\boldsymbol{\Sigma}_{1}^{-1}\mathbf{X}]\} - E_{\mathbf{X}^{1}}\{\ln[1+\mathbf{X}^{T}\boldsymbol{\Sigma}_{2}^{-1}\mathbf{X}]\}\right). \tag{5}$$
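The algebra behind (5) (expanding the log-density ratio with (1), where the normalizing constants cancel except for the determinant term) can be verified numerically: estimating the two expectations by Monte Carlo and plugging them into (5) must reproduce the direct sample average of $\ln f_{\mathbf{X}^1} - \ln f_{\mathbf{X}^2}$ on the same draws. A sketch under the assumption that SciPy's `multivariate_t` with one degree of freedom is used for sampling (scale matrices are arbitrary examples):

```python
import numpy as np
from scipy.stats import multivariate_t  # MCD = multivariate t with df = 1

rng = np.random.default_rng(0)
p = 2
Sigma1 = np.array([[2.0, 0.5], [0.5, 1.0]])  # example scale matrices
Sigma2 = np.eye(p)
f1 = multivariate_t(loc=np.zeros(p), shape=Sigma1, df=1)
f2 = multivariate_t(loc=np.zeros(p), shape=Sigma2, df=1)
X = f1.rvs(size=50_000, random_state=rng)

# The two expectations appearing in Eq. (5), as sample means of ln[1 + X^T S^{-1} X].
q1 = np.einsum('ij,ij->i', X, np.linalg.solve(Sigma1, X.T).T)
q2 = np.einsum('ij,ij->i', X, np.linalg.solve(Sigma2, X.T).T)
E1, E2 = np.mean(np.log1p(q1)), np.mean(np.log1p(q2))

_, ld1 = np.linalg.slogdet(Sigma1)
_, ld2 = np.linalg.slogdet(Sigma2)
kl_eq5 = 0.5 * (ld2 - ld1) - (1 + p) / 2 * (E1 - E2)

# Same quantity straight from Eq. (3); the two must agree up to rounding,
# since Eq. (5) is an exact per-sample rewriting of the log-density ratio.
kl_direct = np.mean(f1.logpdf(X) - f2.logpdf(X))
assert np.isclose(kl_eq5, kl_direct, atol=1e-8)
```

This checks only the internal consistency of (5) with (1)-(3); the analytical values of the two expectations themselves are the subject of the following sections.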

To derive the expressions of these two expectations, some tools based on multiple power series are required. The next section presents the definitions and propositions used for this purpose.
