*2.7. Shannon Entropy*

Shannon entropy (see Shannon [16]) measures the amount of uncertainty in a random variable. It is defined as:

$$S(Z) = -\mathbb{E}\left(\log f(Z)\right).$$
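As an illustration of this definition, the entropy can be evaluated numerically for a density whose entropy is known in closed form. The sketch below (not part of the paper; the standard normal is used only as a familiar test case, with entropy $\frac{1}{2}\log(2\pi e)$) approximates $-\mathbb{E}(\log f(Z))$ by quadrature:

```python
# Numerical illustration of S(Z) = -E[log f(Z)].  The standard normal is
# used as a test case because its entropy has the known closed form
# (1/2) * log(2 * pi * e).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def shannon_entropy(pdf, lower, upper):
    """Approximate -E[log f(Z)] = -int f(z) log f(z) dz by quadrature."""
    value, _ = quad(lambda z: -pdf(z) * np.log(pdf(z)), lower, upper)
    return value

# Integrate over a finite range wide enough that the omitted tail mass
# is negligible (and log(pdf) stays finite in double precision).
numeric = shannon_entropy(norm.pdf, -30, 30)
closed_form = 0.5 * np.log(2 * np.pi * np.e)
print(numeric, closed_form)
```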

It can then be verified that the Shannon entropy for the GTPN model is

$$S(Z) = \log\left(\frac{\sqrt{2\pi}\,\sigma^{\alpha}\Phi(\lambda)}{\alpha}\right) - (\alpha - 1)\mathbb{E}\left(\log(Z)\right) + \frac{\mu_{2\alpha}}{2\sigma^{2\alpha}} - \frac{\lambda\mu_{\alpha}}{\sigma^{\alpha}} + \frac{\lambda^2}{2},$$

where $\mu_r = \mathbb{E}(Z^r)$ and $\mathbb{E}\left(\log(Z)\right) = \int_0^{\infty} \log(z) f(z; \sigma, \lambda, \alpha)\,dz$. Figure 5 shows the entropy curve for the GTPN(*σ* = 1.5, *λ*, *α*) model, considering different values for *λ* and *α*. We note that this function is increasing in *λ* and *α*. For *α* = 1, the Shannon entropy reduces to

$$S(Z) = \log\left(\sqrt{2\pi}\,\sigma\Phi(\lambda)\right) + \frac{\mu_2}{2\sigma^2} - \frac{\lambda\mu_1}{\sigma} + \frac{\lambda^2}{2},$$

which corresponds to the Shannon entropy for the TPN model; and for *α* = 1 and *λ* = 0, *S*(*Z*) further reduces to

$$S(Z) = \log\left(\sigma\sqrt{\frac{\pi}{2}}\right) + \frac{\mu_2}{2\sigma^2},$$

which corresponds to the Shannon entropy for the HN distribution.
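Since $\mathbb{E}(Z^2) = \sigma^2$ for the half-normal distribution, its entropy is $\log(\sigma\sqrt{\pi/2}) + \frac{1}{2}$, which can be cross-checked against SciPy's built-in differential entropy for the half-normal law (a quick sketch, with *σ* = 1.5 chosen arbitrarily):

```python
# Cross-check of the half-normal entropy against SciPy's built-in value.
# For HN(sigma), E(Z**2) = sigma**2, so the expression reduces to
# log(sigma * sqrt(pi / 2)) + 1/2.
import numpy as np
from scipy.stats import halfnorm

sigma = 1.5  # arbitrary scale for the check
closed = np.log(sigma * np.sqrt(np.pi / 2)) + 0.5
builtin = halfnorm(scale=sigma).entropy()
print(closed, builtin)
```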

**Figure 5.** Entropy for the *GTPN*(*σ* = 1, *α*, *λ*) model.
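The entropy expression above can also be checked numerically. The sketch below is an illustration, not the paper's code: it *assumes* the GTPN parametrization $f(z; \sigma, \lambda, \alpha) = \frac{\alpha}{\sigma^{\alpha}\Phi(\lambda)} z^{\alpha-1} \phi\!\left(\frac{z^{\alpha}}{\sigma^{\alpha}} - \lambda\right)$ for $z > 0$ (the density is defined earlier in the paper), computes $\mu_{\alpha}$, $\mu_{2\alpha}$, and $\mathbb{E}(\log Z)$ by quadrature, and compares the closed-form entropy with a direct evaluation of $-\mathbb{E}(\log f(Z))$:

```python
# Consistency check of the GTPN entropy expression.  The density below is
# an ASSUMED parametrization used only for illustration:
#   f(z; sigma, lambda, alpha)
#       = (alpha / (sigma**alpha * Phi(lam))) * z**(alpha - 1)
#         * phi(z**alpha / sigma**alpha - lam),   z > 0,
# with phi and Phi the standard normal pdf and cdf.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

sigma, lam, alpha = 1.5, 0.7, 2.0  # arbitrary parameter values

def pdf(z):
    return (alpha / (sigma**alpha * norm.cdf(lam)) * z**(alpha - 1)
            * norm.pdf(z**alpha / sigma**alpha - lam))

upper = 6.0  # the density is numerically negligible beyond this point

# Moments mu_r = E(Z**r) and E(log Z) by quadrature.
mu_a = quad(lambda z: z**alpha * pdf(z), 0, upper)[0]
mu_2a = quad(lambda z: z**(2 * alpha) * pdf(z), 0, upper)[0]
e_log = quad(lambda z: np.log(z) * pdf(z), 0, upper)[0]

# Closed-form entropy expression.
closed = (np.log(np.sqrt(2 * np.pi) * sigma**alpha * norm.cdf(lam) / alpha)
          - (alpha - 1) * e_log
          + mu_2a / (2 * sigma**(2 * alpha))
          - lam * mu_a / sigma**alpha
          + lam**2 / 2)

# Direct numerical evaluation of -E[log f(Z)].
direct = quad(lambda z: -pdf(z) * np.log(pdf(z)), 0, upper)[0]
print(closed, direct)
```

Under this assumed parametrization the two values agree, since expanding $\frac{1}{2}\mathbb{E}\big[(z^{\alpha}/\sigma^{\alpha} - \lambda)^2\big]$ reproduces the three moment terms of the closed form.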
