Average Entropy of Gaussian Mixtures
Abstract
1. Introduction
1.1. Gaussian Mixtures
1.2. Related Work
1.3. Contributions
2. Deriving the Series Expansion
2.1. Notation
2.2. Starting Point
- We first obtain a more compact form of the average entropy in Corollary 1 via a change in variables, in order to write it in terms of the more familiar expectation with respect to a multivariate Gaussian density. (A numerical sanity check of this quantity is sketched after this list.)
- We next perform another change in variables to obtain a diagonal form in the expectation with respect to the multivariate Gaussian. This matters because the new variables are then uncorrelated and hence independent, which simplifies many of the subsequent expressions.
- We evaluate the leading terms analytically, as shown in Theorem 2, where we obtain an expression that separates the leading contributions to the entropy from a quantity S to be defined. This step is important because the leading contributions contain terms that would make the Taylor series diverge; we therefore evaluate them before performing the expansion. For convenience, we also pull out other terms that cause no problems in the limit of small σ. The remaining expression S is then safe to expand for small σ.
- We perform a third change in variables to simplify S, and in Theorem 3 we obtain a Taylor series for it via a brute-force approach. The change in variables is necessary to keep the analytical expressions tractable.
- Finally, we provide a determinant-based approach to evaluate the power series for S, and we obtain a result to higher order in σ.
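The steps above refer to the average entropy without displaying it. As an illustration only, the following minimal Monte Carlo sketch estimates one natural reading of that quantity: the expected differential entropy of an equal-weight Gaussian mixture in R^n with covariance σ²I, averaged over i.i.d. standard-normal component centers. The function names (mixture_logpdf, avg_mixture_entropy) and the choice of center distribution are our assumptions, not definitions taken from the paper.

```python
import numpy as np

def mixture_logpdf(x, centers, sigma):
    """Log-density of an equal-weight Gaussian mixture with covariance sigma^2 * I."""
    n = centers.shape[1]
    sq = ((x[None, :] - centers) ** 2).sum(axis=1)            # squared distance to each center
    logs = -sq / (2 * sigma**2) - 0.5 * n * np.log(2 * np.pi * sigma**2)
    return np.logaddexp.reduce(logs) - np.log(len(centers))   # log-mean-exp over components

def avg_mixture_entropy(m, n, sigma, trials=50, samples=1000, seed=0):
    """Monte Carlo estimate of the average entropy over random center draws (assumed setup)."""
    rng = np.random.default_rng(seed)
    estimates = []
    for _ in range(trials):
        centers = rng.standard_normal((m, n))                 # assumption: i.i.d. N(0, I) centers
        idx = rng.integers(m, size=samples)                   # sample the mixture: pick a component,
        x = centers[idx] + sigma * rng.standard_normal((samples, n))  # then add Gaussian noise
        estimates.append(-np.mean([mixture_logpdf(xi, centers, sigma) for xi in x]))
    return float(np.mean(estimates))

# As sigma -> 0 the components separate, and the entropy should approach
# log(m) + (n/2) * log(2*pi*e*sigma^2) -- a plausible match for the divergent
# leading contributions mentioned above.
m, n, sigma = 4, 2, 0.05
print(avg_mixture_entropy(m, n, sigma))
print(np.log(m) + 0.5 * n * np.log(2 * np.pi * np.e * sigma**2))
```

If the assumed setup matches the paper's, the gap between the two printed numbers is the part that the series expansion in the following sections is meant to capture.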
2.3. First Change in Variables
2.4. Diagonalization of the Covariance Matrix and Second Change in Variables
2.5. Pulling Leading-Order Terms Out of the Integral
2.6. Third Change in Variables: Simplification
3. Brute Force Expansion
- What helps in this exercise is that odd powers of the integration variables lead to vanishing integrals, by the symmetry of the Gaussian density.
- Furthermore, it is clear from the start that odd powers of σ will disappear in the end. This can be seen from the fact that a factor σ in T and S always occurs together with an isolated Gaussian variable; hence, any occurrence of an odd power of σ comes with an odd power of that variable, whose expectation vanishes.
- Due to the n-dimensional inner products that occur in S and T, which consist of n independent terms, each power of σ² is associated with a factor n. Consequently, the power series is effectively a series not in σ² but in nσ². For convergence, the product nσ² needs to be sufficiently small; fortunately, we are allowed to work in this regime, as explained in Section 2.2. (A short symbolic check of these Gaussian-moment facts follows this list.)
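The moment facts invoked in these bullets are standard and easy to verify symbolically. The sketch below uses sympy.stats (the variable name z and the loop bound are ours): odd moments of a standard Gaussian vanish, while the even moments are the double factorials (k-1)!!, which is the bookkeeping behind only even powers of σ surviving.

```python
import sympy as sp
from sympy.stats import Normal, E

z = Normal("z", 0, 1)  # a standard Gaussian integration variable

# Odd moments vanish by symmetry, so odd powers of sigma (which always come
# paired with an isolated Gaussian factor) drop out of the expansion.
# Even moments equal (k-1)!!: 1, 3, 15, 105, ...
for k in range(1, 9):
    print(k, E(z**k))
```

Each n-dimensional inner product sums n such i.i.d. contributions, which is where the factor n per power of σ² comes from.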
4. Determinant-Based Approach
5. Discussion
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A. Proof of Lemma 1
Appendix B. Proof of Lemma 2
Appendix C. Proof of Theorem 1
Appendix D. Proof of Lemma 3
Appendix E. Proof of Theorem 2
Appendix F. Proof of Lemma 4
Appendix G. Proof of Lemma 5
Appendix H. Proof of Lemma 6
Appendix I. Proof of Theorem 3
References