Gaussian Optimality for Derivatives of Differential Entropy Using Linear Matrix Inequalities
Abstract
1. Introduction
2. Main Results
- (1) The heat equation holds: $\frac{\partial}{\partial t} f(y,t) = \frac{1}{2}\,\frac{\partial^2}{\partial y^2} f(y,t)$, where $f(y,t)$ is the density of $Y_t = X + \sqrt{t}\,Z$; a numerical check of this condition is sketched after this list.
- (2) $\lim_{y \to \pm\infty} \frac{\partial^k}{\partial y^k} f(y,t) \cdot \log f(y,t) = 0$, for all $k \ge 1$, $t > 0$.
- (3) The expectation of the product of the ratios $\big(\frac{\partial^{k_i}}{\partial y^{k_i}} f\big)/f$ exists, and expectation and differentiation in $t$ can be interchanged, for $t > 0$.
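As a sanity check on condition (1), the short SymPy sketch below (our illustration, not part of the paper; the choice $X \sim \mathcal{N}(0,\sigma^2)$ and all variable names are assumptions) verifies symbolically that the Gaussian density of $Y_t = X + \sqrt{t}\,Z$ satisfies the heat equation.

```python
# Symbolic check that the density of Y_t = X + sqrt(t) Z, with
# X ~ N(0, sigma^2) and Z ~ N(0, 1) independent, satisfies the
# heat equation  df/dt = (1/2) d^2 f / dy^2.
import sympy as sp

y = sp.Symbol('y', real=True)
t, sigma = sp.symbols('t sigma', positive=True)

# Y_t is Gaussian with variance sigma^2 + t.
f = sp.exp(-y**2 / (2 * (sigma**2 + t))) / sp.sqrt(2 * sp.pi * (sigma**2 + t))

lhs = sp.diff(f, t)            # time derivative
rhs = sp.diff(f, y, 2) / 2     # half the second space derivative

print(sp.simplify(lhs - rhs))  # prints 0: the heat equation holds
```

The identity holds exactly because $Y_t$ is Gaussian with variance $\sigma^2 + t$, i.e., its density is the heat kernel evaluated at time $\sigma^2 + t$.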
Log-Concave Case
3. Linear Matrix Inequalities
3.1. Matrices from Multiple Representations
3.2. Matrices from Integration by Parts
- Firstly, since …, we separate the blocks of C accordingly: …. In the above, …, …, ….
- Secondly, each row of … corresponds to a symmetric matrix … such that …. In particular, for the first row of …, the matrix is ….
- Thirdly, for … and …, the equalities are …. Notice that … cannot be expressed in a quadratic form. Supposing that we can find a column vector z such that …, then …. The vector z actually lies in the null space of …, and it suffices to find a basis of that null space. One way is to do the decomposition: …. Hence, one takes z as the second column of Q, which is (after scaling, for conciseness) …. Then, one calculates …, and the corresponding matrix (scaled by a factor of two) is …. (A numerical sketch of this null-space step follows this list.)
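To make the null-space step concrete, the NumPy sketch below uses a placeholder singular symmetric matrix A (our example; not the matrix from the paper, which is not reproduced here). It decomposes A = QΛQᵀ, takes a zero-eigenvalue column of Q as z, rescales z for conciseness, and forms the corresponding rank-one matrix scaled by a factor of two.

```python
# Illustration of the null-space step on a hypothetical symmetric matrix A.
# The paper takes z as a specific column of Q; here we select any column
# whose eigenvalue is (numerically) zero.
import numpy as np

A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])       # symmetric and singular: null space spanned by (1, 1)

w, Q = np.linalg.eigh(A)           # eigendecomposition A = Q diag(w) Q^T
null_cols = np.where(np.isclose(w, 0.0))[0]
z = Q[:, null_cols[0]]             # a basis vector of the null space
z = z / np.abs(z).max()            # rescale for conciseness

M = 2 * np.outer(z, z)             # the corresponding matrix, scaled by two
print(z)                           # e.g. [1. 1.] up to sign
print(M)                           # [[2. 2.], [2. 2.]]
```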
3.3. Matrix from the Derivative
3.4. Matrices from Log-Concavity
3.5. Matrix for Gaussian Optimality
3.6. Fifth Derivative
4. Proof of Theorem 1
5. Discussion
5.1. On the Derivatives
5.2. Possible Proofs
5.3. Applications
6. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
Appendix A. Proof of Lemma 4