Relative Entropy Derivative Bounds
Abstract
We show that the derivative of the relative entropy with respect to its parameters is bounded above and below. We characterize the conditions under which this derivative can reach zero, and we use these results to explain when the minimum relative entropy and maximum log-likelihood approaches are valid. We show that these approaches arise naturally in the presence of large data sets and are inherent properties of any density estimation process involving large numbers of random variables.
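The link between minimum relative entropy and maximum log likelihood invoked above can be sketched numerically: minimizing the relative entropy from the empirical distribution to a parametric model recovers the maximum-likelihood estimate. The following is a minimal illustration (not from the paper) using a Bernoulli model, where the ML estimate is the empirical frequency; the variable names are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
# Data drawn from a Bernoulli(0.3) source
samples = rng.binomial(1, 0.3, size=10_000)
p_hat = samples.mean()  # empirical frequency = maximum-likelihood estimate


def kl_bernoulli(p, q):
    """Relative entropy D(Bern(p) || Bern(q)), with a small epsilon for stability."""
    eps = 1e-12
    return p * np.log((p + eps) / (q + eps)) + (1 - p) * np.log(
        (1 - p + eps) / (1 - q + eps)
    )


# Minimize the relative entropy over a grid of model parameters theta
thetas = np.linspace(0.01, 0.99, 9_801)
theta_star = thetas[np.argmin(kl_bernoulli(p_hat, thetas))]
# theta_star coincides (up to grid resolution) with p_hat:
# minimizing relative entropy to the empirical distribution
# is equivalent to maximizing the log likelihood.
```

As the sample size grows, `p_hat` concentrates around the true parameter, which is consistent with the paper's point that these estimation approaches become effective for large data sets.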
Zegers, P.; Fuentes, A.; Alarcón, C. Relative Entropy Derivative Bounds. Entropy 2013, 15, 2861-2873.