A Generalized Relative (α, β)-Entropy: Geometric Properties and Applications to Robust Statistical Inference
Abstract
1. Introduction
- We present a two-parameter extension of the relative α-entropy measure in (6), motivated by the logarithmic S-divergence (LSD) measures. These divergence measures are known to generate more robust statistical inference than the LDPD measures related to the relative α-entropy; a standard form of this one-parameter measure is recalled in the display following this list.
- In the new formulation of the relative (α, β)-entropy, the LSD measures are linked with several important information-theoretic divergences and entropy measures, such as those named after Renyi. A new divergence family is discovered, corresponding to a particular (properly standardized) case of the tuning parameters, for finite measures.
- As a by-product of our new formulation, we obtain a new two-parameter generalization of the Renyi entropy measure, which we refer to as the Generalized Renyi entropy (GRE). This opens up a new area of research examining the detailed properties of the GRE and its use in complex problems in statistical physics and information theory. In this paper, we show that the GRE satisfies the basic entropic characteristics: it is zero when the argument probability is degenerate and maximum when the probability is uniform.
- We provide a detailed geometric analysis of the robust LSD measures or, equivalently, of the relative (α, β)-entropy in our new formulation. In particular, we show their continuity or lower semi-continuity with respect to the first argument, depending on the values of the tuning parameters α and β. Lower semi-continuity with respect to the second argument is also proved.
- We also study the convexity of the LSD measures (or the relative (α, β)-entropies) with respect to their argument densities. The relative α-entropy, the one-parameter special case of our measure, is known to be quasi-convex [16] only in its first argument. Here we show that, for general α and β, the relative (α, β)-entropies are not quasi-convex on the space of densities, but they are always quasi-convex with respect to both arguments on a suitably (power) transformed space of densities. Such convexity results in the second argument were unavailable in the literature even for the relative α-entropy; we obtain them here through a transformation of the space of densities.
- Like the relative α-entropy, but unlike the relative entropy in (2), our new relative (α, β)-entropy does not satisfy the data processing inequalities. However, we prove an extended Pythagorean relation for the relative (α, β)-entropy, which makes it reasonable to treat these measures as “squared distances” and to talk about their projections.
- The forward projection of a relative entropy or a suitable divergence, i.e., its minimization with respect to the first argument, is very important in both statistical physics and information theory. It is equivalent to the maximum entropy principle and is also related to the Gibbs conditioning principle. In this paper, we examine the conditions under which such a forward projection of the relative (α, β)-entropy (or, equivalently, the LSD) exists and is unique.
- Finally, for completeness, we briefly present the application of the LSD measures, or the relative (α, β)-entropy measures, to robust statistical inference in the spirit of [78,79], but now with an extended range of tuning parameters. This uses the reverse projection principle; a result on the existence of the minimum LSD functional is first presented within the new formulation of this paper. Numerical illustrations are provided for the binomial model, where we additionally study the estimators’ properties over the extended tuning parameter range as well as for some of the new divergence families obtained here. Brief indications of the potential use of these divergences in testing statistical hypotheses are also provided.
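For ease of reference, we recall the one-parameter relative α-entropy of Sundaresan [17], whose projection theory is developed in [16]. The display below uses one commonly quoted normalization and is meant only as a reminder; the exact convention of Equation (6) governs the rest of the paper. For probability measures P and Q with densities p and q relative to a common dominating measure μ, and for α > 0, α ≠ 1,
$$
\mathcal{I}_{\alpha}(P,Q)=\frac{\alpha}{1-\alpha}\log\int p\,q^{\alpha-1}\,d\mu-\frac{1}{1-\alpha}\log\int p^{\alpha}\,d\mu+\log\int q^{\alpha}\,d\mu .
$$
This quantity is nonnegative, vanishes if and only if P = Q, and converges to the relative entropy (KLD) in (2) as α → 1; the relative (α, β)-entropy introduced in Section 2 extends it by a second tuning parameter β.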
2. The Relative (α, β)-Entropy Measure
2.1. Definition: An Extension of the Relative α-Entropy
The relative (α, β)-entropy can fail to be finitely defined in the following degenerate situations (a simple illustration is given after this list):
1. P is not absolutely continuous with respect to Q and the tuning parameters lie in the corresponding critical range, in which case the measure is not finitely defined.
2. P is mutually singular to Q and the tuning parameters lie in the complementary critical range, in which case the measure is again not finitely defined.
3. The remaining combination of these singularity and parameter conditions holds, in which case the measure is also undefined.
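As a purely illustrative orientation for these cases (not part of the formal definition): take Ω = {0, 1}, let P put mass 1/2 on each point, and let Q put all its mass on 0. Then P is not absolutely continuous with respect to Q, since P({1}) > 0 = Q({1}), and already the classical relative entropy degenerates here,
$$
D(P\|Q)=\sum_{x\in\Omega}P(x)\log\frac{P(x)}{Q(x)}=+\infty ,
$$
which is why the relative (α, β)-entropy requires the case analysis above. Mutual singularity is the extreme version of the same phenomenon, e.g., P concentrated on {1} and Q concentrated on {0}.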
2.2. Relations with Different Existing or New Entropies and Divergences
1. The GRE equals zero if P is degenerate at a point in Ω (no uncertainty).
2. The GRE attains its maximum value if P is uniform over Ω (maximum uncertainty); both properties are verified below for the classical Renyi entropy, the one-parameter special case of the GRE.
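Both properties can be checked directly for the classical Renyi entropy [46]: for a probability vector P = (p_1, …, p_k) on a finite Ω with |Ω| = k, and for α > 0, α ≠ 1,
$$
H_{\alpha}(P)=\frac{1}{1-\alpha}\log\sum_{i=1}^{k}p_{i}^{\alpha},
$$
so that H_α(P) = 0 when P is degenerate (the sum equals 1) and H_α(P) = log k, its maximum possible value, when P is uniform (the sum equals k^{1−α}). The GRE is shown to inherit exactly these two characteristics.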
3. Geometry of the Relative (α, β)-Entropy
3.1. Continuity
3.2. Convexity
3.3. Extended Pythagorean Relation
(i) Suppose the two relative (α, β)-entropies involved are finite. Then the back-transformation of the line segment joining the two (transformed) arguments does not intersect the degenerate set if and only if the corresponding extended Pythagorean relation holds; the classical Kullback-Leibler version of this relation is recalled below for orientation.
(ii) Suppose the relative (α, β)-entropy is finite for some fixed second argument. Then the back-transformation of the line segment joining the (transformed) arguments does not intersect the degenerate set if and only if the corresponding extended Pythagorean inequality holds.
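For orientation, we recall the classical Pythagorean inequality for the relative entropy [18], of which the above is the relative (α, β)-entropy analogue on the transformed space: if E is a convex set of probability measures and P* is the I-projection of Q onto E, i.e., the minimizer of D(P‖Q) over P ∈ E, then
$$
D(P\|Q)\;\geq\;D(P\|P^{*})+D(P^{*}\|Q)\qquad\text{for all }P\in E .
$$
It is relations of this type that justify treating the relative (α, β)-entropy as a “squared distance” and speaking of its projections, as noted in the Introduction.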
4. The Forward Projection of the Relative (α, β)-Entropy
5. Statistical Applications: The Minimum Relative (α, β)-Entropy Inference
5.1. The Reverse Projection and Parametric Estimation
(1) Suppose the true distribution itself belongs to the model family. Then the unique MRE functional is given by the corresponding model element.
(2) Suppose the true distribution lies outside the model family. If the (suitably transformed) model family is convex and closed, and the relative (α, β)-entropy from the true distribution is finite for some member of the family, then the MRE functional exists and is unique. A small computational sketch of this reverse projection idea follows this list.
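The following is a minimal computational sketch of the reverse projection (minimum divergence estimation) idea, anticipating the binomial illustration of Section 5.2. It is not the authors’ code: for concreteness it uses the one-parameter relative α-entropy in the form recalled after the Introduction (the paper’s two-parameter (α, β) version and its exact normalization are not reproduced), the helper names are our own, and all numerical values are purely illustrative.

```python
# Minimal sketch of minimum relative alpha-entropy estimation ("reverse projection")
# for a Binomial(m, theta) model.  Illustrative only: the paper's two-parameter
# relative (alpha, beta)-entropy and its exact normalization are NOT reproduced here.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import binom


def relative_alpha_entropy(p, q, alpha):
    """One-parameter relative alpha-entropy I_alpha(p, q) of discrete pmfs p and q
    (alpha > 0, alpha != 1); it tends to the Kullback-Leibler divergence as alpha -> 1."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    cross = np.log(np.sum(p * q ** (alpha - 1.0)))
    self_p = np.log(np.sum(p ** alpha))
    self_q = np.log(np.sum(q ** alpha))
    return alpha / (1.0 - alpha) * cross - self_p / (1.0 - alpha) + self_q


def mree_binomial(sample, m, alpha):
    """Minimum relative alpha-entropy estimate of theta: the reverse projection of the
    empirical pmf of `sample` onto the Binomial(m, theta) model family."""
    counts = np.bincount(sample, minlength=m + 1)
    p_emp = counts / counts.sum()              # empirical relative frequencies
    support = np.arange(m + 1)
    objective = lambda theta: relative_alpha_entropy(
        p_emp, binom.pmf(support, m, theta), alpha)
    return minimize_scalar(objective, bounds=(1e-6, 1 - 1e-6), method="bounded").x


if __name__ == "__main__":
    rng = np.random.default_rng(12345)
    pure = rng.binomial(10, 0.3, size=50)      # illustrative pure sample, Binomial(10, 0.3)
    for a in (0.999, 0.7, 1.5):                # alpha near 1 approximates the KLD / MLE case
        print(f"alpha = {a:5.3f}:  MREE = {mree_binomial(pure, 10, a):.4f}")
```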
5.2. Numerical Illustration: Binomial Model
- Under pure data with no contamination, the maximum likelihood estimator (the MREE at the tuning parameter combination for which the divergence reduces to the KLD) has the least bias and MSE, as expected, and both decrease further as the sample size increases.
- As we move away from this combination in either direction, the MSEs of the corresponding MREEs under pure data increase slightly; but as long as the tuning parameters remain within a reasonable window of this point and neither component is very close to zero, the loss in efficiency is not very significant.
- When α or β approaches zero, the MREEs become somewhat unstable, generating comparatively larger MSE values. This is probably due to the presence of inliers under the discrete binomial model. Note that, for certain ranges of the tuning parameters, the relative (α, β)-entropy measures are not finitely defined for the binomial model if even a single empty cell is present in the data.
- Under contamination, the bias and MSE of the maximum likelihood estimator increase significantly, but many MREEs remain stable. In particular, the MREEs with tuning parameters close to zero, along with a few other members, are non-robust against data contamination; most of the remaining members of the MREE family provide significantly improved robust estimators.
- Over the entire simulation, one particular combination of the tuning parameters appears to provide the most stable results. In Table 4, the best results lie along a tubular region running from the top left-hand corner to the bottom right-hand corner of the table, provided neither tuning parameter is very close to zero.
- Based on our numerical experiments, the most robust minimum relative (α, β)-entropy estimators are obtained over moderate ranges of both tuning parameters. Note that this range includes the estimators based on the logarithmic power divergence measure as well as some of the new LSD measures.
- Many of the MREEs that belong to the optimum range mentioned in the last item, and that are close to the most stable combination noted above, generally also provide the best trade-off between efficiency under pure data and robustness under contaminated data. An illustrative pure-versus-contaminated computation is sketched after this list.
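Continuing the sketch given after Section 5.1 (same caveats: one-parameter special case, our own helper names, illustrative values, and not the authors’ contamination scheme or parameter grid), the pure-versus-contaminated comparison described in the bullets above can be reproduced qualitatively as follows.

```python
# Illustrative contamination experiment, reusing mree_binomial() from the sketch after
# Section 5.1.  The paper's actual contamination scheme, sample sizes and (alpha, beta)
# grid (Tables 1-4) are not assumed here; all settings below are hypothetical.
import numpy as np

rng = np.random.default_rng(2021)
eps, n, m, theta = 0.10, 50, 10, 0.3                   # hypothetical simulation settings
n_out = int(round(eps * n))
clean = rng.binomial(m, theta, size=n - n_out)         # pure part of the sample
outliers = rng.binomial(m, 0.9, size=n_out)            # a grossly mis-specified component
contaminated = np.concatenate([clean, outliers])

for a in (0.999, 0.7, 1.5):
    est_pure = mree_binomial(clean, m, a)
    est_cont = mree_binomial(contaminated, m, a)
    print(f"alpha = {a:5.3f}:  pure = {est_pure:.4f},  contaminated = {est_cont:.4f}")
```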
5.3. Application to Testing Statistical Hypothesis
6. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Abbreviations
KLD | Kullback-Leibler Divergence |
LDPD | Logarithmic Density Power Divergence |
LSD | Logarithmic Super Divergence |
GRE | Generalized Renyi Entropy |
MRE | Minimum Relative (α, β)-Entropy |
MREE | Minimum Relative (α, β)-Entropy Estimator |
MSE | Mean Squared Error |
References
- Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
- Shannon, C.E. Communication in the presence of noise. Proc. IRE 1949, 37, 10–21. [Google Scholar] [CrossRef]
- Shannon, C.E.; Weaver, W. The Mathematical Theory of Communication; University of Illinois Press: Urbana, IL, USA, 1949. [Google Scholar]
- Khinchin, A.I. The entropy concept in probability theory. Uspekhi Matematicheskikh Nauk 1953, 8, 3–20. [Google Scholar]
- Khinchin, A.I. On the fundamental theorems of information theory. Uspekhi Matematicheskikh Nauk 1956, 11, 17–75. [Google Scholar]
- Khinchin, A.I. Mathematical Foundations of Information Theory; Dover Publications: New York, NY, USA, 1957. [Google Scholar]
- Kolmogorov, A.N. Foundations of the Theory of Probability; Chelsea Publishing Co.: New York, NY, USA, 1950. [Google Scholar]
- Kolmogorov, A.N. On the Shannon theory of information transmission in the case of continuous signals. IRE Trans. Inf. Theory 1956, IT-2, 102–108. [Google Scholar] [CrossRef]
- Kullback, S. An application of information theory to multivariate analysis. Ann. Math. Stat. 1952, 23, 88–102. [Google Scholar] [CrossRef]
- Kullback, S. A note on information theory. J. Appl. Phys. 1953, 24, 106–107. [Google Scholar] [CrossRef]
- Kullback, S. Certain inequalities in information theory and the Cramer-Rao inequality. Ann. Math. Stat. 1954, 25, 745–751. [Google Scholar] [CrossRef]
- Kullback, S. An application of information theory to multivariate analysis II. Ann. Math. Stat. 1956, 27, 122–145. [Google Scholar] [CrossRef]
- Kullback, S.; Leibler, R.A. On information and sufficiency. Ann. Math. Stat. 1951, 22, 79–86. [Google Scholar] [CrossRef]
- Rosenkrantz, R.D. E T Jaynes: Papers on Probability, Statistics and Statistical Physics; Springer Science and Business Media: New York, NY, USA, 1983. [Google Scholar]
- Van Campenhout, J.M.; Cover, T.M. Maximum entropy and conditional probability. IEEE Trans. Inf. Theory 1981, 27, 483–489. [Google Scholar] [CrossRef]
- Kumar, M.A.; Sundaresan, R. Minimization Problems Based on Relative α-Entropy I: Forward Projection. IEEE Trans. Inf. Theory 2015, 61, 5063–5080. [Google Scholar] [CrossRef]
- Sundaresan, R. Guessing under source uncertainty. IEEE Trans. Inf. Theory 2007, 53, 269–287. [Google Scholar] [CrossRef]
- Csiszár, I. I-divergence geometry of probability distributions and minimization problems. Ann. Probab. 1975, 3, 146–158. [Google Scholar] [CrossRef]
- Csiszár, I. Sanov property, generalized I -projection, and a conditional limit theorem. Ann. Probab. 1984, 12, 768–793. [Google Scholar] [CrossRef]
- Csiszár, I.; Shields, P. Information Theory and Statistics: A Tutorial; NOW Publishers: Hanover, NH, USA, 2004. [Google Scholar]
- Csiszár, I.; Tusnady, G. Information geometry and alternating minimization procedures. Stat. Decis. 1984, 1, 205–237. [Google Scholar]
- Amari, S.I.; Karakida, R.; Oizumi, M. Information Geometry Connecting Wasserstein Distance and Kullback-Leibler Divergence via the Entropy-Relaxed Transportation Problem. arXiv, 2017; arXiv:1709.10219. [Google Scholar]
- Costa, S.I.; Santos, S.A.; Strapasson, J.E. Fisher information distance: A geometrical reading. Discret. Appl. Math. 2015, 197, 59–69. [Google Scholar] [CrossRef]
- Nielsen, F.; Sun, K. Guaranteed bounds on the Kullback-Leibler divergence of univariate mixtures. IEEE Signal Process. Lett. 2016, 23, 1543–1546. [Google Scholar] [CrossRef]
- Amari, S.I.; Cichocki, A. Information geometry of divergence functions. Bull. Pol. Acad. Sci. Tech. Sci. 2010, 58, 183–195. [Google Scholar] [CrossRef]
- Contreras-Reyes, J.E.; Arellano-Valle, R.B. Kullback-Leibler divergence measure for multivariate skew-normal distributions. Entropy 2012, 14, 1606–1626. [Google Scholar] [CrossRef]
- Nielsen, F.; Boltz, S. The Burbea-Rao and Bhattacharyya Centroids. IEEE Trans. Inf. Theory 2011, 57, 5455–5466. [Google Scholar] [CrossRef]
- Pinski, F.J.; Simpson, G.; Stuart, A.M.; Weber, H. Kullback–Leibler approximation for probability measures on infinite dimensional spaces. SIAM J. Math. Anal. 2015, 47, 4091–4122. [Google Scholar] [CrossRef]
- Attouch, H.; Bolte, J.; Redont, P.; Soubeyran, A. Proximal alternating minimization and projection methods for nonconvex problems: An approach based on the Kurdyka-Lojasiewicz inequality. Math. Oper. Res. 2010, 35, 438–457. [Google Scholar] [CrossRef] [Green Version]
- Eliazar, I.; Sokolov, I.M. Maximization of statistical heterogeneity: From Shannon’s entropy to Gini’s index. Phys. A Stat. Mech. Appl. 2010, 389, 3023–3038. [Google Scholar] [CrossRef]
- Monthus, C. Non-equilibrium steady states: Maximization of the Shannon entropy associated with the distribution of dynamical trajectories in the presence of constraints. J. Stat. Mech. Theory Exp. 2011, 2011, P03008. [Google Scholar] [CrossRef]
- Bafroui, H.H.; Ohadi, A. Application of wavelet energy and Shannon entropy for feature extraction in gearbox fault detection under varying speed conditions. Neurocomputing 2014, 133, 437–445. [Google Scholar] [CrossRef]
- Batty, M. Space, Scale, and Scaling in Entropy Maximizing. Geogr. Anal. 2010, 42, 395–421. [Google Scholar] [CrossRef]
- Oikonomou, T.; Bagci, G.B. Entropy Maximization with Linear Constraints: The Uniqueness of the Shannon Entropy. arXiv, 2018; arXiv:1803.02556. [Google Scholar]
- Hoang, D.T.; Song, J.; Periwal, V.; Jo, J. Maximizing weighted Shannon entropy for network inference with little data. arXiv, 2017; arXiv:1705.06384. [Google Scholar]
- Sriraman, T.; Chakrabarti, B.; Trombettoni, A.; Muruganandam, P. Characteristic features of the Shannon information entropy of dipolar Bose-Einstein condensates. J. Chem. Phys. 2017, 147, 044304. [Google Scholar] [CrossRef] [PubMed]
- Sun, M.; Li, Y.; Gemmeke, J.F.; Zhang, X. Speech enhancement under low SNR conditions via noise estimation using sparse and low-rank NMF with Kullback-Leibler divergence. IEEE Trans. Audio Speech Lang. Process. 2015, 23, 1233–1242. [Google Scholar] [CrossRef]
- Garcia-Fernandez, A.F.; Vo, B.N. Derivation of the PHD and CPHD Filters Based on Direct Kullback-Leibler Divergence Minimization. IEEE Trans. Signal Process. 2015, 63, 5812–5820. [Google Scholar] [CrossRef]
- Giantomassi, A.; Ferracuti, F.; Iarlori, S.; Ippoliti, G.; Longhi, S. Electric motor fault detection and diagnosis by kernel density estimation and Kullback-Leibler divergence based on stator current measurements. IEEE Trans. Ind. Electron. 2015, 62, 1770–1780. [Google Scholar] [CrossRef]
- Harmouche, J.; Delpha, C.; Diallo, D.; Le Bihan, Y. Statistical approach for nondestructive incipient crack detection and characterization using Kullback-Leibler divergence. IEEE Trans. Reliab. 2016, 65, 1360–1368. [Google Scholar] [CrossRef]
- Hua, X.; Cheng, Y.; Wang, H.; Qin, Y.; Li, Y.; Zhang, W. Matrix CFAR detectors based on symmetrized Kullback-Leibler and total Kullback-Leibler divergences. Digit. Signal Process. 2017, 69, 106–116. [Google Scholar] [CrossRef]
- Ferracuti, F.; Giantomassi, A.; Iarlori, S.; Ippoliti, G.; Longhi, S. Electric motor defects diagnosis based on kernel density estimation and Kullback-Leibler divergence in quality control scenario. Eng. Appl. Artif. Intell. 2015, 44, 25–32. [Google Scholar] [CrossRef]
- Matthews, A.G.D.G.; Hensman, J.; Turner, R.; Ghahramani, Z. On sparse variational methods and the Kullback-Leibler divergence between stochastic processes. J. Mach. Learn. Res. 2016, 51, 231–239. [Google Scholar]
- Arikan, E. An inequality on guessing and its application to sequential decoding. IEEE Trans. Inf. Theory 1996, 42, 99–105. [Google Scholar] [CrossRef] [Green Version]
- Campbell, L.L. A coding theorem and Renyi’s entropy. Inf. Control 1965, 8, 423–429. [Google Scholar] [CrossRef]
- Renyi, A. On measures of entropy and information. In Proceedings of 4th Berkeley Symposium on Mathematical Statistics and Probability I; University of California: Berkeley, CA, USA, 1961; pp. 547–561. [Google Scholar]
- Wei, B.B. Relations between heat exchange and Rényi divergences. Phys. Rev. E 2018, 97, 042107. [Google Scholar] [CrossRef]
- Kumar, M.A.; Sason, I. On projections of the Rényi divergence on generalized convex sets. In Proceedings of the 2016 IEEE International Symposium on Information Theory (ISIT), Barcelona, Spain, 10–15 July 2016. [Google Scholar]
- Sadeghpour, M.; Baratpour, S.; Habibirad, A. Exponentiality test based on Renyi distance between equilibrium distributions. Commun. Stat.-Simul. Comput. 2017. [Google Scholar] [CrossRef]
- Markel, D.; El Naqa, I.I. PD-0351: Development of a novel regmentation framework using the Jensen Renyi divergence for adaptive radiotherapy. Radiother. Oncol. 2014, 111, S134. [Google Scholar] [CrossRef]
- Bai, S.; Lepoint, T.; Roux-Langlois, A.; Sakzad, A.; Stehlé, D.; Steinfeld, R. Improved security proofs in lattice-based cryptography: Using the Rényi divergence rather than the statistical distance. J. Cryptol. 2018, 31, 610–640. [Google Scholar] [CrossRef]
- Dong, X. The gravity dual of Rényi entropy. Nat. Commun. 2016, 7, 12472. [Google Scholar] [CrossRef] [PubMed]
- Kusuki, Y.; Takayanagi, T. Renyi entropy for local quenches in 2D CFT from numerical conformal blocks. J. High Energy Phys. 2018, 2018, 115. [Google Scholar] [CrossRef]
- Kumbhakar, M.; Ghoshal, K. One-Dimensional velocity distribution in open channels using Renyi entropy. Stoch. Environ. Res. Risk Assess. 2017, 31, 949–959. [Google Scholar] [CrossRef]
- Xing, H.J.; Wang, X.Z. Selective ensemble of SVDDs with Renyi entropy based diversity measure. Pattern Recog. 2017, 61, 185–196. [Google Scholar] [CrossRef]
- Nie, F.; Zhang, P.; Li, J.; Tu, T. An Image Segmentation Method Based on Renyi Relative Entropy and Gaussian Distribution. Recent Patents Comput. Sci. 2017, 10, 122–130. [Google Scholar] [CrossRef]
- Ben Bassat, M. f-entropies, probability of error, and feature selection. Inf. Control 1978, 39, 277–292. [Google Scholar] [CrossRef]
- Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys 1988, 52, 479–487. [Google Scholar] [CrossRef]
- Kumar, S.; Ram, G.; Gupta, V. Axioms for (α, β, γ)-entropy of a generalized probability scheme. J. Appl. Math. Stat. Inf. 2013, 9, 95–106. [Google Scholar] [CrossRef]
- Kumar, S.; Ram, G. A generalization of the Havrda-Charvat and Tsallis entropy and its axiomatic characterization. Abstr. Appl. Anal. 2014, 2014, 505184. [Google Scholar] [CrossRef]
- Tsallis, C.; Brigatti, E. Nonextensive statistical mechanics: A brief introduction. Contin. Mech. Thermodyn. 2004, 16, 223–235. [Google Scholar] [CrossRef]
- Rajesh, G.; Sunoj, S.M. Some properties of cumulative Tsallis entropy of order α. Stat. Pap. 2016. [Google Scholar] [CrossRef]
- Singh, V.P. Introduction to Tsallis Entropy Theory in Water Engineering; CRC Press: Boca Raton, FL, USA, 2016. [Google Scholar]
- Pavlos, G.P.; Karakatsanis, L.P.; Iliopoulos, A.C.; Pavlos, E.G.; Tsonis, A.A. Nonextensive Statistical Mechanics: Overview of Theory and Applications in Seismogenesis, Climate, and Space Plasma. In Advances in Nonlinear Geosciences; Tsonis, A., Ed.; Springer: Cham, Switzerland, 2018; pp. 465–495. [Google Scholar]
- Jamaati, M.; Mehri, A. Text mining by Tsallis entropy. Phys. A Stat. Mech. Appl. 2018, 490, 1368–1376. [Google Scholar] [CrossRef]
- Basu, A.; Shioya, H.; Park, C. Statistical Inference: The Minimum Distance Approach; Chapman & Hall/CRC: Boca Raton, FL, USA, 2011. [Google Scholar]
- Liese, F.; Vajda, I. On divergence and information in statistics and information theory. IEEE Trans. Inf. Theory 2006, 52, 4394–4412. [Google Scholar] [CrossRef]
- Pardo, L. Statistical Inference Based on Divergences; CRC/Chapman-Hall: London, UK, 2006. [Google Scholar]
- Vajda, I. Theory of Statistical Inference and Information; Kluwer: Boston, MA, USA, 1989. [Google Scholar]
- Stummer, W.; Vajda, I. On divergences of finite measures and their applicability in statistics and information theory. Statistics 2010, 44, 169–187. [Google Scholar] [CrossRef]
- Sundaresan, R. A measure of discrimination and its geometric properties. In Proceedings of the IEEE International Symposium on Information Theory, Lausanne, Switzerland, 30 June–5 July 2002. [Google Scholar]
- Lutwak, E.; Yang, D.; Zhang, G. Cramér-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information. IEEE Trans. Inf. Theory 2005, 51, 473–478. [Google Scholar] [CrossRef]
- Kumar, M.A.; Sundaresan, R. Minimization Problems Based on Relative α-Entropy II: Reverse Projection. IEEE Trans. Inf. Theory 2015, 61, 5081–5095. [Google Scholar] [CrossRef]
- Jones, M.C.; Hjort, N.L.; Harris, I.R.; Basu, A. A comparison of related density-based minimum divergence estimators. Biometrika 2001, 88, 865–873. [Google Scholar] [CrossRef]
- Windham, M. Robustifying model fitting. J. R. Stat. Soc. Ser. B 1995, 57, 599–609. [Google Scholar]
- Fujisawa, H. Normalized estimating equation for robust parameter estimation. Elect. J. Stat. 2013, 7, 1587–1606. [Google Scholar] [CrossRef]
- Fujisawa, H.; Eguchi, S. Robust parameter estimation with a small bias against heavy contamination. J. Multivar. Anal. 2008, 99, 2053–2081. [Google Scholar] [CrossRef]
- Maji, A.; Ghosh, A.; Basu, A. The Logarithmic Super Divergence and its use in Statistical Inference. arXiv, 2014; arXiv:1407.3961. [Google Scholar]
- Maji, A.; Ghosh, A.; Basu, A. The Logarithmic Super Divergence and Asymptotic Inference Properties. AStA Adv. Stat. Anal. 2016, 100, 99–131. [Google Scholar] [CrossRef]
- Maji, A.; Chakraborty, S.; Basu, A. Statistical Inference Based on the Logarithmic Power Divergence. Rashi 2017, 2, 39–51. [Google Scholar]
- Lutz, E. Anomalous diffusion and Tsallis statistics in an optical lattice. Phys. Rev. A 2003, 67, 051402. [Google Scholar] [CrossRef]
- Douglas, P.; Bergamini, S.; Renzoni, F. Tunable Tsallis Distributions in Dissipative Optical Lattices. Phys. Rev. Lett. 2006, 96, 110601. [Google Scholar] [CrossRef] [PubMed]
- Burlaga, L.F.; Viñas, A.F. Triangle for the entropic index q of non-extensive statistical mechanics observed by Voyager 1 in the distant heliosphere. Phys. A Stat. Mech. Appl. 2005, 356, 375. [Google Scholar] [CrossRef]
- Liu, B.; Goree, J. Superdiffusion and Non-Gaussian Statistics in a Driven-Dissipative 2D Dusty Plasma. Phys. Rev. Lett. 2008, 100, 055003. [Google Scholar] [CrossRef] [PubMed]
- Pickup, R.; Cywinski, R.; Pappas, C.; Farago, B.; Fouquet, P. Generalized Spin-Glass Relaxation. Phys. Rev. Lett. 2009, 102, 097202. [Google Scholar] [CrossRef] [PubMed]
- Devoe, R. Power-Law Distributions for a Trapped Ion Interacting with a Classical Buffer Gas. Phys. Rev. Lett. 2009, 102, 063001. [Google Scholar] [CrossRef] [PubMed]
- Khachatryan, V.; Sirunyan, A.; Tumasyan, A.; Adam, W.; Bergauer, T.; Dragicevic, M.; Erö, J.; Fabjan, C.; Friedl, M.; Frühwirth, R.; et al. Transverse-Momentum and Pseudorapidity Distributions of Charged Hadrons in pp Collisions at √s = 7 TeV. Phys. Rev. Lett. 2010, 105, 022002. [Google Scholar] [CrossRef] [PubMed]
- Chatrchyan, S.; Khachatryan, V.; Sirunyan, A.M.; Tumasyan, A.; Adam, W.; Bergauer, T.; Dragicevic, M.; Erö, J.; Fabjan, C.; Friedl, M.; et al. Charged particle transverse momentum spectra in pp collisions at √s = 0.9 and 7 TeV. J. High Energy Phys. 2011, 2011, 86. [Google Scholar] [CrossRef]
- Adare, A.; Afanasiev, S.; Aidala, C.; Ajitanand, N.; Akiba, Y.; Al-Bataineh, H.; Alexander, J.; Aoki, K.; Aphecetche, L.; Armendariz, R.; et al. Measurement of neutral mesons in p + p collisions at √s = 200 GeV and scaling properties of hadron production. Phys. Rev. D 2011, 83, 052004. [Google Scholar] [CrossRef]
- Majhi, A. Non-extensive statistical mechanics and black hole entropy from quantum geometry. Phys. Lett. B 2017, 775, 32–36. [Google Scholar] [CrossRef]
- Shore, J.E.; Johnson, R.W. Axiomatic Derivation of the Principle of Maximum Entropy and the Principle of Minimum Cross-Entropy. IEEE Trans. Inf. Theory 1980, 26, 26–37. [Google Scholar] [CrossRef]
- Caticha, A.; Giffin, A. Updating Probabilities. AIP Conf. Proc. 2006, 872, 31–42. [Google Scholar]
- Presse, S.; Ghosh, K.; Lee, J.; Dill, K.A. Nonadditive Entropies Yield Probability Distributions with Biases not Warranted by the Data. Phys. Rev. Lett. 2013, 111, 180604. [Google Scholar] [CrossRef] [PubMed]
- Presse, S. Nonadditive entropy maximization is inconsistent with Bayesian updating. Phys. Rev. E 2014, 90, 052149. [Google Scholar] [CrossRef] [PubMed]
- Presse, S.; Ghosh, K.; Lee, J.; Dill, K.A. Reply to C. Tsallis’ “Conceptual Inadequacy of the Shore and Johnson Axioms for Wide Classes of Complex Systems”. Entropy 2015, 17, 5043–5046. [Google Scholar] [CrossRef] [Green Version]
- Vanslette, K. Entropic Updating of Probabilities and Density Matrices. Entropy 2017, 19, 664. [Google Scholar] [CrossRef]
- Cressie, N.; Read, T.R.C. Multinomial goodness-of-fit tests. J. R. Stat. Soc. B 1984, 46, 440–464. [Google Scholar]
- Csiszár, I. Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten. Publ. Math. Inst. Hung. Acad. Sci. 1963, 3, 85–107. (In German) [Google Scholar]
- Csiszár, I. Information-type measures of difference of probability distributions and indirect observations. Stud. Scientiarum Math. Hung. 1967, 2, 299–318. [Google Scholar]
- Csiszár, I. On topological properties of f-divergences. Stud. Scientiarum Math. Hung. 1967, 2, 329–339. [Google Scholar]
- Csiszár, I. A class of measures of informativity of observation channels. Periodica Math. Hung. 1972, 2, 191–213. [Google Scholar] [CrossRef]
- Csiszár, I. Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems. Ann. Stat. 1991, 19, 2032–2066. [Google Scholar] [CrossRef]
- Lindsay, B.G. Efficiency versus robustness: The case for minimum Hellinger distance and related methods. Ann. Stat. 1994, 22, 1081–1114. [Google Scholar] [CrossRef]
- Esteban, M.D.; Morales, D. A summary of entropy statistics. Kybernetica 1995, 31, 337–346. [Google Scholar]
- Itakura, F.; Saito, S. Analysis synthesis telephony based on the maximum likelihood method. In Proceedings of the 6th International Congress on Acoustics, Tokyo, Japan, 21–28 August 1968. [Google Scholar]
- Fevotte, C.; Bertin, N.; Durrieu, J.L. Nonnegative Matrix Factorization with the Itakura–Saito Divergence: With application to music analysis. Neural Comput. 2009, 21, 793–830. [Google Scholar] [CrossRef] [PubMed]
- Teboulle, M.; Vajda, I. Convergence of best ϕ-entropy estimates. IEEE Trans. Inf. Theory 1993, 39, 297–301. [Google Scholar] [CrossRef]
- Basu, A.; Lindsay, B.G. Minimum disparity estimation for continuous models: Efficiency, distributions and robustness. Ann. Inst. Stat. Math. 1994, 46, 683–705. [Google Scholar] [CrossRef]
- Broniatowski, M.; Keziou, A. Parametric estimation and tests through divergences and the duality technique. J. Multivar. Anal. 2009, 100, 16–36. [Google Scholar] [CrossRef]
- Broniatowski, M.; Vajda, I. Several applications of divergence criteria in continuous families. Kybernetika 2012, 48, 600–636. [Google Scholar]
- Toma, A. Optimal robust M-estimators using divergences. Stat. Probab. Lett. 2009, 79, 1–5. [Google Scholar] [CrossRef]
- Marazzi, A.; Yohai, V. Optimal robust estimates using the Hellinger distance. Adv. Data Anal. Classif. 2010, 4, 169–179. [Google Scholar] [CrossRef]
- Toma, A.; Leoni-Aubin, S. Optimal robust M-estimators using Renyi pseudodistances. J. Multivar. Anal. 2010, 115, 359–373. [Google Scholar] [CrossRef]
 | 0.3 | 0.5 | 0.7 | 0.9 | 1 | 1.1 | 1.3 | 1.5 | 1.7 | 2 |
---|---|---|---|---|---|---|---|---|---|---|
0.1 | −0.210 | −0.416 | −0.397 | −0.311 | −0.277 | −0.227 | −0.130 | 0.021 | 0.024 | 0.122 |
0.3 | 2.218 | −0.273 | −0.229 | −0.160 | −0.141 | −0.115 | −0.096 | −0.068 | −0.036 | 0.034 |
0.5 | −0.127 | 0.001 | −0.125 | −0.088 | −0.082 | −0.069 | −0.058 | −0.042 | −0.032 | −0.019 |
0.7 | −0.093 | −0.110 | −0.010 | −0.046 | −0.044 | −0.029 | −0.023 | −0.031 | −0.023 | −0.020 |
0.9 | −0.066 | −0.056 | −0.028 | −0.001 | −0.015 | −0.002 | 0.008 | 0.000 | −0.006 | −0.013 |
1 | −0.041 | −0.045 | −0.017 | 0.005 | −0.002 | 0.011 | 0.014 | 0.012 | 0.008 | −0.003 |
1.3 | −0.035 | −0.013 | 0.023 | 0.036 | 0.030 | 0.039 | 0.088 | 0.039 | 0.035 | 0.021 |
1.5 | −0.003 | 0.012 | 0.048 | 0.053 | 0.047 | 0.058 | 0.053 | 0.170 | 0.048 | 0.035 |
1.7 | 0.012 | 0.028 | 0.058 | 0.067 | 0.061 | 0.070 | 0.070 | 0.058 | 0.269 | 0.045 |
2 | 0.008 | 0.049 | 0.078 | 0.084 | 0.078 | 0.086 | 0.087 | 0.078 | 0.069 | 0.444 |
0.1 | −0.085 | −0.301 | −0.254 | −0.183 | −0.156 | −0.106 | −0.002 | 0.114 | 0.292 | 0.245 |
0.3 | 1.829 | −0.176 | −0.150 | −0.078 | −0.066 | −0.042 | −0.045 | −0.014 | 0.005 | 0.030 |
0.5 | −0.056 | 0.099 | −0.054 | −0.037 | −0.033 | −0.026 | −0.019 | −0.009 | −0.007 | −0.005 |
0.7 | −0.009 | −0.059 | 0.035 | −0.012 | −0.013 | −0.005 | −0.002 | −0.009 | −0.002 | 0.006 |
0.9 | −0.031 | −0.031 | −0.009 | 0.012 | 0.002 | 0.013 | 0.021 | 0.015 | 0.008 | 0.004 |
1 | 0.014 | −0.023 | 0.000 | 0.011 | 0.009 | 0.019 | 0.022 | 0.020 | 0.018 | 0.004 |
1.3 | 0.002 | −0.004 | 0.022 | 0.034 | 0.027 | 0.030 | 0.084 | 0.034 | 0.035 | 0.028 |
1.5 | 0.009 | 0.023 | 0.038 | 0.044 | 0.037 | 0.042 | 0.034 | 0.174 | 0.040 | 0.032 |
1.7 | 0.028 | 0.029 | 0.049 | 0.054 | 0.047 | 0.050 | 0.047 | 0.036 | 0.277 | 0.039 |
2 | 0.040 | 0.051 | 0.065 | 0.068 | 0.059 | 0.063 | 0.060 | 0.051 | 0.041 | 0.464 |
0.1 | −0.028 | −0.216 | −0.175 | −0.113 | −0.103 | −0.063 | 0.036 | 0.169 | 0.452 | 0.349 |
0.3 | 1.874 | −0.135 | −0.125 | −0.052 | −0.044 | −0.022 | −0.038 | −0.023 | 0.009 | 0.024 |
0.5 | −0.002 | 0.146 | −0.034 | −0.026 | −0.025 | −0.021 | −0.019 | −0.001 | −0.008 | −0.009 |
0.7 | 0.000 | −0.042 | 0.045 | −0.009 | −0.013 | −0.009 | 0.000 | −0.009 | −0.008 | −0.001 |
0.9 | 0.007 | −0.025 | −0.015 | 0.001 | −0.004 | 0.005 | 0.009 | 0.013 | −0.001 | −0.003 |
1 | 0.014 | −0.010 | −0.007 | −0.001 | −0.001 | 0.005 | 0.009 | 0.014 | 0.010 | 0.009 |
1.3 | 0.036 | 0.010 | 0.006 | 0.015 | 0.010 | 0.010 | 0.065 | 0.012 | 0.019 | 0.014 |
1.5 | 0.041 | 0.023 | 0.018 | 0.022 | 0.017 | 0.018 | 0.006 | 0.158 | 0.016 | 0.015 |
1.7 | 0.052 | 0.027 | 0.028 | 0.032 | 0.024 | 0.025 | 0.016 | 0.009 | 0.267 | 0.019 |
2 | 0.056 | 0.043 | 0.042 | 0.043 | 0.033 | 0.034 | 0.023 | 0.020 | 0.013 | 0.454 |
 | 0.3 | 0.5 | 0.7 | 0.9 | 1 | 1.1 | 1.3 | 1.5 | 1.7 | 2 |
---|---|---|---|---|---|---|---|---|---|---|
0.1 | 0.347 | 0.251 | 0.222 | 0.145 | 0.122 | 0.106 | 0.098 | 0.242 | 0.206 | 0.240 |
0.3 | 7.506 | 0.147 | 0.100 | 0.069 | 0.063 | 0.059 | 0.059 | 0.062 | 0.098 | 0.169 |
0.5 | 0.238 | 0.076 | 0.067 | 0.051 | 0.049 | 0.047 | 0.050 | 0.055 | 0.064 | 0.101 |
0.7 | 0.177 | 0.091 | 0.056 | 0.045 | 0.044 | 0.043 | 0.045 | 0.055 | 0.056 | 0.071 |
0.9 | 0.163 | 0.085 | 0.061 | 0.045 | 0.042 | 0.043 | 0.047 | 0.053 | 0.058 | 0.064 |
1 | 0.171 | 0.085 | 0.064 | 0.045 | 0.042 | 0.045 | 0.048 | 0.053 | 0.058 | 0.063 |
1.3 | 0.148 | 0.082 | 0.065 | 0.052 | 0.046 | 0.046 | 0.061 | 0.055 | 0.058 | 0.065 |
1.5 | 0.146 | 0.085 | 0.069 | 0.056 | 0.050 | 0.050 | 0.051 | 0.087 | 0.061 | 0.065 |
1.7 | 0.150 | 0.085 | 0.070 | 0.060 | 0.053 | 0.055 | 0.055 | 0.056 | 0.134 | 0.066 |
2 | 0.132 | 0.091 | 0.076 | 0.065 | 0.059 | 0.060 | 0.060 | 0.060 | 0.061 | 0.265 |
0.1 | 0.334 | 0.170 | 0.118 | 0.066 | 0.044 | 0.037 | 0.067 | 0.195 | 0.401 | 0.275 |
0.3 | 5.050 | 0.093 | 0.051 | 0.026 | 0.021 | 0.020 | 0.024 | 0.027 | 0.035 | 0.050 |
0.5 | 0.196 | 0.059 | 0.030 | 0.018 | 0.017 | 0.018 | 0.021 | 0.026 | 0.030 | 0.037 |
0.7 | 0.191 | 0.053 | 0.031 | 0.018 | 0.016 | 0.017 | 0.023 | 0.025 | 0.028 | 0.035 |
0.9 | 0.131 | 0.050 | 0.029 | 0.019 | 0.016 | 0.018 | 0.022 | 0.025 | 0.028 | 0.029 |
1 | 0.154 | 0.044 | 0.031 | 0.018 | 0.017 | 0.020 | 0.022 | 0.024 | 0.027 | 0.031 |
1.3 | 0.112 | 0.046 | 0.029 | 0.023 | 0.018 | 0.018 | 0.033 | 0.028 | 0.029 | 0.031 |
1.5 | 0.108 | 0.049 | 0.033 | 0.024 | 0.020 | 0.022 | 0.022 | 0.059 | 0.031 | 0.031 |
1.7 | 0.119 | 0.049 | 0.036 | 0.026 | 0.022 | 0.023 | 0.025 | 0.025 | 0.108 | 0.033 |
2 | 0.108 | 0.053 | 0.040 | 0.030 | 0.025 | 0.026 | 0.028 | 0.029 | 0.028 | 0.249 |
0.1 | 0.295 | 0.139 | 0.085 | 0.038 | 0.022 | 0.022 | 0.068 | 0.201 | 0.583 | 0.403 |
0.3 | 4.770 | 0.075 | 0.039 | 0.016 | 0.011 | 0.011 | 0.017 | 0.019 | 0.023 | 0.035 |
0.5 | 0.189 | 0.061 | 0.022 | 0.011 | 0.009 | 0.012 | 0.016 | 0.017 | 0.022 | 0.023 |
0.7 | 0.141 | 0.038 | 0.024 | 0.010 | 0.009 | 0.010 | 0.014 | 0.017 | 0.018 | 0.021 |
0.9 | 0.123 | 0.035 | 0.021 | 0.011 | 0.009 | 0.011 | 0.012 | 0.015 | 0.019 | 0.021 |
1 | 0.122 | 0.036 | 0.019 | 0.010 | 0.009 | 0.011 | 0.013 | 0.016 | 0.017 | 0.020 |
1.3 | 0.114 | 0.035 | 0.019 | 0.012 | 0.009 | 0.010 | 0.021 | 0.016 | 0.017 | 0.019 |
1.5 | 0.105 | 0.037 | 0.019 | 0.012 | 0.010 | 0.011 | 0.012 | 0.045 | 0.017 | 0.020 |
1.7 | 0.097 | 0.034 | 0.021 | 0.014 | 0.011 | 0.012 | 0.014 | 0.014 | 0.092 | 0.020 |
2 | 0.088 | 0.039 | 0.023 | 0.016 | 0.012 | 0.013 | 0.013 | 0.016 | 0.016 | 0.227 |
 | 0.3 | 0.5 | 0.7 | 0.9 | 1 | 1.1 | 1.3 | 1.5 | 1.7 | 2 |
---|---|---|---|---|---|---|---|---|---|---|
0.1 | −0.104 | −0.382 | −0.340 | −0.243 | −0.131 | −0.071 | 0.090 | 0.188 | 0.295 | 0.379 |
0.3 | 3.287 | −0.157 | −0.187 | −0.135 | −0.113 | −0.091 | −0.045 | 0.013 | 0.107 | 0.237 |
0.5 | 2.691 | 1.483 | −0.024 | −0.067 | −0.069 | −0.043 | −0.031 | −0.010 | −0.003 | 0.051 |
0.7 | 3.004 | 2.546 | 1.168 | 0.036 | −0.017 | −0.008 | 0.003 | 0.006 | 0.005 | 0.010 |
0.9 | 3.133 | 2.889 | 2.319 | 0.917 | 0.222 | 0.058 | 0.019 | 0.023 | 0.017 | 0.022 |
1 | 3.183 | 2.986 | 2.558 | 1.619 | 0.805 | 0.214 | 0.039 | 0.030 | 0.031 | 0.019 |
1.3 | 3.239 | 3.121 | 2.902 | 2.550 | 2.262 | 1.872 | 0.613 | 0.077 | 0.049 | 0.040 |
1.5 | 3.255 | 3.170 | 3.012 | 2.775 | 2.606 | 2.396 | 1.676 | 0.571 | 0.069 | 0.051 |
1.7 | 3.271 | 3.194 | 3.071 | 2.903 | 2.790 | 2.661 | 2.256 | 1.489 | 0.578 | 0.057 |
2 | 3.289 | 3.216 | 3.122 | 3.012 | 2.942 | 2.865 | 2.649 | 2.305 | 1.690 | 0.682 |
0.1 | 0.384 | −0.170 | −0.189 | −0.132 | −0.054 | 0.024 | 0.104 | 0.171 | 0.261 | 0.382 |
0.3 | 3.549 | 0.000 | −0.122 | −0.086 | −0.077 | −0.053 | −0.023 | 0.029 | 0.054 | 0.118 |
0.5 | 2.875 | 1.771 | 0.040 | −0.048 | −0.048 | −0.029 | −0.013 | −0.015 | −0.017 | 0.003 |
0.7 | 3.091 | 2.698 | 1.294 | 0.048 | −0.010 | −0.014 | −0.001 | 0.004 | 0.001 | −0.005 |
0.9 | 3.205 | 2.945 | 2.379 | 0.939 | 0.226 | 0.045 | 0.009 | 0.013 | 0.012 | 0.013 |
1 | 3.240 | 3.011 | 2.612 | 1.609 | 0.793 | 0.196 | 0.018 | 0.014 | 0.021 | 0.012 |
1.3 | 3.316 | 3.171 | 2.925 | 2.548 | 2.239 | 1.819 | 0.554 | 0.034 | 0.020 | 0.020 |
1.5 | 3.346 | 3.223 | 3.034 | 2.780 | 2.596 | 2.363 | 1.589 | 0.502 | 0.035 | 0.022 |
1.7 | 3.362 | 3.254 | 3.100 | 2.916 | 2.791 | 2.643 | 2.199 | 1.383 | 0.518 | 0.025 |
2 | 3.373 | 3.281 | 3.162 | 3.035 | 2.955 | 2.865 | 2.622 | 2.236 | 1.575 | 0.650 |
0.1 | 0.610 | −0.138 | −0.105 | −0.031 | 0.002 | 0.040 | 0.117 | 0.184 | 0.270 | 0.381 |
0.3 | 3.906 | 0.136 | −0.071 | −0.050 | −0.052 | −0.028 | −0.028 | −0.008 | 0.023 | 0.066 |
0.5 | 2.927 | 1.934 | 0.101 | −0.034 | −0.027 | −0.016 | 0.006 | 0.000 | −0.003 | −0.008 |
0.7 | 3.122 | 2.761 | 1.348 | 0.066 | 0.004 | −0.007 | 0.007 | 0.011 | 0.012 | 0.000 |
0.9 | 3.241 | 2.955 | 2.406 | 0.958 | 0.238 | 0.047 | 0.004 | 0.014 | 0.022 | 0.017 |
1 | 3.289 | 3.045 | 2.651 | 1.622 | 0.798 | 0.202 | 0.010 | 0.011 | 0.016 | 0.023 |
1.3 | 3.362 | 3.204 | 2.944 | 2.567 | 2.245 | 1.812 | 0.533 | 0.028 | 0.015 | 0.022 |
1.5 | 3.384 | 3.269 | 3.058 | 2.802 | 2.610 | 2.369 | 1.567 | 0.485 | 0.027 | 0.018 |
1.7 | 3.405 | 3.305 | 3.133 | 2.940 | 2.811 | 2.658 | 2.196 | 1.357 | 0.504 | 0.018 |
2 | 3.421 | 3.327 | 3.204 | 3.065 | 2.980 | 2.886 | 2.633 | 2.234 | 1.541 | 0.637 |
 | 0.3 | 0.5 | 0.7 | 0.9 | 1 | 1.1 | 1.3 | 1.5 | 1.7 | 2 |
---|---|---|---|---|---|---|---|---|---|---|
0.1 | 0.403 | 0.248 | 0.465 | 0.576 | 1.025 | 1.093 | 1.613 | 1.565 | 1.626 | 1.591 |
0.3 | 12.595 | 0.142 | 0.103 | 0.075 | 0.192 | 0.188 | 0.362 | 0.590 | 1.016 | 1.537 |
0.5 | 7.443 | 2.268 | 0.088 | 0.062 | 0.058 | 0.059 | 0.065 | 0.189 | 0.241 | 0.527 |
0.7 | 9.209 | 6.645 | 1.410 | 0.069 | 0.056 | 0.058 | 0.063 | 0.068 | 0.119 | 0.208 |
0.9 | 9.982 | 8.493 | 5.512 | 0.882 | 0.119 | 0.068 | 0.065 | 0.069 | 0.075 | 0.090 |
1 | 10.292 | 9.072 | 6.672 | 2.692 | 0.693 | 0.117 | 0.068 | 0.070 | 0.076 | 0.087 |
1.3 | 10.664 | 9.916 | 8.574 | 6.641 | 5.240 | 3.610 | 0.430 | 0.079 | 0.079 | 0.089 |
1.5 | 10.778 | 10.229 | 9.238 | 7.850 | 6.940 | 5.883 | 2.917 | 0.389 | 0.079 | 0.087 |
1.7 | 10.884 | 10.379 | 9.599 | 8.582 | 7.942 | 7.234 | 5.235 | 2.326 | 0.403 | 0.087 |
2 | 11.004 | 10.515 | 9.915 | 9.233 | 8.814 | 8.369 | 7.177 | 5.472 | 2.998 | 0.547 |
0.1 | 1.552 | 0.815 | 0.741 | 0.703 | 0.966 | 1.190 | 1.129 | 1.224 | 1.165 | 1.210 |
0.3 | 14.969 | 0.105 | 0.047 | 0.030 | 0.078 | 0.075 | 0.280 | 0.559 | 0.566 | 0.881 |
0.5 | 8.345 | 3.190 | 0.049 | 0.025 | 0.021 | 0.022 | 0.025 | 0.029 | 0.035 | 0.184 |
0.7 | 9.634 | 7.335 | 1.694 | 0.031 | 0.020 | 0.022 | 0.027 | 0.029 | 0.033 | 0.039 |
0.9 | 10.353 | 8.723 | 5.712 | 0.898 | 0.077 | 0.027 | 0.028 | 0.030 | 0.032 | 0.039 |
1 | 10.578 | 9.126 | 6.871 | 2.619 | 0.645 | 0.067 | 0.027 | 0.030 | 0.033 | 0.039 |
1.3 | 11.069 | 10.129 | 8.608 | 6.548 | 5.064 | 3.359 | 0.329 | 0.033 | 0.034 | 0.038 |
1.5 | 11.263 | 10.457 | 9.268 | 7.787 | 6.801 | 5.648 | 2.576 | 0.279 | 0.032 | 0.038 |
1.7 | 11.371 | 10.655 | 9.676 | 8.567 | 7.854 | 7.051 | 4.908 | 1.968 | 0.298 | 0.037 |
2 | 11.449 | 10.833 | 10.060 | 9.275 | 8.793 | 8.276 | 6.947 | 5.079 | 2.560 | 0.461 |
0.1 | 2.102 | 0.399 | 0.808 | 0.945 | 0.924 | 0.929 | 0.891 | 1.012 | 1.233 | 1.120 |
0.3 | 17.185 | 0.141 | 0.033 | 0.018 | 0.013 | 0.014 | 0.018 | 0.142 | 0.258 | 0.453 |
0.5 | 8.624 | 3.768 | 0.056 | 0.015 | 0.011 | 0.015 | 0.017 | 0.018 | 0.022 | 0.028 |
0.7 | 9.809 | 7.646 | 1.828 | 0.024 | 0.011 | 0.013 | 0.018 | 0.019 | 0.020 | 0.023 |
0.9 | 10.559 | 8.764 | 5.812 | 0.927 | 0.070 | 0.018 | 0.017 | 0.020 | 0.021 | 0.023 |
1 | 10.870 | 9.312 | 7.058 | 2.648 | 0.645 | 0.057 | 0.017 | 0.019 | 0.021 | 0.023 |
1.3 | 11.342 | 10.306 | 8.691 | 6.619 | 5.068 | 3.312 | 0.297 | 0.020 | 0.020 | 0.023 |
1.5 | 11.494 | 10.727 | 9.379 | 7.880 | 6.845 | 5.646 | 2.484 | 0.251 | 0.021 | 0.021 |
1.7 | 11.632 | 10.960 | 9.848 | 8.675 | 7.932 | 7.101 | 4.866 | 1.873 | 0.272 | 0.022 |
2 | 11.739 | 11.102 | 10.297 | 9.422 | 8.910 | 8.363 | 6.973 | 5.040 | 2.420 | 0.430 |