Information Distances versus Entropy Metric
Abstract
1. Introduction
2. Preliminaries
2.1. Kolmogorov Complexity
2.2. Shannon Entropy
3. Information Distance Versus Entropy
- (i) D(X, Y) ≥ 0, and D(X, Y) = 0 if and only if X ≡ Y;
- (ii) D(X, Y) = D(Y, X);
- (iii) D(X, Z) ≤ D(X, Y) + D(Y, Z).
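The entropy metric D(X, Y) = H(X|Y) + H(Y|X) behind these properties can be computed directly from a joint distribution. The sketch below is an illustration, not code from the paper; the function name `entropy_metric` and the dict-based joint pmf representation are assumptions for this example. It uses the identities H(X|Y) = H(X,Y) − H(Y) and H(Y|X) = H(X,Y) − H(X).

```python
import math

def entropy_metric(joint):
    """Return D(X, Y) = H(X|Y) + H(Y|X) in bits.

    `joint` is a dict mapping (x, y) pairs to probabilities.
    (Illustrative helper; not part of the original paper.)
    """
    # Marginal distributions of X and Y.
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    # Shannon entropies of the joint and the marginals.
    h_xy = -sum(p * math.log2(p) for p in joint.values() if p > 0)
    h_x = -sum(p * math.log2(p) for p in px.values() if p > 0)
    h_y = -sum(p * math.log2(p) for p in py.values() if p > 0)
    # D(X, Y) = H(X|Y) + H(Y|X) = [H(X,Y) - H(Y)] + [H(X,Y) - H(X)].
    return (h_xy - h_y) + (h_xy - h_x)

# (i) identical fair bits: distance 0.
same = {(0, 0): 0.5, (1, 1): 0.5}
print(entropy_metric(same))   # → 0.0

# Independent fair bits: D = H(X|Y) + H(Y|X) = 1 + 1 = 2 bits.
indep = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(entropy_metric(indep))  # → 2.0
```

Property (ii) holds by construction, since swapping the coordinates of every pair in `joint` leaves both conditional-entropy terms intact, only exchanged.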
4. Normalized Information Distance Versus Entropy
5. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Hu, B.; Bi, L.; Dai, S. Information Distances versus Entropy Metric. Entropy 2017, 19, 260. https://doi.org/10.3390/e19060260