Open Access Article
Entropy 2013, 15(11), 4668-4699; doi:10.3390/e15114668

Estimating Functions of Distributions Defined over Spaces of Unknown Size

Santa Fe Institute, 1399 Hyde Park Rd., Santa Fe, NM 87501, USA
School of Informatics and Computing, Indiana University, 901 E 10th St, Bloomington, IN 47408, USA
Author to whom correspondence should be addressed.
Received: 3 August 2013 / Revised: 11 September 2013 / Accepted: 17 October 2013 / Published: 31 October 2013
(This article belongs to the Special Issue Estimating Information-Theoretic Quantities from Data)


We consider Bayesian estimation of information-theoretic quantities from data, using a Dirichlet prior. Acknowledging the uncertainty of the event space size m and the Dirichlet prior’s concentration parameter c, we treat both as random variables set by a hyperprior. We show that the associated hyperprior, P(c, m), obeys a simple “Irrelevance of Unseen Variables” (IUV) desideratum iff P(c, m) = P(c)P(m). Thus, requiring IUV greatly reduces the number of degrees of freedom of the hyperprior. Some information-theoretic quantities, e.g., mutual information, can be expressed in multiple ways, in terms of different event spaces. With all hyperpriors (implicitly) used in earlier work, different choices of this event space lead to different posterior expected values of these quantities. We show that there is no such dependence on the choice of event space for a hyperprior that obeys IUV. We also derive a result that allows us to exploit IUV to greatly simplify calculations of quantities such as the posterior expected mutual information or posterior expected multi-information. Finally, we use computer experiments to compare an IUV-based entropy estimator favorably against three alternative methods in common use, and we discuss how seemingly innocuous changes to the formalization of an estimation problem can substantially affect the resultant estimates of posterior expectations.
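As a concrete point of reference for the abstract: at *fixed* event space size m and concentration parameter c, the posterior expected entropy under a symmetric Dirichlet prior has a known closed form (due to Wolpert and Wolf), E[H | n] = ψ(N + cm + 1) − Σᵢ (nᵢ + c)/(N + cm) · ψ(nᵢ + c + 1), where ψ is the digamma function and N = Σᵢ nᵢ. The sketch below implements only this fixed-(c, m) building block, not the paper's full estimator, which additionally averages over the hyperprior P(c)P(m); the function names are illustrative.

```python
import math

def digamma(x):
    """Digamma psi(x) via the recurrence psi(x) = psi(x+1) - 1/x
    plus an asymptotic series, accurate to ~1e-10 for x > 0."""
    assert x > 0
    result = 0.0
    while x < 6.0:          # shift the argument up so the series converges well
        result -= 1.0 / x
        x += 1.0
    inv = 1.0 / x
    inv2 = inv * inv
    result += math.log(x) - 0.5 * inv - inv2 * (
        1.0 / 12 - inv2 * (1.0 / 120 - inv2 / 252))
    return result

def posterior_mean_entropy(counts, c, m):
    """Posterior expected entropy (in nats) of a categorical distribution
    over m bins, under a symmetric Dirichlet prior with concentration c,
    given observed counts. Bins with zero counts may be omitted from
    `counts`; they are accounted for via m."""
    n_seen = len(counts)
    assert n_seen <= m
    total = sum(counts) + c * m          # N + c*m
    h = digamma(total + 1)
    for n_i in counts:
        h -= (n_i + c) / total * digamma(n_i + c + 1)
    # every unseen bin contributes the n_i = 0 term
    h -= (m - n_seen) * (c / total) * digamma(c + 1)
    return h
```

For example, with no data, c = 1 and m = 2 bins, `posterior_mean_entropy([], 1, 2)` returns ψ(3) − ψ(2) = 0.5 nats, the expected entropy of a uniform Dirichlet prior on two outcomes; with many balanced observations the estimate approaches ln 2.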
Keywords: Bayesian analysis; entropy; mutual information; variable number of bins; hidden variables; Dirichlet prior

Figure 1

This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).

Share & Cite This Article

MDPI and ACS Style

Wolpert, D.H.; DeDeo, S. Estimating Functions of Distributions Defined over Spaces of Unknown Size. Entropy 2013, 15, 4668-4699.


Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.