Correction

Correction: Kolchinsky, A. and Tracey, B.D. Estimating Mixture Entropy with Pairwise Distances. Entropy 2017, 19, 361

Artemy Kolchinsky 1,* and Brendan D. Tracey 1,2
1 Santa Fe Institute, Santa Fe, NM 87501, USA
2 Department of Aeronautics and Astronautics, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
* Author to whom correspondence should be addressed.
Entropy 2017, 19(11), 588; https://doi.org/10.3390/e19110588
Submission received: 30 October 2017 / Accepted: 1 November 2017 / Published: 3 November 2017
Following the publication of our paper [1], we uncovered a mistake in the derivation of two formulas in the manuscript. This error does not affect any of the empirical results or conclusions of the article.
The following incorrect text on Page 9 should be replaced:
  • These bounds have particularly simple forms when all of the mixture components have equal covariance matrices, i.e., $\Sigma_i = \Sigma$ for all $i$. In this case, the lower bound of Equation (10) can be written as
    $$\hat{H}_{C_\alpha} = \frac{d}{2} - \sum_i c_i \ln \sum_j c_j \, p_j(\mu_i)^{\alpha(1-\alpha)} .$$
    This is derived by combining the expressions for $C_\alpha$, Equation (14), the entropy of a Gaussian, Equation (13), and the Gaussian density function. For a homoscedastic mixture, the tightest lower bound among the Chernoff $\alpha$-divergences is given by $\alpha = 0.5$, corresponding to the Bhattacharyya distance,
    $$\hat{H}_{\mathrm{BD}} = \frac{d}{2} - \sum_i c_i \ln \sum_j c_j \, p_j(\mu_i)^{1/4} .$$
    (This is derived above in Section 3.2.)
The replacement text should read:
  • These bounds have simple forms when all of the mixture components have equal covariance matrices, i.e., $\Sigma_i = \Sigma$ for all $i$. First, define a transformation in which each Gaussian component $p_j$ is mapped to a different Gaussian $\tilde{p}_{j,\alpha}$, which has the same mean but whose covariance matrix is rescaled by $\frac{1}{\alpha(1-\alpha)}$,
    $$p_j := \mathcal{N}(\mu_j, \Sigma) \;\mapsto\; \tilde{p}_{j,\alpha} := \mathcal{N}\!\left(\mu_j, \frac{1}{\alpha(1-\alpha)}\,\Sigma\right) .$$
    Then, the lower bound of Equation (10) can be written as
    $$\hat{H}_{C_\alpha} = \frac{d}{2} + \frac{d}{2}\ln \alpha(1-\alpha) - \sum_i c_i \ln \sum_j c_j \, \tilde{p}_{j,\alpha}(\mu_i) .$$
    This is derived by combining the expressions for $C_\alpha$, Equation (14), the entropy of a Gaussian, Equation (13), and the Gaussian density function. For a homoscedastic mixture, the tightest lower bound among the Chernoff $\alpha$-divergences is given by $\alpha = 0.5$, corresponding to the Bhattacharyya distance,
    $$\hat{H}_{\mathrm{BD}} = \frac{d}{2} + \frac{d}{2}\ln \frac{1}{4} - \sum_i c_i \ln \sum_j c_j \, \tilde{p}_{j,0.5}(\mu_i) .$$
    (This is derived above in Section 3.2.)
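The corrected bound is straightforward to evaluate numerically. The following is a minimal sketch, not part of the original correction, assuming NumPy and SciPy; the function name chernoff_lower_bound and the toy two-component mixture are illustrative. It evaluates the bound via the rescaled densities $\tilde{p}_{j,\alpha}$ and compares it against a Monte Carlo estimate of the true mixture entropy, which the bound should not exceed.

import numpy as np
from scipy.stats import multivariate_normal

def chernoff_lower_bound(weights, means, cov, alpha=0.5):
    """Corrected Chernoff-alpha lower bound on the entropy of a
    homoscedastic Gaussian mixture sum_i c_i N(mu_i, Sigma)."""
    d = means.shape[1]
    # tilde_p_{j,alpha}: same mean, covariance rescaled by 1/(alpha(1-alpha))
    rescaled = multivariate_normal(mean=np.zeros(d),
                                   cov=cov / (alpha * (1.0 - alpha)))
    # Entry [i, j] is tilde_p_{j,alpha}(mu_i); the density depends only on mu_i - mu_j
    diffs = means[:, None, :] - means[None, :, :]
    dens = rescaled.pdf(diffs.reshape(-1, d)).reshape(len(weights), len(weights))
    inner = np.log(dens @ weights)  # ln sum_j c_j tilde_p_{j,alpha}(mu_i)
    return d / 2 + (d / 2) * np.log(alpha * (1 - alpha)) - weights @ inner

# Toy check: a 2-component mixture in 2D; the bound should stay below
# a Monte Carlo estimate of the true mixture entropy.
rng = np.random.default_rng(0)
c = np.array([0.3, 0.7])
mu = np.array([[0.0, 0.0], [3.0, 1.0]])
Sigma = np.eye(2)
bound = chernoff_lower_bound(c, mu, Sigma, alpha=0.5)

comp = rng.choice(2, size=200_000, p=c)
x = mu[comp] + rng.multivariate_normal(np.zeros(2), Sigma, size=200_000)
mix_pdf = sum(ci * multivariate_normal(mi, Sigma).pdf(x) for ci, mi in zip(c, mu))
print(f"lower bound: {bound:.4f}  <=  MC entropy: {-np.log(mix_pdf).mean():.4f}")

As a quick consistency check, for a single component the bound reduces to the exact Gaussian entropy $\frac{1}{2}\ln\left((2\pi e)^d |\Sigma|\right)$, independent of $\alpha$.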

Reference

  1. Kolchinsky, A.; Tracey, B.D. Estimating Mixture Entropy with Pairwise Distances. Entropy 2017, 19, 361.
