Article

On the Use of Entropy to Improve Model Selection Criteria

1 Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA), 35127 Padova, Italy
2 Department of Industrial Engineering, University of Rome “Tor Vergata”, 00133 Roma, Italy
* Author to whom correspondence should be addressed.
Entropy 2019, 21(4), 394; https://doi.org/10.3390/e21040394
Submission received: 21 January 2019 / Revised: 31 March 2019 / Accepted: 10 April 2019 / Published: 12 April 2019
(This article belongs to the Section Signal and Data Analysis)

Abstract

The most widely used model selection criteria, the Bayesian Information Criterion (BIC) and the Akaike Information Criterion (AIC), are expressed in terms of synthetic indicators of the residual distribution: the variance and the mean-squared error of the residuals, respectively. In many applications in science, the noise affecting the data can be expected to have a Gaussian distribution. Therefore, at the same level of variance and mean-squared error, models whose residuals are more uniformly distributed should be favoured. The degree of uniformity of the residuals can be quantified by the Shannon entropy. Including the Shannon entropy in the BIC and AIC expressions significantly improves these criteria. The improved performance has been demonstrated empirically with a series of simulations for various classes of functions and for different levels and statistics of the noise. In the presence of outliers, a better treatment of the errors, using the Geodesic Distance, has proved essential.
Keywords: Model Selection Criteria; Bayesian Information Criterion (BIC); Akaike Information Criterion (AIC); Shannon Entropy; Geodesic Distance
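
As a rough illustration of the approach outlined in the abstract, the following Python sketch computes the standard AIC and BIC from model residuals together with a histogram estimate of the residuals' Shannon entropy. The way the entropy term is combined with the AIC here (a simple unweighted subtraction) is an assumption made purely for illustration; it is not the formulation derived in the paper.

import numpy as np

def aic_bic(residuals, k):
    """Standard AIC and BIC computed from model residuals,
    assuming Gaussian errors (likelihood expressed via the MSE)."""
    n = residuals.size
    mse = np.mean(residuals ** 2)
    aic = n * np.log(mse) + 2 * k          # Akaike Information Criterion
    bic = n * np.log(mse) + k * np.log(n)  # Bayesian Information Criterion
    return aic, bic

def shannon_entropy(residuals, bins=20):
    """Histogram estimate of the Shannon entropy of the residual
    distribution; higher values indicate more uniform residuals."""
    counts, _ = np.histogram(residuals, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                 # drop empty bins to avoid log(0)
    return -np.sum(p * np.log(p))

# Illustrative comparison of two polynomial fits to noisy linear data.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
y = 2.0 * x + 0.5 + rng.normal(scale=0.1, size=x.size)

for degree in (1, 5):
    coeffs = np.polyfit(x, y, degree)
    res = y - np.polyval(coeffs, x)
    k = degree + 1
    aic, bic = aic_bic(res, k)
    h = shannon_entropy(res)
    # Hypothetical entropy-augmented score: subtract the residual entropy
    # so that, at comparable MSE, the model with the more uniformly
    # distributed residuals is preferred (weighting is illustrative only).
    aic_h = aic - h
    print(f"degree={degree}: AIC={aic:.1f}  BIC={bic:.1f}  H={h:.2f}  AIC-H={aic_h:.1f}")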

Share and Cite

MDPI and ACS Style

Murari, A.; Peluso, E.; Cianfrani, F.; Gaudio, P.; Lungaroni, M. On the Use of Entropy to Improve Model Selection Criteria. Entropy 2019, 21, 394. https://doi.org/10.3390/e21040394

AMA Style

Murari A, Peluso E, Cianfrani F, Gaudio P, Lungaroni M. On the Use of Entropy to Improve Model Selection Criteria. Entropy. 2019; 21(4):394. https://doi.org/10.3390/e21040394

Chicago/Turabian Style

Murari, Andrea, Emmanuele Peluso, Francesco Cianfrani, Pasquale Gaudio, and Michele Lungaroni. 2019. "On the Use of Entropy to Improve Model Selection Criteria" Entropy 21, no. 4: 394. https://doi.org/10.3390/e21040394

APA Style

Murari, A., Peluso, E., Cianfrani, F., Gaudio, P., & Lungaroni, M. (2019). On the Use of Entropy to Improve Model Selection Criteria. Entropy, 21(4), 394. https://doi.org/10.3390/e21040394

